From the Microwave Journal November 2008 Supplement.

While a large majority of owners use their mobile phones only for voice calls and short message services (texting), a growing number are using bandwidth-hungry applications such as Web browsing, music downloads and streamed video. The current explosion in wireless data use has been fueled partly by the introduction of Apple's iPhone. Other so-called smartphones with similar capability have been available for years, but Apple approached the market from the perspective of a computer company showcasing its "whole product," rather than that of a phone manufacturer promoting a single differentiating feature such as a better music player or a higher resolution camera. That approach has excited subscribers and driven up data revenues for all network operators, not just those selling iPhones. In addition, operators will look for additional revenues from mobile advertising, which is forecast to grow into a multi-billion dollar business over the next few years and which will come to depend on higher bandwidth services for fulfillment.

This article addresses some of the issues and challenges facing the wireless industry, from chipset providers to network operators, specifically from the perspective of a test equipment supplier. To serve this growing demand, and to match the speeds users experience on a home PC with a broadband connection (either ADSL or cable modem), mobile network operators have been continuously investing in technology upgrades to remain competitive.

Long Term Evolution (LTE) is the project name of a new air interface for wireless access being developed by the Third Generation Partnership Project (3GPP), aimed at evolving 3GPP’s third generation system towards an all-IP network optimized for high speed data transmission. In parallel with its air interface development, LTE is linked closely with the concurrent System Architecture Evolution (SAE) project to define a simplified system architecture and Evolved Packet Core (EPC) network. Together, these projects provide a framework for increasing capacity, improving spectrum efficiency, improving cell-edge performance and reducing latency for real-time services such as video. They aim to offer a 100 Mbps download rate and 50 Mbps upload rate for every 20 MHz of spectrum. Support is intended for even higher rates, to 326.4 Mbps in the downlink, using multiple antenna configurations.
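As a rough, back-of-the-envelope illustration of where a downlink figure of that magnitude comes from (assuming the 20 MHz numerology of 1,200 sub-carriers and 14 OFDM symbols per millisecond, with 64QAM and four spatial layers; the exact headline rate depends on device category and overhead):

$$
R_{\text{raw}} \approx 1200\ \text{sub-carriers} \times 14\,\tfrac{\text{symbols}}{\text{ms}} \times 6\,\tfrac{\text{bits}}{\text{symbol}} \times 4\ \text{layers} = 403.2\ \text{Mbps},
$$

of which roughly 300 to 330 Mbps remains once control channels and reference signals are accounted for.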

Rather than further developing the current High Speed Packet Access (HSPA) and modulation schemes based on the Wideband Code Division Multiple Access (W-CDMA) used in today's third generation UMTS cellular systems, LTE downlink and uplink transmissions are based on new air interfaces: Orthogonal Frequency Division Multiple Access (OFDMA), a multi-user variant of Orthogonal Frequency Division Multiplexing (OFDM), in the downlink, and Single-Carrier Frequency Division Multiple Access (SC-FDMA) in the uplink.

The LTE specifications inherit all the frequency bands defined for UMTS, a list that continues to grow. There are now 11 FDD bands covering frequencies from 824 to 2690 MHz and eight TDD bands covering 1900 to 2620 MHz. Significant overlap exists between some of the bands, but this does not necessarily simplify designs, since there can be band-specific performance requirements based on regional needs. There is no consensus on the band in which LTE will first be deployed, since the answer is highly dependent on local variables. This lack of consensus is a significant complication for equipment manufacturers and contrasts with the start of GSM and W-CDMA, both of which were originally specified for only one band.

Already used in non-cellular technologies as far back as 1998, OFDM was at that time under consideration by 3GPP as a transmission scheme for 3G UMTS. However, the technology was deemed inappropriate, in part because of the large amounts of baseband processing it required. Today the cost of digital signal processing has been greatly reduced, such that it is now considered a commercially viable method of wireless transmission for the handset. Rather than transmit a high-rate stream of data with a single carrier, OFDM makes use of a large number of closely spaced orthogonal sub-carriers that are transmitted in parallel. Each sub-carrier is modulated with a conventional modulation scheme (such as QPSK, 16QAM, or 64QAM) at a low symbol rate. The combination of hundreds or thousands of sub-carriers enables high-data-rate transmission with much reduced inter-symbol interference compared to conventional single-carrier modulation schemes with the same capacity.
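As a rough illustration of this parallel sub-carrier idea (using arbitrary illustrative parameters rather than the actual LTE numerology), the Python/NumPy sketch below maps QPSK symbols onto a set of active sub-carriers and uses an inverse FFT, the core operation of an OFDM transmitter, to produce one time-domain OFDM symbol with a cyclic prefix:

```python
import numpy as np

# Illustrative parameters only; not the LTE numerology
N_FFT = 1024          # IFFT size
N_SUBCARRIERS = 600   # active sub-carriers, centred around DC
CP_LEN = 72           # cyclic prefix length in samples

def qpsk_symbols(bits):
    """Map pairs of bits to QPSK symbols at a low symbol rate per sub-carrier."""
    bits = bits.reshape(-1, 2)
    return ((1 - 2 * bits[:, 0]) + 1j * (1 - 2 * bits[:, 1])) / np.sqrt(2)

def ofdm_symbol(data_symbols):
    """Place one QPSK symbol on each active sub-carrier, then convert the
    parallel frequency-domain block to a time-domain waveform."""
    spectrum = np.zeros(N_FFT, dtype=complex)
    half = N_SUBCARRIERS // 2
    spectrum[1:half + 1] = data_symbols[:half]   # positive-frequency sub-carriers
    spectrum[-half:] = data_symbols[half:]       # negative-frequency sub-carriers
    time_domain = np.fft.ifft(spectrum) * np.sqrt(N_FFT)
    # Prepend the cyclic prefix, which helps suppress inter-symbol interference
    return np.concatenate([time_domain[-CP_LEN:], time_domain])

rng = np.random.default_rng(0)
bits = rng.integers(0, 2, 2 * N_SUBCARRIERS)
waveform = ofdm_symbol(qpsk_symbols(bits))
print(waveform.shape)  # (N_FFT + CP_LEN,) complex baseband samples
```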

Figure 1 MIMO assigns a different data stream to each transmit antenna.

In addition to new air interfaces, the LTE specifications require the use of multiple antenna techniques, which add substantial complexity to the system and are designed to take advantage of spatial diversity in the radio channel. These techniques are often loosely referred to as "MIMO," shorthand for a multiple input, multiple output antenna configuration, and are considered essential for improving signal robustness and for achieving the goals for system capacity and for single-user and "headline" peak data rates. The basic form of MIMO assigns a different data stream to each transmit antenna, as shown in Figure 1. The two transmissions are mixed in the channel, so that at the receiver each antenna sees some combination of each stream. Decoding the received signals is a clever process in which the receiver analyzes the fading patterns that identify each transmitter to determine what combination is present. Applying an inverse filter and summing the received streams recreates the original data.
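A highly simplified, noise-free illustration of that inverse-filter idea is sketched below in Python/NumPy: a zero-forcing receiver with a known, flat 2x2 channel matrix, far simpler than the actual LTE receiver processing, but showing how two mixed streams can be separated again:

```python
import numpy as np

rng = np.random.default_rng(1)

# Two independent QPSK data streams, one per transmit antenna
n_symbols = 8
streams = (rng.choice([-1, 1], (2, n_symbols)) +
           1j * rng.choice([-1, 1], (2, n_symbols))) / np.sqrt(2)

# A known, flat-fading 2x2 channel: element H[i, j] couples
# transmit antenna j into receive antenna i
H = np.array([[0.9 + 0.1j, 0.4 - 0.2j],
              [0.3 + 0.3j, 0.8 - 0.1j]])

received = H @ streams               # each receive antenna sees a mix of both streams

# Zero-forcing recovery: apply the inverse of the (estimated) channel
recovered = np.linalg.inv(H) @ received

print(np.allclose(recovered, streams))  # True in this noise-free sketch
```

In a real receiver the channel matrix must be estimated from reference signals and changes continuously, which is why the fading analysis described above is central to MIMO decoding.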

The theoretical gains from MIMO challenge the limits of system performance and are a function of the number of transmit and receive antennas, the radio propagation conditions, the ability of the transmitter to adapt to the changing conditions and the basic signal to noise ratio. Further complicating the picture is the requirement for the antennas to support LTE’s multiple frequency bands.

Since the LTE specifications support RF channel bandwidths of up to 20 MHz, compared to today's maximum of 5 MHz, a fundamental change in radio design is required. New integrated designs, based on the Common Public Radio Interface (CPRI) and Open Base Station Architecture Initiative (OBSAI) standards for base stations, and on DigRF and the Mobile Industry Processor Interface Digital Physical Layer (MIPI D-PHY) for user equipment, remove or hide traditional test interfaces. People who previously worked in only one domain must now learn new ways to characterize devices. An example is a transmitter module in which only digital signal inputs and RF outputs are available, and where pre-correction in the digital domain sets the RF performance. Measurement products and solutions designed for these emerging cross-domain requirements must support new methods for characterizing mixed analog/digital radios, while simulation tools that can be integrated with real-world modules speed up overall system test. As well as targeted products such as Agilent's N5340A/41A (OBSAI) and N5343A/44A (DigRFV4) testers, the tools required include traditional pattern generators, logic analyzers, signal generators and signal analyzers; the new measurement methods, however, involve combining these instruments and interpreting their results in new ways.
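To make the cross-domain idea concrete, the Python/NumPy sketch below applies a memoryless polynomial pre-correction to complex baseband samples before they would be passed across a digital radio interface to the RF section. This is purely an illustrative model; the function name and coefficient values are hypothetical, and real designs derive their correction from measurements of the actual transmitter.

```python
import numpy as np

def apply_precorrection(baseband, coeffs):
    """Apply an illustrative memoryless polynomial pre-correction to complex
    baseband samples before they are handed to the digital radio interface."""
    corrected = np.zeros_like(baseband)
    for order, c in enumerate(coeffs):
        # Odd-order polynomial terms in the signal envelope
        corrected += c * baseband * np.abs(baseband) ** (2 * order)
    return corrected

# Hypothetical coefficients for a gentle gain-expansion pre-correction
coeffs = [1.0 + 0.0j, 0.05 - 0.02j, -0.01 + 0.005j]

samples = np.exp(1j * np.linspace(0, 2 * np.pi, 8))  # placeholder baseband samples
print(apply_precorrection(samples, coeffs))
```

In such a design, verifying RF performance means stimulating the digital interface with known patterns and measuring the resulting RF output, exactly the kind of combined digital/RF measurement described above.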

Figure 2 Simplified block diagram for testing a 2 x 2 MIMO receiver.

Testing MIMO receivers and systems under realistic fading conditions poses a new challenge because of the large number of transmit-receive channel combinations. For example, in a 2x2 MIMO configuration, two separate channel emulators are not adequate to model the four separate channels that exist between the pairs of transmit and receive antennas. Testing in a "real" wireless environment is not effective either, because the channel is highly variable, uncontrollable and unrepeatable. Specialized instrumentation that emulates realistic MIMO channels provides the best solution for these challenging test conditions. Figure 2 shows one of several possible configurations for testing a 2x2 MIMO receiver. Controlled through a software GUI, internal baseband generators and channel faders create standards-compliant waveforms such as WiMAX, LTE and WLAN signals. Each fader can be independently configured with either standards-compliant or custom fading models, using a variety of path and fading conditions.
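As a conceptual sketch of what such a fader does (flat Rayleigh fading on each of the four transmit-receive paths only; a real channel emulator also applies Doppler spectra, multipath delay profiles and antenna correlation), the Python/NumPy example below fades and sums two transmit waveforms to produce the two receive-antenna signals:

```python
import numpy as np

rng = np.random.default_rng(2)

def rayleigh_taps(n_samples):
    """Independent complex Gaussian (Rayleigh-envelope) fading samples,
    one per baseband sample; a real emulator would also shape these with
    a Doppler spectrum and add multipath delay taps."""
    return (rng.standard_normal(n_samples) +
            1j * rng.standard_normal(n_samples)) / np.sqrt(2)

def emulate_2x2_channel(tx0, tx1):
    """Fade each of the four Tx-Rx paths independently and sum at each
    receive port, mimicking the role of the channel faders in Figure 2."""
    n = len(tx0)
    h = [[rayleigh_taps(n) for _ in range(2)] for _ in range(2)]
    rx0 = h[0][0] * tx0 + h[0][1] * tx1
    rx1 = h[1][0] * tx0 + h[1][1] * tx1
    return rx0, rx1

tx0 = np.ones(1000, dtype=complex)   # placeholder transmit waveforms
tx1 = np.ones(1000, dtype=complex)
rx0, rx1 = emulate_2x2_channel(tx0, tx1)
print(rx0[:3], rx1[:3])
```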

As with the original W-CDMA and now HSPA, chipsets for LTE are highly integrated and include data rates and functionality much greater than will actually be available to a single user in a network at introduction. They must be designed to have as long a life as possible so that manufacturers can recover their massive investment costs over a longer period. Therefore, developers must be able to confirm correct operation up to the maximum design specifications of the chipset, and they require test equipment today that is capable of this level of performance. For LTE, this includes single-user data rates up to 14.4 Mbps and MIMO functionality beyond the baseline specification in both uplink and downlink.

The participation of the world’s leading test equipment companies in the specific sub-groups responsible for defining measurement techniques in next generation cellular systems continues to ensure that the ability to test functionality is designed in, making interoperability, conformance and production test easier and faster.

Today, LTE standards-setting is nearly complete, early development is well under way and networks are expected to be commercially introduced in approximately 2010. This goal is aggressive, and there is a greater focus than before on finalizing interoperability and conformance testing specifications, to ensure that manufacturers can produce both network and user equipment to meet demand. A new initiative for LTE, the LTE/SAE Trial Initiative (LSTI), aims to accelerate the availability of interoperable next generation LTE mobile broadband systems. With more than 17 active participants, this unique global initiative is able to drive the seamless introduction of end-to-end LTE solutions, including infrastructure, devices and chipsets, through collaborative technology trials and proof-of-concept work.

The latest laboratory and early field tests on prototype LTE systems have confirmed that baseline devices can achieve download speeds exceeding 100 Mbps, and high performance systems using 4x4 MIMO antennas can push this to beyond 300 Mbps. LSTI members have also demonstrated substantial improvements to network response times, which are essential to give the ‘always on’ experience and for latency-sensitive applications such as interactive gaming and mobile television.

The next steps in LTE system development will be early device interoperability testing, network interoperability tests and more comprehensive performance tests. Conformance tests will ensure that equipment meets the benchmarks specified in the LTE standard. How those tests correlate with real-world performance for individual users remains to be seen.

With widespread commercial deployment of LTE still a few years away, it may be some time before the behavior of the new systems is completely understood. In the meantime, test equipment suppliers are playing their part, adding new measurement capability for cross-domain testing, and new features such as MIMO precoding for signal generation and advanced emulation, measurement and analysis to their test equipment to facilitate LTE product development and move the industry ahead.

Andy Botka received his BS degree in electrical engineering from the University of California at Davis and has done executive coursework at Harvard Business School. He began his career in 1987 with Hewlett-Packard Corp. as an applications engineer supporting high-frequency component test solutions. Subsequently, he has held a wide variety of positions in HP and Agilent over the past 20+ years. He has held various leadership positions throughout his career in the US and Asia, and is currently leading the Signal Sources division for Agilent Technologies, as vice president and general manager.