Long Term Evolution (LTE) is the project name given by 3GPP to the evolution of the UMTS 3G radio standards. The original UMTS Terrestrial Radio Access (UTRA) is based on W-CDMA technology, which has been continuously enhanced to include HSDPA and HSUPA (together known as HSPA). The work on UMTS continues in Release 8 of the 3GPP standards with enhancements to HSPA. In addition, Release 8 includes E-UTRA, an entirely new air interface based on OFDM technology.


Offering higher data rates and lower latency for the user, a simplified all-IP network for the operator and improved spectral efficiency, E-UTRA—or LTE as we will refer to it from now on—promises to provide many benefits. This article reviews some design challenges specific to LTE, and looks at the emerging test equipment being developed to help in the realisation of this new technology.

LTE Timeline

LTE is already more than a concept, with the study phase having started in late 2004 and a great deal of work being done to complete the Release 8 standards. The many possible deployment options for LTE present one of the biggest challenges in designing and testing early user equipment (UE). The core specifications are currently scheduled to be complete by early 2008, and the first conformance test specifications should be available by late 2008. Limited quantities of working UEs may be available for field trials in 2009/2010. This is a very aggressive timescale for a new mobile technology, which will demand the availability of early and comprehensive test equipment (see Figure 1).

Baseband

Current High Speed Packet Access (HSPA) devices already place large demands on the processing power available within a mobile device package. Prototype HSPA devices available today have difficulty sustaining their highest data rates unless connected to a mains adapter, so LTE, with target data rates significantly higher than today’s 7.2 Mbps, will further challenge platform design.

The processing power required to support these data rates is phenomenal, particularly in the baseband, where all the error handling and signal processing occurs. LTE baseband functions include:

• Channel coding and scrambling

• Channel interleaving

• Adaptive modulation and coding: QPSK, 16QAM and 64QAM

• Physical-layer hybrid ARQ (HARQ) processing: retransmission, incremental redundancy and chase combining

• Discrete Fourier transform (DFT)
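
To illustrate the chase-combining element of HARQ, the sketch below (illustrative only: a BPSK-style soft decision with additive noise, rather than LTE's actual turbo-coded processing) averages the soft values of two identical transmissions, which halves the effective noise power and so reduces the bit error rate:

```python
import numpy as np

def chase_combine(copies):
    """Average the soft values of identical retransmissions; effective
    noise power falls as 1/N when N copies are combined."""
    return np.mean(copies, axis=0)

rng = np.random.default_rng(1)
bits = rng.integers(0, 2, 10_000)
tx = 1.0 - 2.0 * bits                 # soft BPSK-style mapping: 0 -> +1, 1 -> -1

noise_std = 1.2                       # channel too noisy for one transmission
rx1 = tx + rng.normal(0, noise_std, tx.shape)
rx2 = tx + rng.normal(0, noise_std, tx.shape)   # identical retransmission

ber_single   = float(np.mean((rx1 < 0) != bits.astype(bool)))
ber_combined = float(np.mean((chase_combine([rx1, rx2]) < 0) != bits.astype(bool)))
```

The same principle, applied to the demodulator soft decisions before turbo decoding, is what lets the physical layer recover blocks that would fail on any single transmission.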

Baseband designs will likely be modeled using PC simulation on both the UE and network sides, supplemented by reduced-speed emulation of hardware prototypes.

RF

There are currently 11 defined Frequency Division Duplex (FDD) paired bands and six Time Division Duplex (TDD) bands listed in 3GPP TR 36.803. All of these bands are also defined for GSM and UMTS, and to date no spectrum has been allocated specifically to LTE. Will LTE be expected to co-exist in the same bands with W-CDMA or GSM systems, or will entire bands be re-allocated for LTE? All that is certain at this stage is that the LTE spectrum situation is uncertain. The number of possible combinations complicates the work required for co-existence studies and the resulting requirements and tests. The lack of a single defined band for LTE significantly complicates early development compared with the single-band introductions of GSM and UMTS (W-CDMA).

Although there remains much uncertainty about which bands LTE may be deployed in, we know much more about the underlying air interface. By the time LTE mobile devices require RF test, significant understanding will have been gained from WiMAX, which shares a very similar orthogonal frequency division multiplexing (OFDM) downlink. However, the LTE uplink differs from WiMAX in using single-carrier frequency division multiple access (SC-FDMA) to reduce the transmitted signal’s peak-to-average power ratio (PAPR), and this will create some LTE-specific test needs. From TR 36.803, the expected requirements upon which tests will be based include:
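
The PAPR benefit of SC-FDMA over plain OFDMA can be demonstrated numerically. The sketch below uses illustrative parameters, not LTE numerology (300 occupied subcarriers of a 512-point IFFT, QPSK data), and compares the mean per-symbol PAPR of directly mapped subcarriers with that of DFT-precoded data:

```python
import numpy as np

rng = np.random.default_rng(0)
n_sc, n_sym, n_fft = 300, 200, 512   # illustrative sizes only

# Random QPSK data for each symbol period
qpsk = (rng.choice([-1.0, 1.0], (n_sym, n_sc))
        + 1j * rng.choice([-1.0, 1.0], (n_sym, n_sc))) / np.sqrt(2)

def to_time(freq_data):
    """Map data onto the first n_sc of n_fft subcarriers, then IFFT."""
    grid = np.zeros((n_sym, n_fft), complex)
    grid[:, :n_sc] = freq_data
    return np.fft.ifft(grid, axis=1)

def mean_papr_db(x):
    p = np.abs(x) ** 2
    return float(np.mean(10 * np.log10(p.max(axis=1) / p.mean(axis=1))))

papr_ofdm   = mean_papr_db(to_time(qpsk))                      # direct mapping (OFDMA)
papr_scfdma = mean_papr_db(to_time(np.fft.fft(qpsk, axis=1)))  # DFT-precoded (SC-FDMA)
```

The DFT-precoded waveform retains single-carrier-like envelope statistics, so its PAPR comes out a few dB lower; in a handset this translates directly into a more efficient power amplifier operating point.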

Transmitter Requirements: maximum output power (MOP) and maximum power reduction (MPR); frequency error; power control (minimum output power, transmit ON/OFF power, out-of-synchronization handling of output power); control and monitoring functions; occupied bandwidth; UE spectrum emissions mask and ACLR for LTE; spurious emission requirements for LTE; transmit intermodulation; transmit modulation (EVM).

Tests based on these requirements will enable designers to identify and eliminate many typical RF impairments, including I/Q imbalance, PA nonlinearities, oscillator phase noise, and timing jitter in IF/RF sampling and mixing.
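
As a reminder of what the transmit modulation requirement measures, EVM compares the measured constellation against an ideal reference. A minimal sketch, assuming QPSK data and additive noise as the only impairment:

```python
import numpy as np

def evm_percent(measured, reference):
    """RMS error vector magnitude as a percentage of RMS reference power
    (one common normalisation; the LTE specification fixes the details)."""
    err_power = np.mean(np.abs(measured - reference) ** 2)
    ref_power = np.mean(np.abs(reference) ** 2)
    return 100.0 * float(np.sqrt(err_power / ref_power))

rng = np.random.default_rng(7)
ref = (rng.choice([-1.0, 1.0], 5000) + 1j * rng.choice([-1.0, 1.0], 5000)) / np.sqrt(2)
meas = ref + rng.normal(0, 0.05, 5000) + 1j * rng.normal(0, 0.05, 5000)
evm = evm_percent(meas, ref)          # around 7% for this noise level
```

In a real measurement the analyser must first estimate and remove frequency error, timing offset and channel response before computing the residual error vector, which is where most of the implementation effort lies.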

Receiver Requirements: reference sensitivity level; Maximum Sensitivity Reduction (MSR); maximum input level; adjacent channel selectivity (ACS); in-band blocking; out-of-band blocking; narrow band blocking; spurious response; wide band intermodulation; narrow band intermodulation; spurious emissions.

Performance Requirements: dual-antenna receiver capability; antenna correlation and gain imbalance; simultaneous unicast and MBMS operations; and dual-antenna receiver capability in idle mode.

One new challenge facing LTE UE will be the need to handle variable channel bandwidths. All previous 3GPP systems have used a single channel bandwidth, but LTE is being defined with eight different channel bandwidths ranging from 1.4 to 20 MHz. Such flexibility allows for a rich set of new possibilities in deployment. However, this flexibility also presents significant new challenges in the way in which in-channel and out-of-channel requirements are specified, in the number of permutations for testing and in operational aspects related to Radio Resource Management (cell selection/re-selection, handover etc.).

Because of LTE’s variable channel bandwidth, and because a UE will typically be allocated only a subset of the available resource blocks in the channel, it is necessary to define limits on the energy a UE is allowed to transmit in unused resource blocks. The definition and requirements for this in-channel test are still under discussion, but the vector signal analyser plot in Figure 2 shows the principle. This impaired signal was generated by applying 0.1 dB of I/Q gain imbalance distortion in the transmitter. On an OFDM signal, this distortion generates images of the allocated resource blocks in the other half of the signal, equidistant from the centre frequency. The upper plot shows the subcarrier power and the lower plot shows EVM per subcarrier.
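
The image mechanism is easy to reproduce numerically. In the sketch below (illustrative: a single subcarrier on a 64-point FFT grid), a 0.1 dB gain difference between the I and Q paths produces an image at the mirror subcarrier roughly 45 dB below the wanted tone:

```python
import numpy as np

n_fft, k = 64, 10                 # illustrative grid: one subcarrier at bin k
n = np.arange(n_fft)
x = np.exp(2j * np.pi * k * n / n_fft)   # ideal single-subcarrier signal

g = 10 ** (0.1 / 20)              # 0.1 dB gain imbalance on the I path
y = g * x.real + 1j * x.imag      # imbalanced transmitter output

Y = np.abs(np.fft.fft(y)) / n_fft
wanted = Y[k]                     # allocated subcarrier
image = Y[n_fft - k]              # mirror subcarrier at -k
image_db = 20 * np.log10(image / wanted)   # about -45 dB for 0.1 dB imbalance
```

Algebraically, the imbalanced signal is ((g+1)/2)·x + ((g−1)/2)·x*, and it is the conjugate term that lands on the mirror-image subcarrier; this is why limits on energy in unused resource blocks catch I/Q imbalance so effectively.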

Layer 2/3

LTE Layer 2 is split into the following sub-layers: Medium Access Control (MAC), Radio Link Control (RLC) and Packet Data Convergence Protocol (PDCP). The functions of L2 include:

• Mapping between logical channels and transport channels

• Multiplexing/demultiplexing of RLC Packet Data Units (PDU)

• Traffic volume measurement reporting

• Error correction through HARQ

• Priority handling

• Transport format selection

• Segmentation and re-segmentation of PDUs that need to be retransmitted

• Header compression and decompression

• Ciphering of user and control plane data
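
To show the shape of the ciphering task, the toy sketch below XORs data with a keystream derived from a key, a COUNT, a bearer identity and a direction bit, the same kinds of inputs the real confidentiality algorithms take. This is NOT an LTE algorithm (real PDCP ciphering uses SNOW 3G- or AES-based algorithms); it only illustrates that decryption is re-encryption with identical inputs, and why doing this at line rate for very high data volumes is a design challenge:

```python
import hashlib

def keystream(key: bytes, count: int, bearer: int, direction: int, length: int) -> bytes:
    """Toy keystream generator (NOT an LTE algorithm)."""
    out, block = b"", 0
    while len(out) < length:
        seed = key + count.to_bytes(4, "big") + bytes([bearer, direction, block])
        out += hashlib.sha256(seed).digest()
        block += 1
    return out[:length]

def cipher(key: bytes, count: int, bearer: int, direction: int, data: bytes) -> bytes:
    """XOR stream cipher: the same call both encrypts and decrypts."""
    ks = keystream(key, count, bearer, direction, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

key = b"0123456789abcdef"
pdu = b"example PDCP SDU payload"
ct = cipher(key, count=42, bearer=5, direction=0, data=pdu)
pt = cipher(key, count=42, bearer=5, direction=0, data=ct)   # round-trips to pdu
```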

Two significant design challenges will be the ciphering of significant amounts of data in PDCP, and the MAC turnaround time, which at 2 ms is six times faster than for HSDPA. Testing at high throughputs will be necessary to stress and highlight problems in these two key areas.

LTE Layer 3 includes the sub-layers Radio Resource Control (RRC), Mobility Management (MM) and Call Control (CC). L3 essentially deals with the main service connection protocols, such as:

• Broadcast of System Information and Paging

• Establishment, maintenance and release of an RRC connection

• Configuration of signalling radio bearer(s)

• Security functions including ciphering

• Mobility functions such as cell reporting for inter-cell and inter-RAT mobility and handovers, UE cell selection and reselection, and control of cell selection and reselection

• QoS management functions

The detailed specifications behind this broad overview of LTE L2 and L3 are still under discussion. Although early L2/L3 development will be accomplished with full-speed or low-speed simulation, it is not until L2 and L3 are integrated with baseband and the RF at full speed that the integrity of device design can be determined.

Testing The Complete Device

Test solutions for complete devices such as base station emulators with real-time protocol stacks or procedural script-based solutions cannot today be designed without a significant degree of proprietary input to account for the gaps in the specifications. Early solutions will be available within the next six to 12 months, but these will require modification until the specifications are finalised. Unlike previous generations of radio standards, the LTE conformance tests should be available well in advance of commercial service. This should help alleviate the interoperability issues that commonly plague new technologies at introduction. The expected availability of the conformance specifications during 2008 means that test equipment providers will be challenged to provide the necessary test coverage much earlier than would be normal, forcing an overlap with finalising the development of existing test solutions, for example, for HSPA+, EDGE Evolution and WiMAX.

User Experience and Real World Testing

Sustained user demand for new technology or applications is highly dependent on first impressions. The availability of web browsing via slow circuit-switched services or low data-rate early GPRS devices turned many potential users away from “surfing the mobile internet.” It is only now, with the advent of W-CDMA and HSPA, that data applications are gaining credibility. It is critical therefore that LTE delivers from the very start. Voice quality via the packet network needs to be at least as good as current circuit-switched systems, data services need to be both high speed and low latency, and inter-working with legacy systems needs to be seamless. Such perfectly reasonable customer expectations demand a thorough test regime prior to commercial launch.

The early availability of conformance test specifications will help with some of the basic testing, and ensure interoperability, but like today’s conformance tests, they will not be sufficient to ensure the perfect customer experience. As with 2G and 3G devices, much more functional test and verification will be required. While there are several hundred formal conformance tests for 2G and 3G, there are perhaps ten times as many proprietary performance tests used by designers to stress-test UEs in ways similar to how they will be used in real life, using real data in real time.

Throughput

Perhaps the most visible capability LTE aims to provide is a much higher peak data rate: 50 Mbps in the uplink and 100 Mbps in the downlink for a single antenna, rising to over 170 Mbps for 2x2 downlink MIMO. These figures represent the upper limit of the system design, and practical figures will be scaled back as UE capabilities are defined. However, even at significantly reduced rates there will be many design and test challenges to overcome. Although not a UE design issue, cell-edge throughput is very important. It is expected that LTE will be deployed as a single frequency network; however, in order to minimise adjacent-cell interference and maintain cell-edge performance, a pattern of frequency re-use will likely be applied at the cell edges.
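
The arithmetic behind the headline downlink figure can be sketched with assumed Release 8 parameters (100 resource blocks of 12 subcarriers in a 20 MHz channel, 14 OFDM symbols per 1 ms subframe with the normal cyclic prefix, 64QAM). The raw number excludes coding, reference-signal and control overhead, which is why quoted 2x2 figures are nearer 170 Mbps than a clean doubling:

```python
# Back-of-envelope LTE downlink peak rate (assumed parameters; real rates
# are lower once coding and control/reference-signal overhead is removed)
resource_blocks    = 100    # assumed for a 20 MHz channel
subcarriers_per_rb = 12
symbols_per_ms     = 14     # normal cyclic prefix, 1 ms subframe
bits_per_symbol    = 6      # 64QAM

subcarriers = resource_blocks * subcarriers_per_rb
raw_rate_mbps = subcarriers * symbols_per_ms * bits_per_symbol * 1000 / 1e6
mimo_rate_mbps = 2 * raw_rate_mbps   # two spatial streams, 2x2 MIMO, before overhead
```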

Figure 3 shows the centers of all cells using the entire channel bandwidth (yellow), while the border zone of each cell uses a sub-set of the available resource blocks (multiple colors) based on a reuse pattern. Users near the cell centers will be able to utilize the entire channel bandwidth due to physical separation from adjacent cells. Users at the cell edge will be able to obtain good C/I on a sub-set of the channel bandwidth due to frequency clearance. More advanced methods of frequency clearing are possible based on location-specific resource block scheduling at the cell edges.
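
The scheme in Figure 3 can be caricatured in a few lines. The sketch below uses purely illustrative numbers, not anything from the specifications (100 resource blocks, 30 of them reserved for edge users, a reuse-3 pattern), and shows how neighbouring cells can be kept on disjoint edge sub-bands:

```python
# Toy fractional-frequency-reuse plan (illustrative numbers only): all
# cells may schedule any resource block for users near the cell centre,
# but edge users are confined to one of three disjoint edge sub-bands so
# that neighbouring cells do not collide.
N_RB = 100                      # assumed 20 MHz channel
edge_pool = range(30)           # RBs set aside for cell-edge scheduling
subbands = {c: [rb for rb in edge_pool if rb % 3 == c] for c in range(3)}

def edge_rbs(cell_id: int):
    """Reuse-3: neighbouring cells are labelled so their IDs differ mod 3."""
    return subbands[cell_id % 3]
```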

Performance targets for LTE are still to be defined, but it is important that a variety of scenarios are specified in order that performance in different conditions can be understood. The nature of the OFDMA air interface with its variable bandwidth, variable modulation depth, variable resource block allocation and variable adjacent cell interference profile compared to W-CDMA means that the number of possible test combinations is large.

MIMO

Multiple Input Multiple Output (MIMO) is required in order to achieve the headline peak data rates. Two types of MIMO are defined in the LTE specifications. Single User MIMO (SU-MIMO) is where two or more data streams are allocated to one user with the intent of increasing peak data rates. Throughput improves when the radio channel exhibits uncorrelated transmission paths. Multiple User MIMO (MU-MIMO) relies on the same principle of uncorrelated transmission paths, but in this case the paths belong to different users with the intent being to increase the capacity of the cell rather than increase peak data rates. Since MIMO requires multiple transmitters and receivers it was decided for UE cost reasons to only mandate 2x2 SU-MIMO for the downlink. This requires two UE receivers. For the uplink, only 2x2 MU-MIMO is assumed, which avoids the added cost and power consumption of two UE transmitters needed for 2x2 SU-MIMO. Although 4x4 MIMO is defined in the standards, this is probably only going to be practical for PC-based devices. For handheld devices, even the baseline two receiver configuration will place additional demands on battery life, and the extra heat generated will certainly provide additional thermal management design issues.

In the same way that peak data rates are often quoted without reference to the necessary channel conditions, the same is often true for MIMO. The headline figures quoted are usually a linear multiplier on the number of transmission paths. This is the theoretical potential but reality will be determined by the correlation between the paths. MIMO will probably work best indoors where there are slow changing conditions and no line of sight. MIMO cannot function with significant line of sight since it means the paths are highly correlated. In many outdoor environments line of sight is quite normal and at the cell edge, performance benefits are achieved using receive diversity rather than MIMO.
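
The dependence on path correlation can be made concrete with the standard MIMO capacity formula. In the sketch below (assumed 20 dB SNR and idealised channel matrices), a fully correlated, line-of-sight-like 2x2 channel loses the multiplexing gain of the uncorrelated case and falls back toward single-antenna capacity:

```python
import numpy as np

def mimo_capacity(H, snr):
    """Shannon capacity (bps/Hz) with equal power per transmit antenna:
    C = log2 det(I + (SNR/Nt) * H H^H)."""
    nr, nt = H.shape
    m = np.eye(nr) + (snr / nt) * (H @ H.conj().T)
    return float(np.log2(np.linalg.det(m).real))

snr = 100.0                               # 20 dB
H_uncorrelated = np.eye(2)                # two fully independent spatial paths
H_correlated = np.ones((2, 2))            # line-of-sight-like, rank 1
H_siso = np.ones((1, 1))

c_uncorrelated = mimo_capacity(H_uncorrelated, snr)   # ~11.3 bps/Hz
c_correlated   = mimo_capacity(H_correlated, snr)     # ~7.7 bps/Hz
c_siso         = mimo_capacity(H_siso, snr)           # ~6.7 bps/Hz
```

The rank-1 channel still gives some array gain over a single antenna, but nothing like the near-doubling available when the two paths are uncorrelated, which is exactly the indoor-versus-line-of-sight contrast described above.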

MIMO performance targets will be defined for specific channel conditions and although these will be carefully chosen there are reasons these will not be representative of real conditions. Actual performance will be highly dependent on unspecified antenna performance, polarisation aspects, body and head loss and different mechanical use modes as well as the dynamic conditions of the real channel. Antenna performance is further compromised by the need to support multiple frequency bands. With so many variables, specifying performance “over the air” to ensure satisfactory user experience is not realistic. MIMO receiver conformance testing will be straightforward; however, there is little information available today on how this simple form of test and the real world correlate. Real world testing of MIMO performance will be possible in due course with a visit to the local LTE network, although provision of repeatable real world emulation for early R&D using test equipment will prove to be much more challenging.

Battery Life

We live in a world where battery technology is struggling to keep up with ever more power-hungry mobile devices. GSM phones typically have a standby life of approximately seven to ten days, W-CDMA devices three to five days, and WLAN GAN devices using OFDM (albeit with little power control sophistication) are down to one to two days. What will be the battery life of an LTE UE with MIMO, capable of 170 Mbps? Optimising the battery life particularly when transferring at high data rates under realistic channel conditions will be critical to ensuring initial customer acceptance.

Conclusion

The design challenges presented by LTE are significant. However, the difficulties encountered during the introduction of new technology always appear far greater at the time than with hindsight. Fifteen years ago designers struggled with far less computing power, design tools, simulation and test equipment to provide us with GSM, which is now seen as simple compared to the technologies that have followed. And so it is likely to be with LTE.

Agilent’s unique LTE “Connected Solutions” brings together Agilent’s range of signal generation and analysis equipment with the ADS simulation environment to create a comprehensive test solution for the R&D engineer. LTE signals can be created in simulation using the ADS LTE Wireless Library and downloaded to an ESG or MXG vector signal generator to create real-world physical test signals for R&D device testing. UE output can be captured with an Agilent MXA Signal Analyzer, a PSA Series Spectrum Analyzer, or a logic analyzer, and then post-processed using the ADS LTE Wireless Library to perform measurements on RF and mixed-signal DUT hardware. Battery drain can be tested with existing Agilent analysis software and suitable power supplies.

These test solutions are just the start for LTE design and verification, with protocol development, protocol conformance tests and network emulation solutions yet to come. LTE may have many challenges, but with early and powerful test equipment solutions, the LTE challenge can be met.

Sandy Fraser joined Agilent Technologies (formerly Hewlett-Packard) in 2000 as a product marketing engineer. Prior to joining Agilent, he worked as a business development manager for TRAK Inc. for its Military and Space Division. During his career with Agilent Technologies, he has worked with One Box Manufacturing Test Instruments, including the Agilent 8922 and the Agilent E5515B/C. Today he is the product manager for GSM, GPRS, EGPRS and IS-136 test solutions for manufacturing and R&D.