Microwave Journal

5G NR Challenges and Trends in RFFE Design

November 14, 2023

Figure 1 Key capabilities of 5G NR compared to 4G LTE-A.1

The proposed capability enhancements from 4G LTE to 5G new radio (5G NR) are a massive leap designed to boost mobile telecommunications applications and enable many more opportunities. Aside from significantly improving all major performance metrics, the 4G to 5G transition also heralds a more flexible and capable radio architecture with additional mmWave frequency spectrum on top of legacy 4G LTE-Advanced (LTE-A) and new sub-6 GHz frequency bands. 5G also intrinsically supports new use cases beyond enhanced mobile broadband (eMBB), including ultra-reliable low latency communications (uRLLC) and massive machine-type communications (mMTC). Plans also exist to further expand the 5G frequency bands to cover both licensed and unlicensed mmWave spectrum. Moreover, 5G NR allows for both frequency-division duplex (FDD) and time-division duplex (TDD) operation with wider channel bandwidths, user equipment (UE) with increased maximum power, higher-order modulation schemes and multi-antenna architectures. 5G NR RF front-end (RFFE) designers benefit from understanding these trends and aspects of the new RF hardware and technologies needed to address these new challenges.


The 5G NR deployment ramp-up is in full swing, with many organizations striving to achieve 5G NR performance goals. The performance goals for 5G, along with how they compare with 4G, are shown in Figure 1. Figure 2 shows some of the new 5G NR spectrum allocations. The mmWave frequency ranges (FR) that are becoming known as FR2-1 and FR2-2 are of great interest for several reasons: they offer a large amount of available bandwidth, they lack other interfering deployments and the RF hardware elements and antennas are proportionally smaller. Perhaps a bit counterintuitively, proponents view the increased atmospheric attenuation at these frequencies as a benefit that can aid in mitigating interference.


Figure 2 3GPP 5G NR band definitions.

FR2 mmWave technology allows for advanced/active antenna systems (AAS) that are extremely compact, along with sophisticated MIMO and beamforming systems with higher throughput compared to 4G LTE-A technologies. The 5G NR enhancements over 4G LTE-A and the new frequency bands enable greater capacity, connection density, peak data rates and user-experienced data rates. 5G NR also comes with higher-order modulation schemes, new channel coding and additional MIMO layers that support new use cases. Together, these features in the 5G NR standards empower 5G to boost mobility, reduce latency, enable higher network energy efficiency and provide better spectral efficiency than 4G LTE-A.

5G NR introduced the concept of a bandwidth part (BWP), a set of contiguous resource blocks (RBs) that can be set to different transmission bandwidths. Each BWP can have its own numerology and, while multiple BWPs can be defined for a given component carrier (CC), only one BWP can be active at a time in the downlink (DL) and uplink (UL). The introduction of BWPs enables far more flexible use of spectrum based upon individual UE use cases.


Figure 3 5G NR FR1 and FR2-1 frequency bands and subcarrier spacing.

Compared to a maximum channel bandwidth of 20 MHz per CC for 4G LTE-A, 5G NR FR1 can use a 50 MHz maximum channel bandwidth per CC with 15 kHz subcarrier spacing (SCS) and 100 MHz with 30 kHz or 60 kHz SCS. FR2-1 allows for 200 MHz when using 60 kHz SCS and up to 400 MHz when using 120 kHz SCS. The subcarrier spacings for different time slots are shown in Figure 3. Table 1 shows the higher SCS and maximum transmission bandwidth allotments that are being considered for the 52 to 71 GHz frequency range of FR2-2.
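The scaling between numerology, SCS and slot timing follows a simple power-of-two rule; a minimal sketch based on the standard 3GPP scaling (the printout is purely illustrative):

```python
# Sketch of the power-of-two scaling between 5G NR numerology (mu), subcarrier
# spacing and slot timing, per SCS = 15 kHz * 2^mu (standard 3GPP framing).
def numerology(mu):
    scs_khz = 15 * 2**mu           # subcarrier spacing
    slots_per_subframe = 2**mu     # the subframe is fixed at 1 ms
    slot_ms = 1.0 / slots_per_subframe
    return scs_khz, slots_per_subframe, slot_ms

for mu in range(5):  # mu = 0..4 covers 15 kHz (FR1) through 240 kHz (FR2)
    scs, slots, slot_ms = numerology(mu)
    print(f"mu={mu}: SCS={scs} kHz, {slots} slot(s)/subframe, slot={slot_ms:.4f} ms")
```

Higher SCS shortens the slot, which is also why the wider FR2 channels pair with the larger spacings shown in Figure 3 and Table 1.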

Table 1

Figure 4 The three major initial use cases for 5G NR.1


Figure 5 Asynchronous TDD operation where Band X TX could interfere with Band Y RX.

4G LTE-A was essentially designed for mobile broadband, though it was possible to configure 4G LTE-A to support other functions within that framework. 5G NR has additional use case features built into the standard that support new applications for cellular wireless technology. The three initial and key use cases for 5G NR are eMBB, uRLLC and mMTC. Each use case has dedicated specification details and features designed to support it in ways that would not be feasible with a one-size-fits-all solution.

As an example, uRLLC requires much lower latency and higher mobility for vehicle-to-infrastructure (V2I) or vehicle-to-vehicle (V2V) applications for autonomous vehicles or driver safety features. However, uRLLC applications do not prioritize capacity, peak data rate, UE data rate or spectral/network efficiency as much as eMBB applications do. Similarly, mMTC use cases prioritize connection density and network efficiency over other performance metrics to better serve communications among multitudes of machine-type sensors, actuators and beacons. Figure 4 shows a spider diagram of the relative importance of the 5G NR network goals to the three main use cases.


The performance and capability enhancements of 5G NR versus 4G LTE-A bring additional challenges that RFFE designers must address. One of these challenges is TDD asynchronous inter-band operation. This becomes especially important when the transmission noise of one band falls into the receive band of a second band when they share an overlapping time slot. There are over 50 UL versus DL slot allocation formats for use in any one band and there are multiple band combinations where this overlap could occur. The impact of this potential interference on network operation is a function of the noise levels, intermodulation distortion (IMD) products and filter rejection levels of the hardware. Figure 5 shows the extent of this potential problem.
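A minimal sketch of how such conflicting slots could be flagged; the slot patterns here are hypothetical placeholders, not actual 3GPP slot-format definitions:

```python
# Hypothetical sketch of flagging asynchronous TDD conflicts: slots where
# Band X transmits (U) while Band Y receives (D). The patterns below are
# illustrative placeholders, not actual 3GPP slot-format definitions.
def interference_slots(band_x_pattern, band_y_pattern):
    return [i for i, (x, y) in enumerate(zip(band_x_pattern, band_y_pattern))
            if x == "U" and y == "D"]  # Band X TX could desense Band Y RX

band_x = "DDDSUUDDDS"  # D = downlink, U = uplink, S = special/guard
band_y = "DDSUUDDDSU"
print(interference_slots(band_x, band_y))  # slot indices at risk
```

With over 50 formats per band, exhaustively checking every band-combination pairing like this quickly becomes a large search space.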


The increased complexity of additional bands and uplink band combinations of 5G NR compared to previous generations leads to an increased risk for self-desense or self-interference, especially in the sub-6 GHz frequency band. Higher maximum and average power levels with 5G NR TDD, as shown in Figure 6, are other factors to consider. 5G NR presents more than 60 FR1 band definitions with over 3500 carrier aggregation and dual-connectivity (DC) band combinations, along with the potential for asynchronous operation. This includes multi-mode operation with the UE and base stations operating LTE-A and 5G NR transceivers simultaneously (EN-DC). If any of these combinations result in substantial IMD distortion products, leakage, noise or other interference injected into the receiver, the receiver’s sensitivity is reduced.
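The duty-cycle effect shown in Figure 6 can be sketched with the standard dB relationship between peak and average power; the 50 percent duty-cycle value is illustrative:

```python
import math

# Sketch of the TDD duty-cycle trade suggested by Figure 6: if the UE only
# transmits for a fraction of the time, a higher peak power still keeps the
# average power at 23 dBm. The 50 percent duty-cycle value is illustrative.
def peak_dbm_for_avg(avg_dbm, duty_cycle):
    return avg_dbm - 10 * math.log10(duty_cycle)

print(f"peak = {peak_dbm_for_avg(23.0, 0.5):.1f} dBm at 50% duty cycle")  # ~26 dBm
```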


Figure 6 TDD with duty cycle applied to maintain the average power at 23 dBm.


Figure 7 Potential self-desense/self-interference mechanism in a 5G CA transceiver.

Maximum sensitivity degradation (MSD) is the metric in the 5G NR standard that defines the permissible degradation of the receiver’s reference sensitivity (REFSENS) for a particular band combination. This value depends upon parameters like maximum TX power, isolation levels, linearity, bandwidth and carrier frequency. These factors also include self-interference caused by coupling and crosstalk effects between the various functional blocks of the RFIC, RF modules, phone board and the entire RFFE.
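How MSD relates to REFSENS can be illustrated with the textbook sensitivity equation; the NF, SNR and MSD values below are hypothetical, not numbers from the 3GPP specification:

```python
import math

# Illustrative sketch of how MSD relates to reference sensitivity. The noise
# floor term (-174 dBm/Hz) is the standard textbook form; the NF, SNR and
# MSD numbers are hypothetical, not values from the 3GPP specification.
def refsens_dbm(bw_hz, nf_db, snr_db):
    return -174 + 10 * math.log10(bw_hz) + nf_db + snr_db

clean = refsens_dbm(20e6, nf_db=7, snr_db=-1)  # interference-free sensitivity
msd_db = 3.0                                   # allowed degradation for a band combo
print(f"REFSENS = {clean:.1f} dBm; with MSD applied: {clean + msd_db:.1f} dBm")
```

Every dB of MSD allowed for a band combination is a dB of link budget the network gives up, which is why the self-interference mechanisms below matter.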

Figure 7 provides an example of the self-desense effect between two 5G NR FR1 frequency bands. In this case, the UL transmission of Band n78 (3300 to 3800 MHz) couples, possibly at multiple locations in the UL signal routing, with the UL transmission of Band n3 (1710 to 1785 MHz). Band n78 is a TDD band while Band n3 is an FDD band, so the UL and DL of Band n3 are at different frequencies. However, the second-order intermodulation distortion (IMD2) products created by the mixing of the coupled Band n78 UL with the Band n3 UL can fall within the Band n3 DL receive range (1805 to 1880 MHz), as shown in Figure 8. These products may contain enough energy to desensitize the Band n3 receiver, worsening the bit-error rate through SNR degradation and degrading throughput.
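The frequency arithmetic behind this example can be checked directly; a short sketch using the band edges quoted above:

```python
# Sketch of the frequency arithmetic in the n78/n3 example above: the
# difference (second-order) products of the two UL ranges land in the n3 DL.
def imd2_difference_range(f1_lo, f1_hi, f2_lo, f2_hi):
    # span of |f1 - f2| when the f1 range is entirely above the f2 range
    return (f1_lo - f2_hi, f1_hi - f2_lo)

n78_ul = (3300, 3800)  # MHz, Band n78 UL
n3_ul = (1710, 1785)   # MHz, Band n3 UL
n3_dl = (1805, 1880)   # MHz, Band n3 DL

lo, hi = imd2_difference_range(*n78_ul, *n3_ul)
overlaps = lo < n3_dl[1] and hi > n3_dl[0]
print(f"IMD2 difference products span {lo} to {hi} MHz; overlaps n3 DL: {overlaps}")
# The 1515 to 2090 MHz span covers the entire 1805 to 1880 MHz n3 DL range.
```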


Figure 8 PSD in dBm/Hz versus frequency (MHz).

Figure 8 graphically shows the mechanism just described. It combines a plot of the power spectral density (PSD) of the 5G NR Band n3 UL, Band n78 UL and the IMD2. We see that overlapping UL signal components in the transceiver create distortion in the DL of Band n3.

Increased Channel Bandwidth RFIC Impairments Causing Desense

Increased 5G NR channel bandwidths in the sub-6 GHz FR1 bands compared to 4G LTE-A create a similar challenge. The channel bandwidth of many of the FDD LTE-A bands being used in 5G NR FR1 has increased without increasing the duplex spacing. This channel bandwidth increase creates additional opportunities for RFIC impairments to impact the transmission output signal quality. If there is coupling between the transmission and receive signal chains, these impairments could impact the receiver sensitivity. Examples of these impairments could be an image signal generated during frequency conversion because of IQ mismatch or LO leakage. The power amplifiers (PAs) in the transmitter could exacerbate the level and bandwidth of these RFIC impairments in the receive band compared to previous 4G LTE-A levels.

As an example, consider 5G NR FR1 Band n28 with an uplink frequency range of 703 to 748 MHz and a downlink frequency range of 758 to 803 MHz. Band n28 is an FDD band with duplex spacing of 55 MHz and 5, 10, 15, 20 or 30 MHz channel bandwidths. An image signal from IQ mismatch or LO leakage may pass through a nonlinear PA resulting in distortion products that overlap the DL receiver frequency range, which leads to desense. Figure 9 shows this path on an RFFE block diagram and Figure 10 shows the third and fifth odd-order IMD products and image power levels, along with the potential desense of the DL receiver in 5G NR FR1 frequency bands.


Figure 9 Band n28 coupling interference from the transmitter UL to the DL receive signal chain.


Figure 10 IMD products and image power levels leading to DL desense.

As described earlier, the desense phenomenon is influenced by channel bandwidth. Figure 11 shows simulated RFIC output, including impairments and nonlinearities, along with PA transmitter output for a 20 MHz and a 30 MHz channel bandwidth. The red rectangle represents the UL and the blue rectangle represents the DL frequency ranges for 5G NR FR1 Band n28. Figure 11 shows that a 20 MHz channel bandwidth causes limited DL receiver desense because only a fraction of the final nonlinear products can interfere with the DL frequency range. However, with a 30 MHz channel bandwidth, a larger portion of the fifth-order IMD coming from the RFIC falls in the Band n28 DL. Both the IMD3 and IMD5 products, originating in the RFIC and amplified by the PA, are worse for the increased bandwidth case enabled by 5G NR.
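A simplified model of this bandwidth dependence, assuming odd-order regrowth occupies (order × channel bandwidth) centered on a carrier placed at the top of the UL band; this is an illustrative assumption, not the simulation setup behind Figure 11:

```python
# Simplified sketch of the Band n28 bandwidth dependence described above. It
# assumes odd-order regrowth occupies (order x channel bandwidth) centered on
# a UL carrier placed at the top of the band; an illustrative model only.
def imd_overlap_mhz(chan_bw, order, ul_top=748.0, dl=(758.0, 803.0)):
    carrier = ul_top - chan_bw / 2           # highest possible carrier position
    imd_lo = carrier - order * chan_bw / 2   # regrowth lower edge
    imd_hi = carrier + order * chan_bw / 2   # regrowth upper edge
    return max(0.0, min(imd_hi, dl[1]) - max(imd_lo, dl[0]))

for bw in (20, 30):
    print(f"{bw} MHz channel: IMD3 overlap = {imd_overlap_mhz(bw, 3):.0f} MHz, "
          f"IMD5 overlap = {imd_overlap_mhz(bw, 5):.0f} MHz")
```

Under these assumptions the 30 MHz case puts noticeably more IMD3 and IMD5 energy into the DL range than the 20 MHz case, consistent with the trend in Figure 11.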


Figure 11 Plots of simulated RFIC and PA transmitter output.

Increased RF Bandwidth and Channel Bandwidth Amplifier Considerations

Besides the potential for self-desense and self-interference, other challenges emerge because of increased RF and channel bandwidths. Selecting appropriate low noise amplifiers (LNAs) for the extremely wide 5G NR FR1 Bands n77 (3300 to 4200 MHz), n78 (3300 to 3800 MHz) and n79 (4400 to 5000 MHz) is a hardware consideration. Each band also supports channel bandwidths as high as 100 MHz per component carrier. There are several options for LNAs in these frequency ranges, each with its advantages and trade-offs. Common source LNAs with inductive degeneration exhibit low noise figures (NFs), but they also have relatively narrow fractional bandwidths. The NFs of common gate LNAs with inductive degeneration are slightly inferior to common source LNAs, but they have wider fractional bandwidths. A programmable LNA is another option, though a designer would need to consider that the tuned performance of these LNAs depends on the carrier frequency. Lastly, a wideband LNA with low NF and high gain may also be suitable, though additional filtering may be necessary. Reasonable wideband gain results from combining multiple LNAs with slightly overlapping bandwidths. This technique is shown in Figure 12.
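One reason NF weighs so heavily in these LNA choices is the Friis cascade relationship, in which the first stage dominates the chain; a textbook sketch with illustrative stage values:

```python
import math

# Textbook Friis cascade sketch (not from the article): the first stage's
# noise figure dominates the receive chain, which is why the LNA topology
# trade-offs above center on NF. Stage values are illustrative.
def cascade_nf_db(stages):
    """stages: list of (nf_db, gain_db) tuples, first stage first."""
    total_f = 0.0
    cum_gain = 1.0
    for i, (nf_db, gain_db) in enumerate(stages):
        f = 10 ** (nf_db / 10)
        total_f += f if i == 0 else (f - 1) / cum_gain
        cum_gain *= 10 ** (gain_db / 10)
    return 10 * math.log10(total_f)

# A 1.2 dB NF / 18 dB gain first LNA followed by a noisier 9 dB NF second
# stage: the cascade NF stays close to the first stage's NF.
print(f"cascade NF = {cascade_nf_db([(1.2, 18), (9.0, 10)]):.2f} dB")
```

This is also why the overlapping-LNA approach of Figure 12 works: each carrier frequency still sees a low-NF, high-gain first stage within its sub-band.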


Figure 12 Overlapping LNA gain and NF to increase bandwidth.


Figure 13 High-level schematic of an ET modulator and PA.

There are other considerations with PAs. High efficiency is desirable, current consumption is a significant concern and both challenges become more difficult with wider channel bandwidths. This is especially true for Bands n77, n78 and n79. Envelope tracking (ET) is a common technique to achieve reasonable levels of efficiency. Figure 13 shows a simple ET circuit, but this circuitry and design becomes more complex when channel bandwidths increase beyond 100 MHz. Additionally, it is difficult to avoid asymmetric adjacent channel leakage ratio (ACLR) issues with many modern PA technologies because of memory effects in the PA.
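The basic ET idea, a supply that tracks the signal envelope instead of staying fixed, can be sketched with a toy model; the waveform, 5 V rail and clamp values are illustrative only:

```python
import math

# Toy sketch of the envelope-tracking idea: the PA supply follows the signal
# envelope (clamped to a minimum) instead of staying fixed, reducing wasted
# headroom. The waveform, 5 V rail and clamp values are illustrative only.
def et_supply(envelope, v_max=5.0, v_min=1.0):
    return [max(v_min, v_max * e) for e in envelope]

env = [abs(math.sin(2 * math.pi * t / 20)) for t in range(20)]  # normalized envelope
supply = et_supply(env)
fixed_waste = sum(5.0 - 5.0 * e for e in env)             # headroom wasted, fixed rail
et_waste = sum(v - 5.0 * e for v, e in zip(supply, env))  # headroom wasted, ET rail
print(f"wasted headroom: fixed = {fixed_waste:.1f}, ET = {et_waste:.1f}")
```

The ET modulator must reproduce this envelope at the full channel bandwidth, which is why the wider 5G NR channels make the modulator design in Figure 13 harder.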



Figure 14 Example of an RB placement solution for spectrum allocation.


Figure 15 Shifting the UL carrier frequency to minimize self-desense for cell-edge handsets.

Some methods exist to address these RFFE design challenges, while others still require development. One solution to tackle potential DL desense issues is to use network optimization techniques that involve spectrum allocation with RB placement. This could be part of a larger, intelligent throughput-driven spectrum allocation scheme. Using RB placement for 5G NR FR1 spectrum allocation can minimize UL interactions that could desense DL receivers in the same band or nearby bands. An example of an RB placement solution for spectrum allocation to minimize DL desense occurrences with 5G NR FR1 is shown in Figure 14.

RB placement can be done with relatively simple algorithms and lookup tables. However, cognitive radio techniques have also been proposed to handle real-time spectrum allocation challenges and minimize DL desense more intelligently. Using machine learning/artificial intelligence with cellular resource allocation could enable better spectrum optimization. This might allow planners to consider cellular activity and potential interference from other wireless networking technologies and noise/interference generators. This requires substantial development in cognitive radio technology and protocols for facilitating cognitive networking and cognitive radio interactions.
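A lookup-table-style RB placement heuristic might be sketched as follows; the RB grid, candidate allocations and the simplified IMD3 model are all hypothetical:

```python
# Hypothetical sketch of lookup-table-style RB placement: among candidate UL
# allocations, prefer the one whose (simplified) IMD3 regrowth overlaps the
# DL the least. The RB grid, frequencies and candidates are illustrative.
RB_MHZ = 0.36  # 1 RB = 12 subcarriers x 30 kHz SCS

def imd3_dl_overlap(start_rb, num_rb, ul_low=703.0, dl=(758.0, 803.0)):
    f_lo = ul_low + start_rb * RB_MHZ
    f_hi = f_lo + num_rb * RB_MHZ
    center, bw = (f_lo + f_hi) / 2, f_hi - f_lo
    imd_lo, imd_hi = center - 1.5 * bw, center + 1.5 * bw  # IMD3 spans 3x BW
    return max(0.0, min(imd_hi, dl[1]) - max(imd_lo, dl[0]))

candidates = [(0, 50), (30, 50), (60, 50)]  # (start RB, number of RBs)
best = min(candidates, key=lambda c: imd3_dl_overlap(*c))
print(f"best placement: {best}, DL overlap = {imd3_dl_overlap(*best):.2f} MHz")
```

A real scheduler would weigh throughput and fairness alongside interference, which is where the cognitive and machine learning approaches described above come in.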

For specific combinations that may suffer desense or interference from IMD, the UL carrier frequency could be shifted slightly to minimize self-desense for cell-edge handsets. Referencing the example presented in Figure 15 with 5G NR FR1 Band n28, a slight shift of the UL carrier frequency would shift the IMD3 product that would normally overlap with Band n28. This would also allow for increased DL channel bandwidth as shown in Figure 15.

Additional RFFE Technology Developments

To address the changes and growing expectations for 5G NR performance and capability, additional developments are needed in RFFE hardware and systems. To accommodate multi-antenna AAS technology, these developments need to be extremely compact, readily integrated into panelized antenna solutions and more efficient. Wider bandwidth LNAs and PAs are needed to tackle the higher channel bandwidths possible with FR1 frequency bands. Increased UL power requires high power tolerance and high efficiency PA designs. More complex 5G NR modulation schemes increase the need for RF block designs with lower error vector magnitudes to ensure that those schemes can be successfully implemented. Reaching the speeds and fidelity needed for 5G NR means linearity must improve for switches, PAs and LNAs to reduce the chance of self-desense. For 5G NR, mitigating self-desense and self-interference solely through RFFE component performance may be impossible. In the long term, the solution may involve adopting intelligent interference mitigation strategies.


The 5G NR specifications are consistently pushing the boundaries of wireless networking performance and adopting new features and use cases. These advances, though likely to continue to usher in a new age of connectivity, are also placing new burdens and creating additional challenges that RFFE designers and network optimization engineers must tackle. Ultimately, new strategies and device design/development are needed to address these challenges, but these solutions must also be extremely compact, efficient and cost-effective.


  1. “IMT Vision – Framework and Overall Objectives of the Future Development of IMT for 2020 and Beyond,” International Telecommunication Union, ITU-R Recommendation M.2083-0, Sept. 2015, Web: https://www.itu.int/dms_pubrec/itu-r/rec/m/R-REC-M.2083-0-201509-I!!PDF-E.pdf.
  2. “NR; User Equipment (UE) Radio Transmission and Reception; Part 2: Range 2 Standalone,” 3GPP, Technical Specification, 38.101-2 V17.7.0.