Modern Radar systems also employ more complex pulse modulation signal formats to improve range resolution and lower the probability of intercept and jamming. Many Radar and Satcom systems operate at microwave frequencies (e.g., X, Ku and Ka bands), which support wider modulation bandwidths and increased capacity while also allowing smaller antennas.
In some cases, the wide bandwidths required exceed the intermediate frequency (IF) bandwidths of commercially available RF spectrum analyzers and vector (or FFT) signal analyzers. Coupled with the higher operating frequencies, this creates a significant set of challenges for RF engineers testing Radar and Satcom transmitters.
Quickly, accurately and cost-effectively measuring the performance of RF/microwave transmitters in today’s Radar and Satcom applications is a challenging task. In some cases (e.g., measuring a Satcom transmitter’s Error Vector Magnitude (EVM)), the transmitter output can’t be measured directly. Engineers often rely on custom-built down-converter hardware to translate the RF/microwave signal to an IF that can be measured with commercial off-the-shelf (COTS) test equipment. Unfortunately, the non-recurring engineering costs of designing, building and testing that hardware can be counterproductive. The down-converter also adds its own RF impairments and distortion, which contribute to the overall EVM being measured and can mask the actual performance of the transmitter under test, making it difficult to discern how much of the EVM comes from the transmitter itself. With no other option available, many RF engineers are left with the measurement uncertainty that comes from this less-than-ideal approach.
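To make the EVM figure concrete, the sketch below shows the standard RMS EVM computation on a handful of hypothetical QPSK symbols (plain Python, no instrument involved): the error vector is the difference between each measured symbol and its ideal constellation point, and RMS EVM is the error power relative to the reference power. Any distortion added by a down-converter in the measurement path inflates this same error term, which is why it is hard to separate from the transmitter's own contribution.

```python
import math

# Hypothetical unit-power QPSK constellation: points (+/-1 +/- 1j) / sqrt(2).
QPSK = [complex(i, q) / math.sqrt(2) for i in (1, -1) for q in (1, -1)]

def nearest(symbol, constellation):
    """Hard-decision: map a received symbol to the closest ideal point."""
    return min(constellation, key=lambda p: abs(symbol - p))

def evm_rms_percent(measured, reference):
    """RMS EVM in percent: sqrt(error power / reference power) * 100."""
    err_power = sum(abs(m - r) ** 2 for m, r in zip(measured, reference))
    ref_power = sum(abs(r) ** 2 for r in reference)
    return 100.0 * math.sqrt(err_power / ref_power)

# Illustrative data: ideal symbols with a small fixed offset error added.
ideal = [QPSK[k % 4] for k in range(8)]
measured = [s + (0.02 + 0.01j) for s in ideal]
reference = [nearest(m, QPSK) for m in measured]
print(f"EVM = {evm_rms_percent(measured, reference):.2f}%")  # EVM = 2.24%
```

Note that when the down-converter's impairments are uncorrelated with the transmitter's, their EVM contributions combine roughly root-sum-square, so a test-fixture EVM comparable to the transmitter's own quickly dominates the measurement.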