Cost pressure, increasing SATCOM dependence and a "do more with less" mantra coming from the halls of the Pentagon, government agencies and commercial satellite owner/operators are leading to innovative thinking from vendors and users alike. Eric DeMarco, President and CEO of Kratos Defense & Security Solutions, has said, "In Operation Desert Storm in 1991, the military used approximately 140 bits per second (bps) of satellite bandwidth per deployed person. During Operation Noble Anvil in Kosovo, usage by the U.S. component of the mission increased to almost 3,000 bps per person. In Operation Enduring Freedom in Afghanistan, bps usage per person increased to approximately 8,300, and by Operation Iraqi Freedom in 2004, bps per person had reached 13,800." In more recent years, this sort of bandwidth increase by the U.S. military and national security agencies has only continued, with the same trends appearing in the commercial sector as well.

The need to drive down costs while increasing efficiency and system lifespan is leading to designs that include redundant, fault-tolerant systems with the ability to share network resources. Cross-links between satellites, for example, allow them to serve as backups for one another. If the star tracker failed on one satellite, another in its constellation could provide the pointing data needed for all of them to continue performing their functions. Or, if one satellite failed completely, the others could detect its loss, and its mission activities would automatically be parceled out to the remaining functioning satellites until a replacement could be injected into the affected constellation.
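To make the failover behavior concrete, the following Python sketch redistributes a failed satellite's tasks across its healthy constellation peers. It is purely illustrative; the article describes the concept, not any particular implementation, and all names here are hypothetical.

```python
# Minimal sketch of constellation-level failover: when a satellite is lost,
# its mission tasks are parceled out to the remaining healthy satellites.
# Illustrative only; not based on any real ground-system API.

def redistribute_tasks(constellation, failed_id):
    """Reassign the failed satellite's tasks round-robin to healthy peers."""
    failed = constellation[failed_id]
    healthy = [sat for sat_id, sat in constellation.items()
               if sat_id != failed_id and sat["healthy"]]
    if not healthy:
        raise RuntimeError("no healthy satellites left to absorb tasks")
    for i, task in enumerate(failed["tasks"]):
        healthy[i % len(healthy)]["tasks"].append(task)
    failed["tasks"] = []

constellation = {
    "SAT-1": {"healthy": False, "tasks": ["imaging", "relay"]},
    "SAT-2": {"healthy": True,  "tasks": ["weather"]},
    "SAT-3": {"healthy": True,  "tasks": []},
}
redistribute_tasks(constellation, "SAT-1")
print(constellation["SAT-2"]["tasks"])  # ['weather', 'imaging']
print(constellation["SAT-3"]["tasks"])  # ['relay']
```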

These trends, along with recent efforts in the commercialization of space, affect everything from R&D to deployment and from testing to training. In the R&D space, for example, the latest hardware-in-the-loop instruments and tools for RF link simulation and modeling are enhancing traditional R&D methods. In deployment, interference detection, geolocation and mitigation techniques are rapidly moving forward. Tools being employed in these areas are being designed for multi-purpose use, including test and training.

In response to these trends, vendors are providing advanced, commercially available technology solutions that offer greater efficiency at lower cost and are easier to operate, manage and upgrade. The remainder of this article briefly summarizes several of the latest technology advancements, including channel simulation, RF interference detection and mitigation, geolocation, high-efficiency, low-cost power amplifiers and increased frequencies and data rates for critical deployable communications systems.

Figure 1 Example Doppler shift curve for a LEO satellite signal received at an Earth terminal.

Channel Simulation

Satellite communication systems involve transmitters and receivers that are constantly moving with respect to one another, routinely operate in harsh environments and are separated by great distances. As a result, these radio signals undergo time-varying Doppler shift (affecting both the carrier and the modulated signal), path loss and path delay, as exemplified in Figure 1. These signals are also subject to atmospheric noise and weather-related perturbations, as well as accidental interference and intentional jamming.
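The first-order magnitudes of these effects follow directly from geometry. The short Python sketch below computes carrier Doppler shift, free-space path loss and path delay for an assumed LEO pass; the carrier frequency, velocity and slant range are illustrative values, not figures from the article.

```python
# First-order LEO downlink channel effects (illustrative parameters).
import math

C = 299_792_458.0  # speed of light, m/s

def doppler_shift_hz(f_carrier_hz, v_radial_ms):
    """Received-frequency offset; positive when the range is closing."""
    return f_carrier_hz * v_radial_ms / C

def free_space_path_loss_db(f_carrier_hz, slant_range_m):
    return 20 * math.log10(4 * math.pi * slant_range_m * f_carrier_hz / C)

def path_delay_s(slant_range_m):
    return slant_range_m / C

f, v, d = 2.2e9, 7_500.0, 1_000e3  # 2.2 GHz, 7.5 km/s, 1000 km slant range
print(f"Doppler: {doppler_shift_hz(f, v) / 1e3:.1f} kHz")   # ~55.0 kHz
print(f"FSPL:    {free_space_path_loss_db(f, d):.1f} dB")   # ~159.3 dB
print(f"Delay:   {path_delay_s(d) * 1e3:.2f} ms")           # ~3.34 ms
```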

Imperfect satellite transponders also impact SATCOM signals with frequency-dependent group delay, phase and amplitude responses, to name a few. Multipath also plays an important role in signal quality, particularly where buildings and terrain cause motion- and position-dependent constructive and destructive interference at the receiving end of a terrestrial vehicle link (e.g., Rayleigh and Rician fading).
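Fading of this kind is routinely modeled statistically. As one example, the sketch below draws complex channel gains with Rician statistics for an assumed K-factor (the ratio of line-of-sight to scattered power); the parameter values are illustrative.

```python
# Rician fading channel gains: a dominant line-of-sight path plus
# Gaussian scattered multipath, normalized to unit mean power.
import numpy as np

def rician_fading_gains(n, k_factor_db, rng=None):
    """Return n complex channel gains with Rician statistics."""
    rng = rng or np.random.default_rng()
    k = 10 ** (k_factor_db / 10)            # LOS-to-scattered power ratio
    los = np.sqrt(k / (k + 1))              # deterministic LOS amplitude
    sigma = np.sqrt(1 / (2 * (k + 1)))      # per-axis scattered std dev
    scatter = sigma * (rng.standard_normal(n) + 1j * rng.standard_normal(n))
    return los + scatter

gains = rician_fading_gains(100_000, k_factor_db=6.0)
print(f"mean channel power ~ {np.mean(np.abs(gains) ** 2):.3f}")  # ~1.000
```

Setting the K-factor to a large negative dB value (no line-of-sight power) reduces this to the Rayleigh case.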

Each of these factors, singly and in combination, must be rigorously simulated during SATCOM system design. To do so, engineers initially employ powerful, physics-compliant software-based simulation and modeling of these signal effects against planned receiver and transmitter specifications, link budget requirements, antenna positions and gain patterns, and a wide variety of other key considerations.
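At the heart of such modeling sits the link budget. The fragment below reduces a downlink budget to its classic dB-domain form, C/N0 = EIRP - path loss - other losses + G/T - 10*log10(k); every input value is an illustrative assumption rather than a figure from the article.

```python
# Bare-bones downlink budget in the dB domain (illustrative inputs).
BOLTZMANN_DBW = -228.6  # 10*log10(Boltzmann constant), dBW/(Hz*K)

def cn0_dbhz(eirp_dbw, path_loss_db, misc_losses_db, g_over_t_dbk):
    """Carrier-to-noise-density ratio at the receiver, in dB-Hz."""
    return (eirp_dbw - path_loss_db - misc_losses_db
            + g_over_t_dbk - BOLTZMANN_DBW)

# 50 dBW EIRP, 205.5 dB path loss, 2 dB misc losses, 20 dB/K terminal G/T
print(f"C/N0 = {cn0_dbhz(50.0, 205.5, 2.0, 20.0):.1f} dB-Hz")  # 91.1 dB-Hz
```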

As the development cycle proceeds, software-based simulation and modeling give way to hardware/firmware/software testing of the actual transmitters and receivers, so that these systems can be exercised rigorously with real RF signals. Physics-compliant RF channel simulator instruments that add deep hardware-in-the-loop test capability facilitate this testing. Typically, such simulators are inserted in the RF path between the transmitters and receivers under test, and are driven by the same high fidelity simulation and modeling tools used earlier in the design.

In this way, channel simulators create dynamic, nominal and worst-case RF paths in the lab for emulating signals between fixed or moving terrestrial assets and space platforms (satellites, manned vehicles, rockets, etc.), inter-linkages between platforms, links between space vehicles and atmospheric assets (UAVs, missiles, aircraft, etc.) and atmospheric vehicle links to terrestrial assets.
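Internally, the core operation of a digital channel simulator can be pictured as applying gain, Doppler and delay trajectories to complex baseband samples. The toy sketch below applies static values of each to an IQ stream; real instruments apply time-varying, fractional-sample versions of the same operations, and all parameters here are assumed for illustration.

```python
# Toy channel emulation on complex baseband samples: path-loss gain,
# carrier Doppler as a frequency offset, and integer-sample path delay.
import numpy as np

def apply_channel(iq, fs_hz, gain_db, doppler_hz, delay_s):
    n = np.arange(len(iq))
    gain = 10 ** (gain_db / 20)                   # amplitude scale factor
    shifted = iq * np.exp(2j * np.pi * doppler_hz * n / fs_hz)
    delay_samples = int(round(delay_s * fs_hz))   # integer-sample delay only
    delayed = np.concatenate([np.zeros(delay_samples, dtype=complex),
                              shifted])[:len(iq)]
    return gain * delayed

fs = 1e6                                                   # 1 Msps
tone = np.exp(2j * np.pi * 10e3 * np.arange(4096) / fs)    # 10 kHz test tone
faded = apply_channel(tone, fs, gain_db=-20.0, doppler_hz=5e3, delay_s=50e-6)
print(f"output power: {np.mean(np.abs(faded) ** 2):.4f}")  # ~0.01 (-20 dB)
```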

Channel simulation technologies advance as the resolution and speed of ADCs and DACs improve, and as DSP speeds and DSP resource availability climb. Intuitive visual user interfaces let channel simulator users focus on their jobs, not on eliciting desired behavior from their instruments. These advancements, along with extended RF frequency coverage and wider instrument bandwidths, have caused a marked jump in channel simulator use in R&D, test and training activities worldwide over the past several years. RF channel simulators are increasingly finding their way into signals, interference and operations situations because they create RF signals that precisely mimic those that occur in nature under various conditions.

RF Interference Detection

Satellite communication is critical to economic and national security. With the proliferation of satellites, aircraft, UAVs and other platforms requiring radio linkages, the skies are jam-packed with radio signals, each of which is increasingly compromised by natural, accidental and intentional interference. This interference threatens the integrity, quality and speed of these links, and therefore, the very missions they support.

Figure 2 Example interference detection plot showing authorized and unauthorized signals.

As a result, we have seen rapid development and deployment of systems that continuously monitor SATCOM links for even the smallest signs of interference or channel abuse. These monitoring systems have moved well beyond spectrum analyzers running simple spectral masks that define nominal frequency and amplitude characteristics. Instead, modern interference detection systems employ sophisticated DSP techniques to detect and characterize even the smallest and most transient anomalies, including unauthorized signals appearing below authorized signals as in Figure 2.

These interference detection systems also log results and instantly notify the appropriate automated systems and personnel when unauthorized signals appear, or when critical nominal signal parameters, such as EIRP, C/N0, Es/N0, center frequency and occupied bandwidth, are violated. Often employing multiple remote sensors, today's interference detection systems combine signal data from geographically dispersed fixed and moving nodes into a single display for real-time overall RF situational awareness.
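The sketch below illustrates one such parameter check in simplified form: estimating a carrier's center frequency and 99 percent occupied bandwidth from a measured power spectrum and flagging violations against nominal limits. The thresholds and the Gaussian test spectrum are hypothetical; operational systems track many more parameters, as noted above.

```python
# Simplified carrier parameter check: center frequency and 99% occupied
# bandwidth from a power spectrum, with alerts on out-of-limit values.
import numpy as np

def measure_carrier(freqs_hz, psd_linear, fraction=0.99):
    """Power centroid and the contiguous span holding `fraction` of power."""
    total = psd_linear.sum()
    cum = np.cumsum(psd_linear) / total
    lo = freqs_hz[np.searchsorted(cum, (1 - fraction) / 2)]
    hi = freqs_hz[np.searchsorted(cum, 1 - (1 - fraction) / 2)]
    centroid = (freqs_hz * psd_linear).sum() / total
    return centroid, hi - lo

def check_carrier(freqs_hz, psd, nominal_fc_hz, nominal_obw_hz,
                  tol_fc_hz=1e3, tol_obw=0.10):
    fc, obw = measure_carrier(freqs_hz, psd)
    alerts = []
    if abs(fc - nominal_fc_hz) > tol_fc_hz:
        alerts.append(f"center frequency off by {fc - nominal_fc_hz:.0f} Hz")
    if abs(obw - nominal_obw_hz) > tol_obw * nominal_obw_hz:
        alerts.append(f"occupied bandwidth {obw / 1e3:.0f} kHz "
                      f"vs nominal {nominal_obw_hz / 1e3:.0f} kHz")
    return alerts

freqs = np.linspace(-500e3, 500e3, 2048)
psd = np.exp(-0.5 * ((freqs - 2e3) / 60e3) ** 2)  # drifted Gaussian carrier
print(check_carrier(freqs, psd, nominal_fc_hz=0.0, nominal_obw_hz=240e3))
```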

Further development of such systems is proceeding rapidly, with key focus on faster detection of a broader range of interference types across wider frequency segments. Reductions in system size, complexity and price are anticipated, along with improved ease of use. Tighter integration with related automatic functions, such as interference mitigation, traffic re-routing and signal geolocation, is also on the horizon.

Addressing the continual need for economic efficiency, many interference detection systems are multi-purposed, and are applicable toward a broad range of operational, training and test requirements. This is achieved in part with integrated hardware channel simulators and signal generators capable of injecting physics-compliant target and/or interference signals indistinguishable from their real-world counterparts.

As interference detection system capabilities advance, and as automatic avoidance techniques are developed, the physical size of interference detection systems will decrease. This will spawn "built-in" interference detection and mitigation capabilities, rendering communication system receivers, for example, self-aware of interference and capable of taking link-restoring corrective action without user intervention.

Geolocation

In addition to carrier monitoring and RF interference detection, the ability to accurately locate the source of interference is becoming more important. Interference incidents are on the rise, and will continue to grow due to the proliferation of satellite-based services, such as VSAT networks, the emergence of personal satellite communications and the ever-increasing congestion of the geostationary arc.

The most common use of geolocation tools is locating the source of an interfering signal. More often than not, interference incidents are accidental in nature. Equipment failure, a poorly pointed VSAT antenna, the use of an incorrect frequency or satellite, or RF spill-over from a poorly specified ground station transmitting toward an adjacent satellite are typical causes.

Increasingly though, geolocation tools are used to locate intentional jammers who aim to disrupt specific TV or radio broadcasts, command and control links or data communications. Geolocation systems can also be used to detect unauthorized users who pirate bandwidth for their own use. Geolocation technology has advanced a great deal, including improvements in usability, accuracy, processing speed and integration with adjacent tools. New measurement techniques and analysis of more advanced signals are also appearing in the market. The flexibility and scalability of these tools continue to improve as well, allowing their incorporation into a widening range of applications, including transportable and mobile systems.
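While the article does not detail specific measurement techniques, a widely used one in this class of system is time-difference-of-arrival (TDOA) processing: copies of the interferer received through two adjacent satellites are cross-correlated, and the correlation peak yields a differential delay that constrains the source to a line of position on the ground. The sketch below shows the core correlation step with synthetic signals; all parameters are illustrative.

```python
# TDOA estimation by cross-correlation (synthetic example).
import numpy as np

def estimate_tdoa_s(sig_a, sig_b, fs_hz):
    """Lag, in seconds, that best aligns sig_b with sig_a."""
    corr = np.correlate(sig_a, sig_b, mode="full")
    lag = np.argmax(np.abs(corr)) - (len(sig_b) - 1)
    return lag / fs_hz

fs = 1e6
rng = np.random.default_rng(0)
s = rng.standard_normal(4096) + 1j * rng.standard_normal(4096)
delayed = np.concatenate([np.zeros(37, dtype=complex), s])[:4096]
print(f"TDOA ~ {estimate_tdoa_s(delayed, s, fs) * 1e6:.0f} us")  # ~37 us
```

In practice, a frequency-difference-of-arrival (FDOA) measurement is usually combined with the TDOA to resolve the source location to a point.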

Advancements in User Terminals

In recent years, increasing bandwidth demands from remote users have spurred advances in VSAT technology. Improvements in satellite EIRP and G/T have enabled the use of smaller, low-gain user terminal antennas. In conjunction with these satellite performance improvements, the smaller antennas meet performance benchmarks that once required larger antennas. User terminals now provide significant throughput and availability in compact form factors (MANPACK, aircraft mounted, vehicle mounted, etc.).

User terminal advances leverage these space segment performance gains to improve the quality and quantity of information available to the remote user, whether fixed or mobile. Solutions are being packaged in size, weight and power profiles that enable communications support for users who were previously underserved or not served at all. Advances in waveforms and coding, embodied in standards such as DVB-S2 with adaptive coding and modulation (ACM), have decreased the power required to close a link and have improved link quality and availability. Improvements in RF amplifier size, weight and power consumption have produced SSPA/BUC elements that deliver significant transmit power in a small form factor compatible with battery-operated terminals and challenging user environments.
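The ACM idea behind DVB-S2 can be sketched in a few lines: the modem picks the highest-throughput modulation and coding combination (MODCOD) whose required Es/N0 the current link supports. The thresholds and spectral efficiencies below are approximate ideal AWGN values for a handful of MODCODs; a real modem adds implementation margins and supports many more combinations.

```python
# Simplified DVB-S2 ACM MODCOD selection (approximate ideal thresholds).
MODCODS = [  # (name, required Es/N0 in dB, spectral efficiency in bit/s/Hz)
    ("QPSK 1/2",    1.0, 0.99),
    ("QPSK 3/4",    4.0, 1.49),
    ("8PSK 3/4",    7.9, 2.23),
    ("16APSK 3/4", 10.2, 2.97),
]

def select_modcod(esn0_db, margin_db=1.0):
    """Highest-efficiency MODCOD that closes with the given margin."""
    usable = [m for m in MODCODS if esn0_db - margin_db >= m[1]]
    return max(usable, key=lambda m: m[2]) if usable else None

name, req_db, eff = select_modcod(esn0_db=8.5)
print(f"{name}: {eff} bit/s/Hz, {8.5 - req_db:.1f} dB above threshold")
# -> QPSK 3/4: 1.49 bit/s/Hz, 4.5 dB above threshold
```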

Figure 3 Data service rates for airline carry-on satellite communications kits.

Figure 3 uses an airline carry-on sized terminal package to illustrate the significant performance improvements gained in recent years. About 10 years ago, the most efficient terminals of this class supported connection speeds of 64 Kbps. A little more than five years ago, connection speeds had increased significantly, but only to about 492 Kbps. Today, due to the advancements discussed above, users can access approximately 9 Mbps, a dramatic increase in the capability delivered to the end user.

Power Amplifiers

To cut costs, the industry is relying on smaller antennas, yet at the same time, users are demanding higher data rates and increased availability. Satisfying both requirements has been a problem in the past due to available power limitations. Traditionally, engineers had two choices: solid-state power amplifiers (SSPA) or traveling wave tube amplifiers (TWTA). While older SSPAs offered many application advantages, including better linearity, reliability, noise performance and lower cost of ownership, they provided poor efficiency for their size and weight. On the other hand, relying on older, vacuum-based TWTAs was costly and difficult to manage.

In the last year, 25 to 200 W Ku-band SSPAs have been introduced to the market. These products offer an unprecedented combination of small size and efficiency, equaling or surpassing the efficiency of typical TWTAs. The new SSPAs achieve typical saturated efficiencies ranging from 30 percent for the lower power amplifiers to 24 percent for the 200 W offerings, dramatic improvements given that SSPAs historically operated at 10 percent efficiency or lower. Their weight ranges from three pounds for the 25 W model to 23 pounds for the 200 W model. The most advanced SSPAs match TWTAs specification-for-specification in saturated efficiency and size/weight, offer much better linearity and superior size, weight and power consumption for equivalent linear power, and retain their traditional advantages in reliability, power savings, noise and total cost of ownership.
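A quick calculation shows what those efficiency figures mean for prime power, using the article's numbers for the 200 W amplifier:

```python
# DC prime power implied by the quoted RF output and saturated efficiency.
def dc_input_w(rf_out_w, efficiency):
    return rf_out_w / efficiency

for eff in (0.24, 0.10):  # new 200 W SSPA vs. a legacy 10%-efficient design
    print(f"200 W RF at {eff:.0%} efficiency -> {dc_input_w(200, eff):.0f} W DC")
# 200 W RF at 24% efficiency -> 833 W DC
# 200 W RF at 10% efficiency -> 2000 W DC
```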

SUMMARY

Increasing SATCOM dependence is driving many critical technology areas toward improved reliability, increased throughput, smaller size, lower cost and/or lower power requirements. Channel simulation in lab and test applications enhances design quality and reliability, resulting in designs that perform well even in the presence of accidental and intentional interference.

RF interference detection systems monitor critical links and provide early warning of impending link degradation due to equipment failure, operator error or intentional interference. Geolocation systems then assist in locating the source of interference and in taking mitigation steps for rapid link restoration.

Portable user terminals have improved specifications and capabilities, many of which are the result of advancements in power amplifier technologies. These advancements, and many others, both enable and respond to steadily growing SATCOM usage requirements.

Steve Williams is an RT Logic Business Area Manager, responsible for R&D and business development activities for RT Logic's RF Channel Simulator, Range Test System, UAV/Target/Missile Test Systems, Spectral Warrior Interference Detection/Characterization Systems and high-rate digitizers. He is a frequent presenter and author on these and related subjects. His 30-year digital and RF engineering career has included R&D, management and business positions at RT Logic, Hewlett-Packard, Agilent Technologies and precisionWave Corp., which he co-founded. Williams holds a BSEE from the University of Illinois.