During the take-off and landing flight phases, helicopters can encounter serious brown-out or white-out problems in dusty, sandy or snowy areas, as the downwash of the rotor blades creates dust clouds that can completely engulf the helicopter. This is graphically illustrated in Figure 1. During landing in particular, the loss of visual references means that wires, poles and other obstacles represent a potential hazard for helicopters.

Flying in such brown-out/white-out conditions carries a substantial risk of losing orientation and positional awareness, which can lead to the following consequences:

• A hard landing, possibly exceeding the limits of the landing gear

• A collision with obstacles during in-cloud drifting or go-around

• A so-called vertigo situation (spatial disorientation)

• Accidental rolling of the helicopter in the air or on the ground due to the unevenness of the landing surface

• Non-identification of approaching/moving objects


Current Measures

No optimal technical solution for brown-out/white-out conditions is available today, although the avionics industry worldwide is working to develop an aid to overcome the problem. Until such a solution matures, the only current measure to reduce the risk is pilot training, which is limited in its effectiveness.

Possible Technologies

Work is currently being carried out to develop technology that can solve the problem. The optimal solution will be a sensor technology that provides “see-through” visibility and is capable of detecting all kinds of obstacles, from large objects down to dangerous thin wires. As it is mostly military helicopters that are affected by brown-out/white-out conditions, however, it is essential that any solution does not increase the helicopter’s probability of detection/identification.

Recent technology evaluations have led to the conclusion that the optimum solution requires the combination of several (at least two) sensor technologies. This article will outline the use of several sensor technologies—radar and laser radar (ladar)—as well as database-driven capabilities, namely 2.5D/3D map and ground collision avoidance systems, whose complementary combination will provide the necessary solution.

The Sensors

Starting with radar, the maximum performance of a radar sensor system is achieved by combining the hardware modules with powerful radar algorithms. Resolution in azimuth is enhanced by applying synthetic-aperture radar (SAR) and inverse-SAR (ISAR) techniques, while range-Doppler processing enables the discrimination between moving and stationary targets, a basic feature needed in safety applications.
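To make the range-Doppler idea concrete, the short Python sketch below forms a range-Doppler map from a block of pulse-compressed echoes and flags range bins whose energy falls outside the near-zero-Doppler clutter region. It is purely illustrative; the array layout, clutter width and detection threshold are assumptions, not parameters of the actual sensor.

```python
# Illustrative sketch: range-Doppler processing to separate moving returns
# from stationary clutter (not the actual radar software).
import numpy as np

def range_doppler_map(echoes: np.ndarray) -> np.ndarray:
    """echoes: complex array of shape (num_pulses, num_range_bins),
    one row per pulse after pulse compression."""
    # An FFT across the pulse (slow-time) axis turns the pulse-to-pulse phase
    # progression into a Doppler frequency for every range bin.
    doppler = np.fft.fftshift(np.fft.fft(echoes, axis=0), axes=0)
    return np.abs(doppler)

def moving_target_bins(rd_map: np.ndarray, clutter_bins: int = 1,
                       threshold: float = 10.0) -> np.ndarray:
    """Flag range bins whose energy lies outside the near-zero-Doppler
    clutter region, i.e. likely returns from moving objects."""
    num_doppler = rd_map.shape[0]
    zero_bin = num_doppler // 2                      # zero Doppler after fftshift
    keep = np.ones(num_doppler, dtype=bool)
    keep[zero_bin - clutter_bins: zero_bin + clutter_bins + 1] = False
    return rd_map[keep, :].max(axis=0) > threshold   # one flag per range bin
```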

Secondly, ladar is analogous to millimeter-wave radar in its imaging capabilities, but uses laser beams to scan an area. The reflected laser light is processed to create a virtual picture of the area. Using pulsed-laser technology, ladars perform 3D scanning of the environment in front of the helicopter. The resulting geo-referenced 3D data is analyzed by sophisticated algorithms to identify obstacles and terrain that endanger the helicopter and to provide reliable warnings.
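A simplified view of how such geo-referenced 3D data might be produced and screened is sketched below. The flat-attitude assumption, the frame conventions and the two-metre clearance are illustrative simplifications, not details of the actual ladar algorithms.

```python
# Illustrative sketch only: geo-reference ladar returns and flag points that
# stand above the local terrain estimate as potential obstacles.
import numpy as np

def georeference(ranges, azimuths, elevations, heli_pos, heli_yaw):
    """Convert scanner measurements (range in metres, azimuth/elevation in
    radians) to world coordinates. For simplicity the sensor is assumed to be
    aligned with the body frame and the helicopter is assumed level, so only
    position and yaw are applied."""
    x = ranges * np.cos(elevations) * np.cos(azimuths)   # forward
    y = ranges * np.cos(elevations) * np.sin(azimuths)   # right
    z = ranges * np.sin(elevations)                      # up (negative below the horizon)
    cos_y, sin_y = np.cos(heli_yaw), np.sin(heli_yaw)
    world_x = heli_pos[0] + cos_y * x - sin_y * y
    world_y = heli_pos[1] + sin_y * x + cos_y * y
    world_z = heli_pos[2] + z
    return np.column_stack([world_x, world_y, world_z])

def obstacle_points(points, terrain_height, clearance_m=2.0):
    """Keep only geo-referenced points more than `clearance_m` above the
    estimated terrain height, i.e. candidate obstacles."""
    return points[points[:, 2] > terrain_height + clearance_m]
```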

The Solution

By fusing the sensors described, the ultimate aim is to extend the helicopter’s operational envelope to 24 hours a day, seven days a week, in whatever the weather can throw at it. In practice, this means flying with zero visibility to the naked eye, in close to zero light and in all weather conditions.

To achieve such a 24/7, all-weather situational awareness suite requires three integral steps. In Step 1, the landing aid combines a ladar 3D see-and-remember capability, which provides a high-resolution visualization of the landing zone, with a millimeter-wave radar system that detects approaching/moving objects in a 360° area around the helicopter (also known as the electronic bumper capability).
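The electronic bumper behaviour can be pictured as a simple screening rule applied to the 360° radar detections, as in the hypothetical sketch below; the detection fields, protection radius and closing-speed threshold are assumptions chosen for illustration, not values from the real system.

```python
# Hypothetical interface for the "electronic bumper" idea: warn when a
# detection around the helicopter is inside a protection radius or closing.
from dataclasses import dataclass

@dataclass
class RadarDetection:
    bearing_deg: float       # 0-360 degrees around the helicopter
    range_m: float           # distance to the detection
    radial_speed_mps: float  # negative values = closing on the helicopter

def bumper_warnings(detections, protect_radius_m=30.0, closing_speed_mps=-0.5):
    """Return detections that either sit inside the protection radius or are
    moving towards the helicopter faster than the closing-speed threshold."""
    return [d for d in detections
            if d.range_m < protect_radius_m
            or d.radial_speed_mps < closing_speed_mps]
```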

This solution provides obstacle collision/terrain approximation warning during helicopter flight and a visual landing aid/symbology with moving/approaching object indication for landing in a degraded visual environment.

The ladar’s 3D see-and-remember visualization is based on the raw data gathered during the landing approach. As long as the landing area is unobscured on approach, the raw ladar data is used for area visualization with active obstacle/terrain collision warning, and is also stored in the unit’s memory. When the pilot’s view of the landing area starts to become obscured by brown-out or white-out conditions, the pilot can command the freeze-and-remember function of the ladar together with the electronic bumper function of the millimeter-wave radar.
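The freeze logic described above can be summarized by a small state holder like the hypothetical sketch below: live ladar frames update both the display and the memory until the pilot commands a freeze, after which the remembered scene is presented.

```python
# Minimal sketch of the see-and-remember logic (hypothetical structure, not
# the actual system software).
class SeeAndRemember:
    def __init__(self):
        self._stored_scene = None   # last good 3D scene of the landing area
        self._frozen = False

    def update(self, raw_ladar_scene):
        """Called for every ladar frame while the approach is unobscured:
        the raw data drives the live visualization and is kept in memory."""
        if not self._frozen:
            self._stored_scene = raw_ladar_scene
        return self.scene()

    def freeze(self):
        """Pilot command when brown-out/white-out sets in: stop updating and
        keep presenting the remembered landing area."""
        self._frozen = True

    def scene(self):
        """Scene used for visualization: live data before the freeze, the
        remembered landing area afterwards."""
        return self._stored_scene
```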

The resulting degraded visual environment visualization is composed of the 3D landing-area display, correlated to the position and movement of the helicopter; the hover symbology in a so-called “follower’s view,” representing current height, drift, drift speed and the artificial horizon; and the warning display of the 360° electronic bumper.
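A minimal data container for the hover symbology in such a follower's view might look like the sketch below; the field names and units are assumptions chosen for illustration.

```python
# Hypothetical container for the follower's-view hover symbology; not the
# actual display interface.
from dataclasses import dataclass, field

@dataclass
class HoverSymbology:
    height_m: float               # current height above the landing area
    drift_m: float                # accumulated drift from the intended spot
    drift_speed_mps: float        # current drift speed
    roll_deg: float               # artificial-horizon roll attitude
    pitch_deg: float              # artificial-horizon pitch attitude
    bumper_warnings: list = field(default_factory=list)  # 360° bumper alerts
```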

The main features of this method are the detection, localization and collision avoidance of stationary objects in the dedicated landing area; moving object identification (MOI) to ensure that no additional objects enter the landing area; and analysis of the landing spot (for example, its inclination and vegetation). By correlating radar data from the various channels, a very precise drift measurement is available for dedicated operations, while the electronic bumper avoids collisions with other helicopters during the landing operation.
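One generic way to obtain a drift measurement from correlated radar returns is sketched below: two successive ground-return profiles are cross-correlated, and the lag of the correlation peak is converted into a drift speed. This is a textbook correlation technique used purely as an illustration; the article does not describe the actual multi-channel algorithm.

```python
# Illustrative correlation-based drift estimate (generic technique, assumed
# for illustration only).
import numpy as np

def drift_estimate(profile_prev, profile_curr, bin_size_m, dt_s):
    """Estimate drift by finding the shift (in range bins) that best aligns
    two successive ground-return profiles, then converting it to a speed."""
    n = len(profile_prev)
    corr = np.correlate(profile_curr - profile_curr.mean(),
                        profile_prev - profile_prev.mean(), mode="full")
    shift_bins = np.argmax(corr) - (n - 1)   # lag of the correlation peak
    drift_m = shift_bins * bin_size_m        # displacement between profiles
    return drift_m / dt_s                    # drift speed in m/s
```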

Step 1 thus provides a solution for flight protected against obstacle collisions and approaching terrain in normal weather conditions, together with an aid for safe landing in a degraded visual environment.

Step 2 extends the flight protection capability to a degraded visual environment. This can be realized by adding a forward-looking millimeter-wave vision radar to the system and by improving the ladar with regard to multi-pulse processing and timing control. The two sensor data sets are digitally fused, and the resulting information is used to provide improved visualization as well as optimized obstacle collision and terrain approximation warning.
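At grid level, such a digital fusion can be pictured as a weighted combination of each sensor's obstacle evidence, as in the sketch below; the occupancy-grid representation, weights and threshold are illustrative assumptions, not the fusion scheme actually used.

```python
# Illustrative grid-level fusion of forward-looking radar and ladar evidence.
import numpy as np

def fuse_obstacle_grids(ladar_grid, radar_grid, w_ladar=0.6, w_radar=0.4,
                        warn_threshold=0.5):
    """Both inputs are 2D occupancy grids of the area ahead, with cell values
    in [0, 1] expressing each sensor's confidence that the cell contains an
    obstacle. Returns the fused grid and a boolean warning mask."""
    fused = w_ladar * ladar_grid + w_radar * radar_grid
    return fused, fused > warn_threshold
```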

Figure 2 identifies the sensor coverage around the helicopter. Note that the scale of the forward-looking sensor range is not identical to the 360° electronic bumper range.

Finally, Step 3 brings to fruition a comprehensive 24/7 situational awareness suite for helicopters. This step adds a digital map system with flight/situation planning and Jeppesen support. The digital terrain elevation data (DTED) database for the map system is enhanced dynamically in-flight and/or during flight planning with current or recorded ladar 3D digital world data. The ladar system can generate and deliver, in real time, high-resolution 3D digital world data in the standard DTED format at a resolution of level four or better. This data can be used on board to improve the map system or can be recorded for future flight/mission planning.
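The in-flight enhancement of the elevation database can be imagined as mapping geo-referenced ladar points onto the posts of a DTED-style grid, as in the sketch below; the post spacing and the keep-the-highest update rule are illustrative assumptions rather than details of the real system.

```python
# Illustrative sketch: refresh a DTED-style elevation grid from geo-referenced
# ladar points.
import numpy as np

def update_elevation_grid(grid, origin_xy, post_spacing_m, ladar_points):
    """grid: 2D array of terrain elevations; ladar_points: (N, 3) array of
    geo-referenced x, y, z points. Each point is mapped to its grid post and
    the post is raised to the highest observed elevation."""
    cols = ((ladar_points[:, 0] - origin_xy[0]) / post_spacing_m).astype(int)
    rows = ((ladar_points[:, 1] - origin_xy[1]) / post_spacing_m).astype(int)
    inside = (rows >= 0) & (rows < grid.shape[0]) & \
             (cols >= 0) & (cols < grid.shape[1])
    for r, c, z in zip(rows[inside], cols[inside], ladar_points[inside, 2]):
        grid[r, c] = max(grid[r, c], z)
    return grid
```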

Further Developments

Studies are currently being conducted to characterize the radar RF parameters of dust clouds in landing areas. The results will strongly influence the design of the optimum sensor system (for example, the choice of frequency). Evaluations of ladar behavior in dust clouds are also under consideration, with a view to improved capabilities in terms of receiver management, multi-pulse application and laser pulse timing.

Yan Christian Venot holds a PhD in radar engineering and microwave frequencies, and is currently product manager for radar seekers and mm-wave sensor applications at EADS Defence & Security. He has experience in managing projects in both the civil and military fields. His technical background is in hardware design and realization, high-frequency field simulation and system engineering for radar sensors and radar seeker applications.



Peter Kielhorn holds an engineering diploma in information technology. He is currently project and product manager for the military obstacle/terrain avoidance system (HELLAS-Awareness) and is responsible for future ladar applications at EADS Defence & Security. He joined the company in 1984 (at that time Dornier GmbH) and has experience in managing projects in military reconnaissance applications and the ladar equipment field. His technical background is in systems/embedded software design and realization, as well as system engineering for airborne reconnaissance applications.