Crucial to US plans for next-generation air traffic management is the ability to maintain system capacity as visibility deteriorates and operations transition from visual to instrument meteorological conditions (VMC to IMC). Key to achieving this will be providing the pilot with situational awareness that is independent of visibility.

A promising approach poised at the transition between technology research and product development is combined synthetic and enhanced vision. Unlike sensor-based enhanced vision, database-generated synthetic vision is unaffected by weather and unrestricted by sensor field-of-view. But synthetic vision relies on accurate databases and precise positioning, while enhanced vision always sees the real world. Combining the systems promises to provide an unrestricted synthetic view of the outside world, while sensors monitor database integrity and detect unmapped objects.
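The cross-checking role described above lends itself to a simple illustration. The Python sketch below is purely notional and is not the SE-Vision implementation: it compares radar-derived terrain elevations with the values stored in the on-board database and flags the database as suspect when too many disagree. The data structures, grid lookup and thresholds are assumptions chosen only to show the idea.

```python
# Notional sketch (not the SE-Vision implementation): one way a combined
# system could cross-check sensor returns against the terrain database.
# All names, thresholds and data structures here are hypothetical.

from dataclasses import dataclass

@dataclass
class TerrainSample:
    lat: float          # degrees
    lon: float          # degrees
    elevation_m: float  # radar-measured height above mean sea level

def database_elevation(db: dict, lat: float, lon: float) -> float:
    """Look up the stored elevation for the nearest database post
    (hypothetical grid keyed on rounded coordinates)."""
    return db[(round(lat, 3), round(lon, 3))]

def integrity_ok(db: dict, radar_samples: list[TerrainSample],
                 tolerance_m: float = 30.0,
                 max_bad_fraction: float = 0.1) -> bool:
    """Declare the database suspect if too many radar-measured terrain
    elevations disagree with the stored values by more than a tolerance."""
    bad = sum(
        1 for s in radar_samples
        if abs(database_elevation(db, s.lat, s.lon) - s.elevation_m) > tolerance_m
    )
    return bad / max(len(radar_samples), 1) <= max_bad_fraction
```

A tolerance-and-fraction test like this is only one possible monitoring scheme; the point is simply that the sensor provides an independent measurement against which the database can be checked.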

NASA, the US Air Force and a team led by Rockwell Collins are winding up a research project that has demonstrated that “virtual VMC” operations are possible using combined synthetic and enhanced vision. The SE-Vision programme included two phases of commercial and military flight tests involving a Gulfstream V and the US Federal Aviation Administration’s Boeing 727 testbed.

The GV flights at Reno, Nevada, in August last year demonstrated complex visual approaches in simulated IMC. The approaches were flown with and without sensor information integrated into the computer-generated terrain images on head-up and head-down displays. Sensors included the GV’s Kollsman infrared EVS and Collins’s WXR-2100 MultiScan weather radar, modified to provide database integrity monitoring by detecting terrain features and runway outlines.

The flights also demonstrated ways of detecting objects not in the database, such as aircraft or vehicles on a runway or taxiway. Among these was the Rannoch-developed runway incursion prevention system – essentially a version of the TCAS airborne collision-avoidance system able to detect transponder-equipped aircraft and vehicles on the ground.

Tests in New Mexico and New Jersey in June, flown with the FAA 727, focused on military low-altitude terrain-following using synthetic and enhanced vision. The aircraft was equipped with Collins head-up and head-down displays, the WXR-2100 radar and a Max-Viz infrared EVS sensor.

In the integration concept developed for the SE-Vision programme, a perspective view generated by the on-board digital database is displayed with runways and features conformal to the textured terrain. Inset into this is the enhanced-vision sensor image, with its narrower field of view. Highway-in-the-sky flightpath guidance is overlaid on the combined display.

Wireframe terrain overlaid on the sensor image eases the transition from synthetic to enhanced vision as the runway materialises, particularly during curving approaches. “In cloud the sensor provides no information, and the wireframe provides awareness of where everything is. The pilot can start looking for cues as soon as the enhanced vision starts to show contrast,” says Tim Etherington, principal systems engineer at Rockwell Collins Advanced Technology Center.
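The layering just described can be summarised schematically. The sketch below is a notional ordering of the display layers, not the Rockwell Collins software; the layer names and field-of-view figures are assumptions used only to illustrate the stack from synthetic base scene to guidance symbology.

```python
# Notional compositing order for the combined display, bottom to top.
# Layer names and field-of-view values are assumptions for illustration;
# this is not the SE-Vision display software.

from typing import NamedTuple

class Layer(NamedTuple):
    name: str
    field_of_view_deg: float   # horizontal extent the layer covers
    conformal: bool            # drawn conformal to the outside scene?

def combined_display_stack() -> list[Layer]:
    """Bottom-to-top stack: synthetic scene, EVS inset, wireframe, guidance."""
    return [
        Layer("synthetic terrain from on-board database", 40.0, True),
        Layer("enhanced-vision sensor inset (narrower field of view)", 20.0, True),
        Layer("wireframe terrain over the sensor inset", 20.0, True),
        Layer("highway-in-the-sky flightpath guidance", 40.0, True),
    ]

if __name__ == "__main__":
    for i, layer in enumerate(combined_display_stack(), start=1):
        print(f"{i}. {layer.name} ({layer.field_of_view_deg} deg)")
```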

“There are no technical barriers any more,” says Etherington, but it is up to the manufacturers, and the market, to determine when the concept will become reality. Synthetic vision with highway-in-the-sky guidance is a “natural” for very light jets, he says, as it cuts pilot workload while increasing situational awareness. “The issue is what is the value to large business jets and major airlines?” Complex approaches required to increase airport capacity could provide the business case.

GRAHAM WARWICK/WASHINGTON DC

Source: Flight International