Graham Warwick/WASHINGTON DC
The combat pilot of the 21st century is likely to have more on his head than in it. Advances in "virtual cockpit" technology could make flying an attack mission as easy as walking down the street.
While this lofty goal is still years away, the US Army plans to demonstrate in June next year that the virtual cockpit is a viable concept with military potential.
Under its Virtual Cockpit Optimisation Programme (VCOP), the US Army is bringing together five key technologies: colour helmet-mounted display, three-dimensional audio, tactile situational awareness, speech recognition and cognitive decision aids. The aim is to produce an "intuitive" cockpit.
"We want to decrease the mental reasoning an aviator needs to do the job. We want to make performing the mission more like walking," says VCOP lead engineer Scott Dennis. "When we walk, we don't think 'left foot, right foot'. It's intuitive; we think of a direction and go. We want to provide the same lack of cognitive reasoning in the cockpit."
To take the thinking out of flying, VCOP is integrating the technologies into a single system that could be installed in new production aircraft or retrofitted into in-service machines.
At the heart of VCOP is a full-colour, high-resolution, high-brightness, helmet-mounted display (HMD). This features virtual retinal display (VRD) technology, developed by Microvision, which uses eye-safe lasers to paint an image directly on to the pilot's retina. VRD provides colour capability with a brightness and resolution "no other HMD can match", says Dennis. This allows the system to display a 3-D moving map, which is the foundation of the virtual cockpit environment.
Decision support
Driving the HMD is the Rotorcraft Pilot's Associate (RPA), an intelligent information management tool developed by Boeing and flight tested last year in an AH-64D Apache Longbow. The RPA is a software system which provides pilots with real-time decision support by fusing data from onboard and off-board sensors, looking for information that could affect the mission plan and recommending actions to ensure mission success.
The pre-programmed route is stored in the RPA's terrain database. "If, during the mission, the sensors detect an air-defence unit, the RPA develops a new route dynamically, based on the context of the mission," says Dennis. If it is an attack mission, the system will recommend the best position for battle. "If it's reconnaissance, it will be the route with the lowest probability of being detected and the highest probability of seeing the target," he says.
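As an illustration only, that replanning step can be pictured as re-scoring a set of candidate routes against the newly detected threat, with the weighting set by the mission type. The short Python sketch below uses an invented threat and route model, not the RPA's actual software:

```python
from dataclasses import dataclass
from math import hypot

@dataclass
class Threat:
    x: float
    y: float
    lethal_radius: float

@dataclass
class Route:
    name: str
    waypoints: list            # (x, y) positions along the route
    time_to_objective: float   # minutes

def detection_risk(route, threats):
    """Crude risk score: how deeply the waypoints penetrate threat rings."""
    risk = 0.0
    for (x, y) in route.waypoints:
        for t in threats:
            d = hypot(x - t.x, y - t.y)
            risk += max(0.0, 1.0 - d / t.lethal_radius)
    return risk

def replan(routes, threats, mission):
    """Re-score candidate routes when a new threat is detected (illustrative weights)."""
    def cost(r):
        risk = detection_risk(r, threats)
        if mission == "reconnaissance":
            return 10.0 * risk + r.time_to_objective   # stay unseen first
        return risk + 5.0 * r.time_to_objective        # attack: reach position quickly
    return min(routes, key=cost)

# Example: an air-defence unit appears on the planned (direct) route.
threats = [Threat(x=5.0, y=5.0, lethal_radius=3.0)]
direct = Route("direct", [(0, 0), (5, 5), (10, 10)], time_to_objective=12)
detour = Route("detour", [(0, 0), (8, 2), (10, 10)], time_to_objective=17)
print(replan([direct, detour], threats, "reconnaissance").name)   # -> detour
```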
The RPA also manages communications. "At pre-programmed points [along the route] the package will automatically send back location. So the pilot does not have to get on the mike," Dennis says. Instead, the system will fill in the report automatically and prompt the pilot for a yes or no decision, by switch or voice, on transmitting position.
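In outline, the behaviour amounts to pre-filling a report at each trigger point and asking the pilot a single question. A minimal sketch, with invented field names and a stand-in confirmation hook:

```python
# Minimal sketch of automated position reporting with pilot confirmation.
# The report fields and the confirm() hook are invented for illustration.

def build_position_report(callsign, position, waypoint_id):
    """Pre-fill the report so the pilot never has to key the radio."""
    return {
        "callsign": callsign,
        "waypoint": waypoint_id,
        "lat": position[0],
        "lon": position[1],
    }

def at_reporting_point(report, confirm):
    """Prompt the pilot (by switch or voice) and transmit only on 'yes'."""
    if confirm(f"Send position report at waypoint {report['waypoint']}?"):
        transmit(report)

def transmit(report):
    print("DATALINK:", report)   # stand-in for the datalink send

# Usage: a stand-in confirm() that always answers yes.
report = build_position_report("SABRE 21", (31.35, -85.71), "CP 3")
at_reporting_point(report, confirm=lambda prompt: True)
```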
In last year's Apache flight tests, RPA information was presented visually to the crew as two- and three-dimensional graphics on three head-down displays and a wide-field-of-view HMD. For VCOP, the system has been reconfigured to render all the information in a single channel for display on the VRD HMD. Intelligent decluttering of information will be used to prevent overloading the pilot.
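Decluttering of this kind can be thought of as a priority filter: each symbol carries a priority, and the lowest-priority items are suppressed once the display budget is exceeded. The following sketch is an assumption about how such a filter might work, not the VCOP implementation:

```python
# Hypothetical declutter filter: keep only the highest-priority symbols
# once the display budget is exceeded. Priorities and budget are invented.

def declutter(symbols, max_symbols):
    """symbols: list of (name, priority) pairs; higher priority = more critical."""
    ranked = sorted(symbols, key=lambda s: s[1], reverse=True)
    return [name for name, _ in ranked[:max_symbols]]

symbols = [
    ("threat_bubble", 10), ("route_highway", 9), ("fuel_state", 6),
    ("wingman_position", 5), ("nav_beacon", 2), ("terrain_label", 1),
]
print(declutter(symbols, max_symbols=4))
# -> ['threat_bubble', 'route_highway', 'fuel_state', 'wingman_position']
```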
The HMD will not be the only conduit of information to the pilot. "The focus of VCOP is to increase the bandwidth of data flow to the pilot," says Dennis. "We rely on the pilot's ability to receive information in a three-dimensional world." The programme aims to exploit more than just the pilot's sense of sight.
The tactile situation awareness system (TSAS), under development by the US Navy for the Joint Strike Fighter and by NASA for the Space Shuttle, is a vest with an array of "tactors" which vibrate against the pilot's body to provide a non-visual means of conveying information on aircraft orientation. When the aircraft pitches nose down, for example, a tactor vibrates against the pilot's stomach; if he rolls left, a tactor vibrates against his left side.
As the aircraft approaches low-altitude limits pre-programmed into the mission computer, "[tactors] in the seat of his pants might activate with an amplitude and frequency that conveys how far away he is from minimum descent altitude. He would intuitively and instinctively know to pull up, with no need for mental gymnastics," says Dennis.
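As a mental model, the vest maps an aircraft state to a tactor location, with intensity rising as a limit approaches. The mapping below is illustrative only; the tactor names, amplitudes and frequencies are invented, not the real TSAS drive laws:

```python
# Illustrative mapping from aircraft state to tactor cues -- not the real
# TSAS drive laws. Thresholds and scaling are invented for this example.

def attitude_tactor(pitch_deg, roll_deg):
    """Choose which tactor fires for a given attitude (simplified)."""
    if pitch_deg < -5:
        return "stomach"       # nose down
    if pitch_deg > 5:
        return "back"          # nose up
    if roll_deg < -5:
        return "left_side"     # rolling left
    if roll_deg > 5:
        return "right_side"    # rolling right
    return None                # wings level, no cue

def altitude_cue(altitude_ft, minimum_ft):
    """Scale amplitude and frequency with proximity to the minimum altitude."""
    margin = max(0.0, altitude_ft - minimum_ft)
    closeness = max(0.0, 1.0 - margin / 500.0)   # cue ramps up inside 500 ft
    return {"tactor": "seat",
            "amplitude": closeness,
            "frequency_hz": 20 + 60 * closeness}

print(attitude_tactor(pitch_deg=-8, roll_deg=0))        # -> 'stomach'
print(altitude_cue(altitude_ft=180, minimum_ft=100))
```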
Three-dimensional audio is another information channel. It augments the 3-D visual display by providing aural cues in three dimensions via the helmet. A warning of left engine failure, for example, seems to come from the left side of the cockpit.
VCOP aims to take this a step further by creating "ear-cons" (the aural equivalent of display icons) to convey information intuitively. If the aircraft is running low on fuel, for example, the pilot might hear a dripping sound, Dennis says. If a message is coming in via datalink, the pilot might hear the familiar sound of a computer modem connecting, letting him decide whether to display the message or ignore it.
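An ear-con is essentially a lookup from a system event to a familiar sound, placed at a chosen direction around the pilot's head. The event names, sound files and azimuths below are invented for illustration; this is not the VCOP implementation:

```python
# Hypothetical ear-con table: system event -> familiar sound + 3-D placement.

EARCONS = {
    "fuel_low":        {"sound": "dripping.wav",      "azimuth_deg": 0},
    "datalink_msg":    {"sound": "modem_connect.wav", "azimuth_deg": 45},
    "left_engine_out": {"sound": "engine_warn.wav",   "azimuth_deg": -90},
}

def play_earcon(event, audio_out=print):
    """Look up the cue and hand it to a (stand-in) spatial audio renderer."""
    cue = EARCONS.get(event)
    if cue:
        audio_out(f"play {cue['sound']} at {cue['azimuth_deg']} deg azimuth")

play_earcon("fuel_low")       # dripping sound from straight ahead
play_earcon("datalink_msg")   # modem sound; pilot decides whether to open it
```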
Aural and visual cues are tightly coupled in the VCOP virtual cockpit, as they are in the real world. The pre-planned route is displayed as a "highway in the sky" along which the pilot flies the aircraft. When a threat is detected, the pilot hears an intelligent voice warning - "air defence unit at 3km" (1.6nm) - from the appropriate direction, and sees a semi-transparent threat bubble overlaid on its position, the bubble's size conveying the danger zone. "We are not relying on only one sense organ," says Dennis. "We are using the pilot's ability to receive information in three-dimensional space."
If an engine fails, the localised audio warning is reinforced by a visual icon, perhaps an engine on fire, appearing on the appropriate side of the HMD. "If the No 2 engine fails, and he hears a message in his right ear and sees an image in his right field of view, he's not going to pull the wrong engine off line," Dennis says. Potentially, the tactile vest could be triggered on the right side to further reinforce the aural and visual cues, he adds, "but this has not been researched".
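Put simply, the failed engine's side drives every cue channel, so all the senses point the same way. A hedged sketch of that fan-out, with invented channel interfaces and the tactile channel marked optional:

```python
# Sketch of multi-sensory cue fan-out for an engine failure.
# Channel strings are stand-ins; the tactile channel is marked optional
# because, as noted above, that coupling has not been researched.

def engine_failure_cues(engine_number):
    side = "left" if engine_number == 1 else "right"
    return {
        "audio":   f"engine {engine_number} failure warning in {side} ear",
        "visual":  f"failure icon in {side} field of view",
        "tactile": f"(optional) {side}-side tactors",
    }

for channel, cue in engine_failure_cues(2).items():
    print(channel, "->", cue)
```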
Integration ahead
The TSAS is being integrated with the RPA in an Apache flight simulator at Boeing's Mesa, Arizona, helicopter plant. Microvision plans to deliver the full-colour HMD to Boeing for integration in the first quarter of next year. The US Army then plans to rehost the VCOP into a Sikorsky UH-60 Black Hawk simulator at Ft Rucker, Alabama, for the demonstration next June. The plan calls for flight tests to begin two years later, also in a Black Hawk.
The virtual cockpit will not become a reality overnight, and it may never see service in the form envisaged by VCOP, the army admits. Opportunities to insert some or all of the technologies are presented by planned upgrades of the AH-64 and Boeing CH-47 beginning in 2002, and by the start of Boeing Sikorsky RAH-66 Comanche production in 2006.
Bothell, Washington-based Microvision, meanwhile, has laid out a five-year roadmap for developing a flight-ready VRD HMD. This requires increasing resolution and field of view, as well as strengthening and miniaturising the display system, says programme manager Jack Clevenger.
The VCOP HMD is a binocular display with 100% overlap, providing a 41° horizontal by 33° vertical field of view. Microvision has previously supplied monochrome HMDs for testing. "VCOP is the proof of concept for colour," says Clevenger. The company has also supplied a monocular colour system to the US Air Force and is working with several airframe and helmet developers on HMDs for fast jets.
Progress with HMDs for fixed-wing aircraft has been painfully slow. While helmet-mounted sights have seen operational use in Israel and elsewhere for several years, the US military is still 12 months away from receiving its first Joint Helmet Mounted Cueing Systems (JHMCS). Developed by Kaiser/Elbit joint venture Vision Systems International, the modular JHMCS will be the standard HMD for all USAF and USN fighters except JSF, which will have a more advanced, integrated display.
In its initial form, the JHMCS is a daylight-only system for cueing missiles and sensors. The next step will be to add night vision capability. Two options are being considered: injecting symbology into the standard night-vision goggles (NVGs) using a clip-on display module; or the Panoramic NVG under development by the USAF Research Laboratory. The latter uses four image-intensifiers to provide a 100° horizontal field of view. Demonstrations are planned over the next two years.
Beyond that, the full-colour VRD could be a future modular enhancement of the JHMCS, bringing the virtual cockpit closer to reality.
Source: Flight International