According to the FAA, flight-deck automation confuses pilots too often.

David Learmount/LONDON

HIGHLY AUTOMATED aircraft with digital flight-management systems (FMS) often surprise pilots and sometimes leave them dangerously confused. This is the basic conclusion of the US Federal Aviation Administration from its two-year review of modern airline flight-decks.

Yet the FAA does not seem able to find a single solution. The problem is immensely complex, it says, indicating a need for the whole aviation "system" to recognise the importance of human factors (HF) in the modern flight-deck.

HF is not so much undervalued as misunderstood, the review concludes. What constitutes good HF for pilots in a highly automated environment has not been fully understood, probably because automation at its current level is such a new phenomenon that many of its effects could not have been foreseen. More study and some rethinking, the FAA hopes, will lead to changes in the way in which the industry designs the cockpits for highly automated aircraft, and the way it trains pilots for them.

Complicating the matter further, observes the FAA, is the fact that the system has an inherent resistance to the type of change needed. The FAA says in its review that the HF team, which carried out the review, recognises "the economic pressures that inhibit making changes that may increase safety when there is not a strong tie to an accident. However, we believe that, if action is not taken soon, the vulnerabilities identified have the potential to lead to more accidents and serious incidents".

Critical interface

In carrying out the review, the FAA worked with NASA, the European Joint Aviation Authorities (JAA) and US universities. Its findings are revealed in a report entitled The interfaces between flight-crews and modern flight-deck systems.

In automating the cockpit, the report implies, manufacturers and airlines knew exactly what the automation was intended to do and had assumed that it would make pilots' tasks easier. They did not, however, adequately consider the effect on pilots of removing some of their traditional aircraft-management tasks, or of adding automation-management tasks. The issue of training pilots to operate highly automated flight-decks was not considered in enough depth, says the report. HF team co-chairman, NASA's Dr Kathy Abbott, says that most pilots report that automation systems-handling is mostly learned "OTJ" (on the job), and that it takes about a year of line operation to become competent.

Meanwhile, there is considerable evidence that pilot understanding of precisely what automation does and how it interfaces with aircraft-control systems is often poor. The report cites the fatal accident in 1994 to a China Airlines (CAL) Airbus Industrie A300-600 at Nagoya, Japan (in which the pilot attempted to overcome the autopilot by force) as the event which finally triggered the FAA's decision to set up the HF review.

The American Airlines Boeing 757 controlled-flight-into-terrain (CFIT) accident in December 1995 at Cali, Colombia, was also cited as "a stark example of break-down in the flight-crew/automation interface". One of the key causes of that accident has emerged only since the HF report was published but, ironically, it adds weight to the FAA's argument. It was discovered that the established Arinc 424 convention for naming waypoints in FMS software differs from that used on navigation charts and, therefore, has the potential to confuse pilots.

CHANGING THE SYSTEM

The report calls for action, not just by manufacturers and designers, but in all sectors of the air-transport industry, explaining: "The issues identified by the HF team are highly interrelated and are evidence of aviation-system problems, not just isolated human or machine errors. Therefore, we need system solutions, not just point solutions, to individual problems."

The word "vulnerable" is frequently used by the team, meaning vulnerable to misunderstanding or even actively confusing for crews. In other words, some equipment or procedures are seen as "-a mistake/accident waiting to happen". Automation mode-confusion is given as the prime example.

The report's dominant theme is that human beings do not always react as the system expects them to, so the system has to be human-error-tolerant and able to reduce the chances of error.

In the minutes before the Nagoya accident, the human-factors situation was becoming progressively more complex. The flight-data recorder (FDR) and cockpit-voice recorder (CVR) from the aircraft reveal a picture of pilot confusion and eventual panic.

The CAL aircraft was established on a good-weather approach with the first-officer flying the aircraft manually with auto-throttle engaged. Go-around (GA) mode was activated unintentionally, and the crew's actions for the remainder of the flight indicate that they did not realise this, despite mode information displays and aircraft behaviour. GA mode caused the power to increase, but the throttle was then retarded manually. When the pilot subsequently engaged the autopilot, the GA demand increased the aircraft's nose-up attitude.

Instead of changing mode or disconnecting the autopilot, the progressively more-confused pilots tried to override the autopilot's GA-mode pitch-up action by pushing hard on the control column. Control-column force alone would have disconnected the autopilot in all other autopilot modes, but the original JAA certification requirements for the type had specified that, in GA mode uniquely, stick-force alone should not be able to disconnect it. If the autopilot had disconnected at that point, however, the accident probably would not have happened.

The certificators' philosophy was that selection of automatic GA is an emergency measure designed, above all, to enable pilots to make a late GA decision during a Category IIIB (blind) approach and automatic landing, which is a hands-off procedure. Europe declared, therefore, that pilots should not be able to tamper with the automation except by deliberately selecting autopilot disconnect. "Instinctive breakout", which the FAA had approved in aircraft types for which it was the lead certification authority, was rejected.

In March 1996 the JAA dropped the requirement completely, directly as a result of the accident findings.
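The inconsistency which this certification choice built in can be illustrated with a short sketch. The Python fragment below uses a hypothetical mode list and force threshold, not the A300-600's actual control laws: it shows only how a mode-dependent disconnect rule means that the same pilot action, pushing hard on the column, drops the autopilot in every mode except go-around.

```python
# Illustrative sketch only: the mode labels and the force threshold are
# assumptions for the example, not the certified A300-600 logic.

OVERRIDE_FORCE_THRESHOLD = 15.0   # control-column force, arbitrary units

def autopilot_disconnects(mode: str, column_force: float) -> bool:
    """Return True if column force alone disconnects the autopilot.

    Under the original JAA requirement described in the report, force
    override was inhibited in go-around (GA) mode alone.
    """
    if mode == "GA":
        return False   # stick force alone never disconnects in GA mode
    return column_force >= OVERRIDE_FORCE_THRESHOLD

# The same override force gives different outcomes depending on mode:
for mode in ("ALT_HOLD", "VS", "GA"):
    print(mode, autopilot_disconnects(mode, column_force=25.0))
# ALT_HOLD True, VS True, GA False: in GA the autopilot keeps trimming
# nose-up against the pilot's push, as happened at Nagoya.
```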

At Nagoya, the result of the pilots' attempt to overcome the autopilot by force was that the autopilot-controlled horizontal stabiliser, responding to GA-mode demands, was gradually increasing its nose-up incidence towards its limit to raise the aircraft's nose despite the crew's pitch-down elevator input.

When the panicking pilots finally disconnected the autopilot, their elevator input could not overcome the stabiliser's pitch-up demand. According to the FDR, the pilots still failed to select nose-down trim manually; manual trim operates by altering stabiliser incidence.

Finally, the aircraft's automatic "alpha-floor" [anti-stall] system activated, increasing power. Engine thrust added to the pitch-up moment, and the very steep climb which resulted caused the speed to decay. Nose-up attitude eventually reached 52°. The aircraft stalled and hit the ground tail-first just short of Nagoya's runway.

COMMON CAUSES

The accident was the result of a lethal cocktail of individual factors which, according to the HF team, are frequently found as components in other accidents, incidents and confidential reports. They include:

Autopilot mode-confusion;

Poor crew knowledge of "the automation's capabilities, limitations, modes, operating principles and techniques";

Inconsistent systems behaviour in different modes;

Crew engagement of the autopilot in an attempt to recover a desired flight-path when the pilots became confused;

Crew failure to follow standard operating procedures (SOPs);

"Potential mismatches with the manufacturers' [and certificators'] assumptions about how the flightcrew will use the automation";

Crew reluctance to disengage automation.

The HF team says that flight-crews tend to analyse unexpected behaviour in automated aircraft from observations of that behaviour, rather than from instrument displays, adding that this indicates that "feedback mechanisms may be inadequate to support timely error-detection".

Observation of aircraft behaviour is not necessarily enough, however. The report quotes the example of an Evergreen Boeing 747-100 in 1991 in which the autopilot allowed a slow roll to develop at night. The crew did not notice, until the inertial-navigation system indicated FAIL, that the bank angle had exceeded 90°. The crew eventually recovered control.

As evidence for its arguments, the HF team's report gives details of 24 accidents or serious incidents in which automation, or its effects on pilot behaviour, is judged to have been significant. They also cover Lockheed TriStars and classic 747s in which pilots had unconsciously ceded responsibility for the safe conduct of the flight to conventional autopilots.

Boeing aircraft feature six times and McDonnell Douglas three times, while Airbus Industrie, a consortium born into the high-automation era, is included 13 times. The 1988 air-display accident at Habsheim, France, flown by a non-display-trained pilot and the first crash to involve an Airbus A320, is listed as being caused by "possible overconfidence in the envelope-protection features of the A320".

The most common Airbus problem highlighted, however, is crew mode-confusion. In that category, the most common form is mistaking vertical-speed mode for flight-path-angle mode, or vice versa. In the case of Boeing, the issues studied were more diverse but, in the "glass-cockpit" types, they most often involved a combination of inappropriate mode use and mode confusion, or a lack of crew situation-awareness.
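A rough calculation shows why that particular mix-up is so easy to make and so dangerous. The sketch below uses assumed figures, a 140kt groundspeed and a 3.3° approach path, not numbers taken from the report, to compare the descent rate a crew intends with the one delivered if the same selection is read in the wrong mode.

```python
import math

# Illustrative arithmetic only: the groundspeed and approach angle are
# assumptions for the example, not values quoted in the FAA report.

groundspeed_kt = 140                             # typical approach groundspeed
groundspeed_fpm = groundspeed_kt * 6076 / 60     # about 14,180ft/min along track

fpa_deg = 3.3                                    # intended flight-path angle
descent_fpa = groundspeed_fpm * math.sin(math.radians(fpa_deg))

descent_vs = 3300                                # "3.3" read as 3,300ft/min vertical speed

print(f"3.3deg flight-path angle at {groundspeed_kt}kt = about {descent_fpa:.0f}ft/min")
print(f"Same figure read as a vertical speed        = {descent_vs}ft/min")
# Roughly a four-fold difference in descent rate from a single mode mix-up.
```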

Interface breakdown

The Cali crash, for example, was caused by a lack of situational awareness, according to the US National Transportation Safety Board, and judged by the HF team to be one of the classic examples of a "stark breakdown" in the pilot/automation interface. CVR readouts clearly show that the pilots did not know exactly where they were relative to the beacon at the beginning of the approach procedure for which they had accepted clearance: they had passed it but did not realise they had.

Evidence is now available from the aircraft's FMS, recovered almost undamaged, that the automatically flown turn towards the high ground, in which the pilots failed to intervene, may have been compounded by the potentially ambiguous Arinc convention for identifying navigation beacons in the aircraft's flight-director software. As American has since observed, the convention is "inconsistent with traditional charting". As a result, an identifier which a pilot entered, believing it to be that of one of the beacons on the approach track, was recognised by the FMS as a different beacon some 120° off heading. Because the FMS was in lateral-navigation (LNAV) mode, the autopilot was commanded to turn towards it and the crew let it do so, even though the aircraft was clearly turning off-track while still descending, with airbrakes out, for an approach which follows a deep valley.
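The nature of that trap can be sketched in a few lines. The example below is hypothetical: the identifiers, beacon names and selection logic are invented for illustration and do not reproduce the actual Cali database entries or the full Arinc 424 rules. It shows only how a chart label and an FMS database key can diverge without any warning to the crew.

```python
# Hypothetical FMS navigation database: the key the crew type returns whatever
# the database stores, which need not be the label printed on the approach chart.
FMS_DATABASE = {
    "R":    {"name": "unrelated beacon",             "off_track_deg": 120},
    "ROZO": {"name": "beacon on the approach track", "off_track_deg": 0},
}

CHART_LABEL = "R"   # how the crew see the intended approach-track beacon labelled

def select_waypoint(entered_identifier: str) -> dict:
    """Return whatever the database holds under the entered identifier.

    Nothing cross-checks the result against the beacon the crew meant.
    """
    return FMS_DATABASE[entered_identifier]

waypoint = select_waypoint(CHART_LABEL)
print(f"LNAV now steering towards: {waypoint['name']} "
      f"({waypoint['off_track_deg']}deg off the approach track)")
```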

COMPLACENT PILOTS

The serious events listed in the HF review, however, represent only a few examples of the situations studied. The HF team also makes extensive use of pilot reports from the US aviation safety reporting system (ASRS), which often illustrate how automation can make pilots less safe. For example, says the report, because of its reliability and accuracy, automation tends to make pilots complacent; because of its complexity, it makes the aircraft more difficult to learn about and understand; and, because of its poor interface via the control and display unit, the FMS can make flying more work instead of less.

In the report, an expert is quoted as saying: "One of the myths about the impact of automation on human performance is that, as investment in automation increases, less investment is needed in human expertise. In fact, many sources have shown how increased automation creates new knowledge and skill requirements."

A pilot report to the ASRS, referring to dealing with an autopilot which had adopted an uncommanded climb, observes: "This is another case of learning to type 80 words a minute instead of flying the aircraft. The more automation there is in the aircraft, the harder the crew have to work to remain an active and integral part of the loop." Numerous ASRS reports quoted in the report confirm pilot concern that managing computers has often taken over from managing fundamentals such as flight-path, and that trusting flight-directors without checking raw data can lead to danger.

That none of these effects was what the designers or purchasers had in mind when the automation equipment and its interfaces were conceived, designed or installed is testimony to the fact that human factors in this area is a young, little-researched and little-understood science. The FAA's HF team report calls upon "the system" to correct that situation.

Source: Flight International