The human-factors element in flight safety is now being taken seriously.

David Learmount/WARSAW

The world's flight-safety specialists have given up trying to eliminate human error. Now, the aim is to understand error and to control, or "manage", it. This strategy holds the key to improving airline flight safety, they say.

Since the late 1970s, safety-conscious airlines have been pinning their flight-safety hopes on human-factors (HF) education to reduce mistakes and thus accidents. Airlines which have embraced and practised HF concepts such as crew-resource management (CRM) have seen improvements, but the decrease in the accident rate is still not good enough.

"Good enough" is relative. The measurable benchmark is the average number of airline accidents over recent years, which has stopped improving and appears to have "levelled out". If safety rates remain more or less the same, however, and the world fleet and air travel increases, the number of accidents annually may rise, alarming the public and damaging business.

Since aeroplanes and their engines are becoming more reliable, and because aircrew error is judged to be a factor in well over half of all serious accidents, the elimination - or at least reduction - of human error has been the goal for the last 20 years or so.

Pilot Infallibility

Things are now changing radically, however, as HF studies in the cockpit and in industry have advanced the understanding of what can realistically be expected of human beings at work. Not surprisingly, pilot infallibility has been ruled out, to be replaced by the concept of the "acceptable error".

Applied to the flight-deck, this concept could send shock-waves through a training system which has traditionally used errors during training flights and check rides as the primary direct, quantitative, measure of pilot safety and competence. Airline or aviation-authority check pilots have largely catalogued errors in procedure, technique, or the spoken word, then passed or failed the pilot on the score. Some check captains made little distinction between large and small errors, or between errors with and without secondary effects; what counted was the quantity.

Now, it is proposed that check pilots should stop being "error-detectors" and start to observe how crews manage the consequences of any errors they make so as to ensure that safety is not compromised. Any astute examiner should be able to recognise genuine incompetence, so that is not the issue.

CRM concepts, which shifted the emphasis to crew efficiency in carrying out the overall task safely, have already modified the "trapper" training-captain mentality, but the concept of acceptable error may require a major change in the mindset of today's average company training captain or check pilot.

If the consensus reached at October's International Air Transport Association Human Factors seminar in Warsaw, Poland, is an indicator of things to come, however, the "error-management" concept is definitely on its way into flight-deck training and checking.

Managing Mistakes

Whether the speakers were academics, aviation psychologists, pilots or airline safety officers, all agreed that the new aim in line-flying checks and CRM-based line-oriented flight training (LOFT) will not be error-elimination, but error-management.

Professor Robert Helmreich, director of the University of Texas' aerospace crew-research project, says: "The errors are not the issue. It is whether or not they are resolved and have no consequences which matters." The head of the International Civil Aviation Organisation's (ICAO's) flight-safety and HF programme, Capt Dan Maurino, agrees, explaining that it would be unnerving to be a check captain to a crew which made no errors at all in a LOFT trip or line check, because there would be no way of knowing how they would react when they did make a mistake. Mistakes are only a matter of time, even for the near-perfect pilot.

Dr James Reason of Manchester University, UK, a psychologist recognised worldwide as an authority on HF and human error, says: "The best people can make the worst mistakes. Inattention, forgetfulness, preoccupation are the last and least manageable parts of an error sequence. Managing error-producing situations is better."

Reason uses the old malaria-control analogy: "Don't spray the mosquitoes, drain the swamp." He says that occupational-error management has to cover four areas: the person (the operator), the task, the workplace and the organisation.

Error Containment

Most organisations, Reason says, focus on the person. "They blame and train; write another procedure; search for a missing piece of knowledge [in the person] or an inherent tendency to make an error... [however] it is better to try to eliminate error-provoking situations." Since error will still occur, however, Reason says that error-containment is an essential procedure.

To reduce the chance of error on flight decks, some airlines rely on standard operating procedures (SOPs), while others say that automation is the answer. Not only can these create their own problems, however, but the approach is wrong, maintains Dr Gary Klein of aviation consultants Klein Associates: "...procedure is no substitute for experience".

He observes that Gandhi said India's British rulers believed they could create a set of laws so perfect that "...people no longer have to be good". Procedures are useful, however, not just in the form of checklists, which serve to cut out errors of omission, but as an aid to what Helmreich calls "uncertainty avoidance", which some cultures appear to find more necessary than others.

Klein says that it is not lack of error, but total pilot situational-awareness which is the keystone of a safe flight-deck crew. Situational awareness, however, is impossible for a crew with no experience of a situation, he points out. One of the reasons why pilots struggle with computerised flight-management systems, says Klein, is that becoming familiar with all their modes and quirks takes time. Situational awareness can suffer in the meantime, and always does when automation produces a surprise.

The University of Illinois' Dr Nadine Sarter, a member of the team which produced the US Federal Aviation Administration-led international report on HF in highly automated flight-decks (Flight International, 9-15 October, P26), chose the 1994 China Airlines accident at Nagoya, Japan, as a classic example of how automation, through mode-confusion, can destroy situational awareness.

It was also chosen, however, as an indictment of the kind of pilot training which ICAO's Maurino criticises as being based on 1970s concepts for a 1980s airliner (the Nagoya accident involved the Airbus A300-600).

Maurino is not the first to talk about safety being a "culture", but he attempts to define the basic components of safety culture. Safety starts as a social value based on the perceived value of human life, he explains. Organisational or corporate values are rooted in the prevailing social values. In the end, however, safety is, as Maurino puts it, "a subjective process of risk evaluation and acceptance".

There was universal agreement at the seminar that a blame-free regulatory and corporate culture is essential to any plan to upgrade airline safety. Under the reverse, a punishment-based culture, pilots keep potentially useful information about incidents to themselves, so that nothing can be learned about training or about Reason's "error-producing environment".

Maurino describes the pilot as the whole aviation system's "goalkeeper", and slams a system which, he says, provides pilot training based on 1970s principles employed in 1980s-designed aircraft flying in a 1990s environment.

Airbus operational-evaluation manager Jean-Jacques Speyer brought together the two positive concepts of a blame-free culture and the acceptable error when addressing the seminar on blame-free incident reporting. He explains: "Errors are the feedback which allows for improvement... [errors] are fertile - they are the fuel for the safety system."

Maurino puts it more dramatically, saying: "So long as we have our current reactive attitude [to flight safety], we need accidents to know when we have overstepped the boundaries." An accident, at present, is the main stimulus for safety advance, Maurino says. Apart from the moral implications of such an approach, it is not even effective, he maintains, because accident investigators today do not consider it their task to investigate in any depth the HF contributions to accidents. Until they do, Maurino insists, lessons from accidents "...will continue to reinforce areas of safety which are already very good, and fail to deal with the weaker aspects."

While praising the blame-free safety culture, Helmreich acknowledges the need for discipline and well-defined boundaries, insisting that a blame-free culture "...does not include acceptance of wilful flouting of procedures".

Greater knowledge of the kinds of errors which pilots make can enable appropriate change. Speyer explains that Airbus has just begun trying to launch its Aircrew Information Reporting System (AIRS) as an airline-operated internal confidential reporting system (Flight International, 9-15 October, P8). He says that, even though Airbus provides the AIRS equipment and training in structured, key-word-based reporting, persuading most airlines to become involved will be "arduous", because the system can succeed only in a no-blame airline culture. Without such a culture, pilot reports will not be forthcoming, and the airline will not be able to share the de-identified data.

Accident Prevention

Boeing human-factors specialist Dr Curt Graeber, however, is sceptical of the ultimate policy-determining value of any pilot-based narrative reporting system - even the much-praised US Aviation Safety Reporting System. His scientific thesis is that, in developing accident-prevention strategies, perceived information is of less value than measured data, such as that obtained from flight-data recorders or learned from studying accidents and incidents.

Meanwhile, as the experts debate the best means of getting the industry to understand and accept the new culture, no-one argues that the faultless pilot exists. Training pilots to cope with their own mistakes, while continuing to act as "goalkeepers" for an imperfect aviation system, therefore seems logical.

Overshooting the centre line is an error, but not in itself dangerous. It is the action the pilot takes to rectify the situation which is safe or unsafe.

Source: Flight International