
David Learmount/LONDON

Pilots and maintenance engineers not only make mistakes; some also deliberately break the rules, a fact confirmed by studies on human performance presented at a seminar held by the UK Royal Aeronautical Society (RAeS) in London on 24 September.

The aim of the seminar, entitled Professionals performing poorly, was to understand what motivates people to ignore procedures. At the event, NASA's Ames Research Center quoted a recent study of human error in maintenance which found that 60% of errors were related to procedures and 25% to practices. This echoes findings by Boeing, which in 1994 analysed all the fully researched airline hull-loss accidents from 1982 to 1991 and concluded that more of the crashes studied could have been prevented by pilot adherence to published procedures than by any other "accident prevention strategy".

Training, whether initial or recurrent, is one of the more traditional ways to reduce mistakes, by setting out to instil the disciplines that make rulebreaking, or disregard of standard operating procedures (SOPs), less likely.

The NASA study quotes Albert Einstein: "Problems cannot be solved at the same level of consciousness that created them." So, if Einstein was right, training has to do more than impart skills and knowledge: it has to create a level of consciousness which enables the trainee to recognise problems met in the real working environment.

At the same time, training has to avoid inculcating attitudes which are problem creators themselves. Examples quoted at the RAeS seminar include encouraging a "can-do" attitude, or a belief that speed in completing tasks is, in itself, a good thing.

Among the traditional methods for preventing mistakes is punishment. This implies that safety could be improved simply by training pilots and mechanics more thoroughly and enforcing procedures more ruthlessly. Since that has not always worked in the past, perhaps it is time to examine both traditional training processes and the psychology of deterrence.

The consensus at the RAeS seminar was that it is no longer good enough to identify rulebreaking - or even unintentional mistakes - as an accident cause, then consider the investigation complete. The dominant theme which emerged from the meeting was the need to identify the reasons why pilots and engineers sometimes knowingly deviate from rules or standard operating procedures.

MISTAKES UNDER PRESSURE

The conclusions are similar to those reached at the Human factors in maintenance symposium held at London Gatwick earlier this year.

There can be more than one reason for a failure to adhere to procedures: ignorance of the procedure and "forced" mistakes made under pressure are just two. Pressure can take many forms, from an imminent threat to survival to the demands of a deadline for job completion. A "can do, come what may" attitude can be equally dangerous in an airliner cockpit or in a maintenance hangar.

Gerry Evans of the Association of Licensed Aircraft Engineers told the Gatwick conference that, before engineers are granted authorisation powers, they need special training. Employers, however, "...give to technically suitable persons a type course, which in today's world is conducted in the shortest possible time", said Evans. This, he says, poses the question: "Just how do we train our engineers? For a start we never train them to say 'No'." He explained further: "The granting of authorisation is a major change in an engineer's life, but how many are really prepared for such responsibility? The only information imparted on type courses is literal, a straight reproduction of the facts on which to be examined. The thinking process required to investigate defects is not addressed, and the proper use of manuals is left to the student to work at. Authorisation is then granted and from then on the individual is, in the main, left to develop himself. He becomes a product of his environment, a product which is not subject to audit."

The most fundamental reason for failure to follow an SOP is ignorance. This can arise either from a lack of knowledge or from a failure to understand. Both can be the result of poorly written operating manuals, bad training, or a combination of the two. Ignorance or confusion can also result from a surfeit of rules: "There are too many items referred to as SOP," observes a study by Roger Bennison and Andy Bodiam of the RAeS' human factors group. This, they say, leads to their effective devaluation "...so that the concept of the SOP has been lost".

A look at another transport industry's safety studies can be revealing. Railtrack, the privatised company responsible for operating and maintaining the UK's railway infrastructure, carried out a study of the reading ages required to understand various publications, then compared them with the reading age required to comprehend the Railway Rules and the Track Safety Handbook (TSH). Tabloid newspapers were found to demand a reading age of 15; a reading age of 17 was enough for the Bible's Ten Commandments; and 23.5 was needed for a chemistry textbook. The TSH required a college graduate reading age of 19.5, and the Railway Rules required university graduate level reading skills of 22.5.

"No pilot wants to have an accident," point out Bennison and Bodiam. "Therefore, any deliberate non-compliance with procedures or regulations must be justified in the pilot's mind, and so it is this justification which should be placed under scrutiny."

It is man's nature to question and to search for justification, they argue, posing the question: "Is non-compliance simply the price we have to pay for this asset?" Flightcrews can be faced with situations which existing procedures do not cover completely. At such times the crew's technical knowledge, basic intelligence and resourcefulness are essential.

INTENTIONAL DEFIANCE

Sociologists refer to the intentional breaking of rules as "deviancy", meaning the defiance of society's norms, values and laws, according to David Huntzinger of America West Airlines. One study, Huntzinger explains, developed a thesis called "the situational control theory". In this, three motivational components for deviancy were identified:

an adequate reward;
a high perceived probability of success;
no expected adverse reaction from peers.

Huntzinger conducted a study in which he interviewed 10 private pilots, 10 commercial pilots and 10 airline pilots. Each was asked to relate two stories about his experience: one in which he intentionally broke a rule and one in which he considered doing so, but did not. For each event, the pilots were asked about their motivation for breaking the rule, including perceptions of risk and peer reaction.

The primary rule-breaking motivation was economic (a factor in 77% of the decisions), and included saving time or money for themselves, the company or the customer. Most of the remainder were attributed to pride, duty, or a perceived gain in experience. Where the rules were actually broken, no appreciable risk was reckoned to be present on half of the occasions, and in the remainder the chance of success was seen as high despite the risk.

There was not a single case of anticipated adverse reaction from peers, whether the rule was actually broken or the breach was only considered. Peers included another pilot, a knowledgeable passenger, an observer or a company official.

In the cases where the pilots decided against breaking the rules, the majority thought the chances of being killed were high (24 out of 30) and seven were afraid of being caught (some pilots had more than one concern).

A clear pattern emerged, according to Huntzinger: "Within the aviation system there is a wide variety of economic temptations to break the rules. Company management is pushing schedule and cost; passengers demand to be on time. The pilot and other crewmembers have personal desires and timetables as well. This is the motivation."

TEMPTATION AND RISK

Faced with this temptation, Huntzinger says, the pilots use whatever information they have about the situation to estimate the probability of success of their actions. This estimate will contain some assessment of their own ability and the capability of the aircraft, given the outside environment. If they feel the chances of success are high and they are not challenged "...or, worse still, are encouraged to proceed", he maintains that they will break the rules.

Increased company surveillance is not the answer, says Huntzinger, explaining: "With the motivation in place, the potential offender will simply wait for an unmonitored opportunity". Line training can influence this, he says. Crew resource management, if well applied, will make peer challenge more likely. Slack simulator training, which allows a crew to carry out a late correction to a poorly aligned approach and then "land" when a go-around would have been the better decision, "...changes your perception of the probability of success and the consequences of such an action", he says. Airline adoption of a "no-fault go-around" policy helps a great deal, Huntzinger maintains.

It appears that ab initio training has a lot more to do than simply prepare students for a world which needs knowledge and skills. Recurrent training, type training and preparation for a responsibility upgrade also appear to have blind spots at present. Finally, the tools - especially the manuals - which are used to reinforce procedures seem to be merely catalogues of information rather than books designed for human beings to work with.

Breaking the rules

David King, principal investigator of engineering accidents at the UK Air Accident Investigation Branch (AAIB), gives the following examples of the common factors in three serious incidents in the UK during the past five years which could easily have caused catastrophes:

night shift at personal physiological lows;
supervisors tackling long-duration, hands-on tasks;
interruptions;
failure to use manuals;
confusing and misleading manuals;
shift handovers, poor briefing, lack of comprehensive stage sheets;
time pressures;
limited preplanning of paperwork, equipment and spares;
staff shortages;
trying to cope with all challenges even if it meant departing from procedure.

Why rules are ignored

According to papers presented at the RAeS seminar, many conditions lead pilots to disregard cockpit standard operating procedures (SOPs), including:

ignorance of the procedure;
badly written manuals leading to failure to understand procedures or the need for them;
too many rules/SOPs;
tiredness or fatigue;
time pressure from airline schedules, take-off slot time, crew duty time limitations, or a personal desire to "get home";
belief that departing from a procedure is "justified" either because of circumstances or because the rule is seen as unnecessary or wrong;
a "can do" mentality created, particularly in pilots, during early training;
economic pressure, particularly where pilots are paid only for the flying they perform, or when engineers are paid "piecework" rates.

Source: Flight International