When an airline makes a mistake affecting safety, there are normally others in the industry who know it has taken place, but they say nothing. If the error was unintentional and appears to be a one-off, the reaction of industry peers is usually: "There but for the grace of God go I."

But, if the same airline keeps making the same mistake - like regularly arriving at an airport with dangerously low fuel - it starts to look like a flagrant disregard for rules and procedures. What do the airline's peers do then? Often, in practice, they continue to say nothing to the authorities, even when the situation is serious.

If the organisation which knows about a repeated safety shortcoming also happens to be a supplier of goods or services to the miscreant carrier, the situation contains even more dilemmas. The contract may be valuable, and the engineer or supplier confronted with the repeated fault is torn between the duty to report it and the fear of losing the contract by blowing the whistle.

If the supplier reacts, the first report is normally made to the airline, to bring the situation to its attention. But what if the report has no effect? Such a state of affairs arose recently at London Heathrow Airport.

Fortunately, in the UK and some other countries, there is a safe haven for those who feel they must report a dangerous situation but want their identities kept secret. The system, in the UK's case, is the Confidential Human Factors Incident Reporting Programme (CHIRP).

The existence of CHIRP ensured that a serious risk to many hundreds of lives came to light before an accident occurred. The original report was de-identified in the CHIRP system, and then passed to the UK authorities, who contacted the government departments in the country where the airline was registered.

Why was the airline allowed to arrive at Heathrow with dangerously low fuel about a dozen times before someone in the industry blew the whistle?

The potential for a horrific accident if the aircraft were forced to go around ought to have been enough motivation for those who knew about it to file a report. Perhaps the onus should be on the suppliers themselves to report such incidents to the relevant authorities, rather than leaving it to individual employees.

The truth is that nobody loves whistleblowers, and this aversion seems to span all cultures. Those who feel they have a duty to report safety shortcomings must make the lonely decision as to whether the sole outcome of their action will be the loss of their job or their contract while the fault goes uncorrected. The oft-quoted cliché that it is pointless and unfair to "shoot the messenger" rings hollow for those laden with the message. The world shoots messengers all the time.

So, using the Heathrow low-fuel arrivals case as a generic example, what was the essential problem? The details are not yet clear, but either messages sent to the airline were not heeded, or messages were held back until the situation became serious.

That it took a fully confidential reporting system to bring the matter to light is the ultimate testimony that there were no effective communication links between the two parties.

A cautious attitude towards institutionalising whistleblowing is grounded partly in the fear that employees with an axe to grind will flood the system with false or irresponsible reports. In fact, they are free to do that now, and the system survives it. The essential step is to stop thinking about whistleblowers, with all the connotations the word carries, and to recognise responsible reporters for what they are.

The international nature of the air transport industry means that there are different cultural approaches to the handling of information, and national borders can mean that communication is delayed by bureaucracy. In a world of multi-national airline alliances, all of whose airlines market a single product, hanging on to different standards for the handling and communication of safety information is not acceptable. Technical and operational information does not have a national culture. Meanwhile, good confidential reporting systems are an imperfect, but essential, safety net.

Source: Flight International