Investigators from the Netherlands have defended their inquiry into a Turkish Airlines Boeing 737-800 crash near Amsterdam, after suggestions that the final report into the accident was watered down in response to US remarks.
The Dutch Safety Board, in its response, has published in full a human-factors analysis which contributed to the investigation and which, in particular, had highlighted concerns over a single-point failure path in the 737’s radio altimeter.
Single-point failure of an instrument, specifically the angle-of-attack sensor, has become a focus of the recent inquiries into the Lion Air and Ethiopian Airlines Boeing 737 Max accidents.
In a 20 January article, The New York Times argued that the decision not to publish the human-factors analysis of the Turkish crash – combined with the incorporation of US representatives’ remarks into the draft report – effectively deflected criticism of crucial aspects of Boeing’s design choices a decade before the Max accidents.
But the Dutch Safety Board has rejected any suggestion of yielding to pressure, insisting it is “strictly independent” in its investigative work.
While US representatives, including Boeing, made 126 separate remarks on the draft report, these comments and the investigators’ responses were fully and transparently documented in an appendix to the final report, it adds.
This appendix, running to 79 pages, also contains 53 remarks from Turkish Airlines and the Turkish civil aviation regulator, as well as comments from several other parties – including French and UK investigators, and the European Union Aviation Safety Agency.
The Dutch Safety Board states that the human-factors study was not published alongside the final report because “it was not the practice” at the time.
“Current practice is different,” it says. “The [safety board] now publishes as much as possible.”
Turkish Airlines flight TK1951 crashed during an ILS approach to Amsterdam Schiphol on 25 February 2009, after an incorrect left-hand radio altimeter reading activated an autothrottle mode which reduced engine thrust.
The aircraft was descending at the time, in order to intercept the glideslope, and this “obscured” the autothrottle’s change of mode – resulting in a speed decay, which the crew failed to notice, and a stall.
Dutch Safety Board investigators pointed out that the first officer, who was flying, was following his primary flight instruments including height measured by the right-hand radio altimeter.
But the left-hand radio altimeter, while reading incorrectly, did not categorise its reading as an error – which meant that, critically, it maintained control over various aircraft systems including the autothrottle. The crew was unaware of this and could not have known about it, the inquiry stated.
The human-factors analysis states that the crew, which had switched to the right-hand autopilot and right-hand flight-control computer, “would have believed that they had protected their aircraft” from any problems with the left-hand radio altimeter.
“What is not in Boeing 737 documentation and training available to pilots is that the autothrottle always gets its height information from the left radio altimeter,” it says.
“The knowledge available through training and pilot documentation is so underspecified that it in fact can create a false or buggy mental model about the inter-relationships between the various automated systems and their sensor input.”
In its final report the Dutch Safety Board did reference this design peculiarity of the aircraft involved in the crash.
“This is a relic from the Boeing 737, certificated long ago, which in the original design prioritised the provision of information to the [captain, seated on the left side],” it stated.
“It is noticeable that this subject cannot be found in any of the Boeing 737 manuals or training documents for pilots. Pilots therefore do not have the correct knowledge about links between the control systems and data input for their own aircraft.”
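The dependency the inquiry describes can be pictured with a minimal sketch, using hypothetical names, values and thresholds rather than Boeing’s actual software, in which the autothrottle draws radio height only from the left-hand unit, so that selecting the right-hand autopilot and flight-control computer does nothing to take a faulty left sensor out of the loop.

```python
# Minimal illustrative sketch (hypothetical names, values and thresholds, not Boeing software):
# the autothrottle reads radio height only from the left-hand sensor, so selecting the
# right-hand autopilot and flight-control computer does not remove the dependency on a
# faulty left radio altimeter.

RETARD_HEIGHT_FT = 27  # illustrative threshold below which the autothrottle commands idle thrust


class RadioAltimeter:
    def __init__(self, side, reading_ft, flagged_invalid=False):
        self.side = side
        self.reading_ft = reading_ft
        # In the accident, the erroneous left-hand unit never declared its output invalid
        self.flagged_invalid = flagged_invalid


class Autothrottle:
    """Always fed by the left radio altimeter, whichever autopilot the crew selects."""

    def __init__(self, left_radio_altimeter):
        self.left_radio_altimeter = left_radio_altimeter  # the single-point dependency

    def thrust_command(self):
        ra = self.left_radio_altimeter
        # No cross-check against the right-hand sensor: the single-point failure path
        if not ra.flagged_invalid and ra.reading_ft < RETARD_HEIGHT_FT:
            return "IDLE (retard mode)"
        return "SPEED (maintain approach speed)"


# The crew flies on right-hand systems, believing the left-side fault is isolated...
left = RadioAltimeter("left", reading_ft=-8)      # erroneous reading, not flagged as invalid
right = RadioAltimeter("right", reading_ft=1950)  # plausible height while established on approach

autothrottle = Autothrottle(left_radio_altimeter=left)
print(autothrottle.thrust_command())  # -> "IDLE (retard mode)", despite the aircraft still being high
```

Because the left-hand unit never declared its output invalid, the thrust reduction was a “correct” response to the data the autothrottle received, which is why the analysis argues the pilots were left as the only defence.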
The human-factors analysis highlights the risks of single-point failures.
“The only defence against a designed-in single-failure path [is] the pilots who are warned to mistrust their machine and to stare at it harder,” it says.
“Such a reminder, oriented only at the human operator in the system, is hardly credible after three decades of in-depth research into automated airliner flying and the subtle and pervasive ways in which automation on the flight deck – and particularly its subtle failure – affects human performance.”
Crews would not have been insulated from the “automation surprise” which emerged on the Turkish Airlines flight, it adds, owing to an absence of sufficient training, written guidance or documentation, or line experience.
The Dutch Safety Board insists its final report “clearly” shows the “main responsibility” for the loss of the Turkish aircraft “lies with Boeing”.
It adds that the continuing investigation into the 737 Max accidents should include analysis of “whether sufficient lessons have been learned by Boeing and the US authorities” following its inquiry into the Turkish crash.