Thursday 6 December 2018

Angle of Attack discussed here - Curt Lewis

Thinking illusions and second guessing Angle of Attack (AOA) instruments

By Andrew McGregor*


Introduction

In the aftermath of a serious accident or incident, the standard ICAO investigation process promises to identify the cause so that recommendations can be issued to prevent a recurrence. The whole industry believes in this and pretends that the recommendations are followed. But often they are not; or if they are, it takes an inordinate amount of time to implement them. Incidents are supposed to be treated as valuable opportunities to catch problems before they cause accidents, but this seldom happens, even though the paperwork is always filled out by someone. The official accident report into the AF447 crash referred to 28 incidents involving airspeed discrepancies similar to the one that caused the crash.

While the Annex 13 process provides accident reports which are open and transparent, incident reports are not commonly available. Nobody broadcasts a tally of incident reports that are similar or related. The assumption is that 'this is the first time it happened; if it had happened elsewhere, the "all singing and dancing" perfect system would have corrected it and fixed it. Therefore, you must be the only pilot that this has happened to. But don't worry, it is always possible to learn from your mistakes and do better next time'. In reality, similar incidents have often already happened elsewhere in the world, unknown to us, and caught others out too. Angle of Attack incidents are no exception. The Lion Air accident is not the first crash that has been caused by faulty AOA instruments. But more on that later.


Optical and Thinking illusions

Daniel Kahneman in his book 'Thinking, Fast and Slow' [1] explains how we succumb to cognitive or thinking illusions in the same way that we succumb to optical illusions. He develops this analogy scientifically.

Optical illusions occur when an observer sees something different to what is real. Lines of equal length are perceived to have different lengths; circles of equal size are seen to be different; straight lines are seen as curved. The eyes send a raw signal to the brain which is misinterpreted. The problem occurs in the brain, not the eyes.
What makes optical illusions notorious is that knowing that the perception is wrong does nothing to correct the perception; for example, the lines still look bent or curved even when a straight edge has confirmed that they are straight.

Kahneman demonstrates that cognitive or thinking illusions work just like optical illusions. Thinking illusions are pernicious, compelling and they influence us even when we know they are prone to error. Through many empirical demonstrations Kahneman and colleagues have shown that even when our minds understand rationally that our subconscious decisions may be flawed, we remain susceptible to them and often act on them regardless. The problem is accentuated by the fact that unlike with optical illusions, we have no straight edge with which to correct our thinking illusions.

There are three types of thinking illusions that cause havoc with our air accident investigation processes. These are:

The hindsight illusion. This is seeing a particular outcome as inevitable, but only after the event. This often leads to unreasonable criticism of the accident participants and also to a reluctance to modify or improve procedures to prevent it from occurring to someone else.

The skill illusion. We believe we are better than we really are, and expect the same of others. This discourages us from putting in place procedures to correct or manage our skill inadequacies.

The illusion of validity. This illusion relates to a work process or system of procedures in which we know consciously that the system is ineffective or not valid, but we continue to believe in it, or at least act on it, regardless of this knowledge. This discourages us from improving it.

These illusions are pernicious, powerful and everyone succumbs, often to the same types of illusions. Our systems don't yet have any defence against these. It is not hard to understand how these thinking illusions stop us from correctly determining all the factors that contributed to an accident and discourage recommendations from being actioned.


Applying all this to the Angle of Attack problem

The Angle of Attack problem is particularly significant in this discussion because pilots of both Airbus and Boeing aircraft are not provided with Angle of Attack information, yet are required to make decisions as if they were fully informed. The hindsight and skill illusions suggest that in the aftermath of an AOA incident, we will naturally expect a pilot without AOA information to have made better decisions than was reasonable under the circumstances. The illusion of validity will cause us to think that the present system, without an AOA readout to the pilot, is good enough and does not warrant improvement, even though rational thinking suggests that it is not. It will also discourage us from looking for evidence of other incidents and examples like the Lion Air accident, which collectively support the need to provide the pilot with AOA information. Some of these are discussed below.


Other Angle of Attack incidents and accidents

The Perpignan crash. [2] On 27 November 2008, an experienced Air New Zealand (ANZ) pilot and two experienced XL Airways pilots participated in a non-passenger demonstration flight to verify that an aircraft on lease to XL Airways was acceptable for return to ANZ. The pilots slowed the aircraft to near stall speed in order to verify the aircraft's stall protection system, but this failed because two of the three angle of attack instruments provided the same incorrect data without failing completely. Nor did they provide a clear warning to the pilots that they had failed, or that they disagreed with the third, correct unit that was outvoted. As a result the aircraft stalled and the pilots were unable to recover. The report stated that the pilots flew too close to the stall without formal flight test procedures or expertise, even though aircraft routinely fly close to the stall at cruise altitudes. The accident report stated that water from inadvertent fire hose washing likely remained in the AOA instruments and froze during the climb, causing them to read incorrectly. The accident report stated that the maximum flow rate for which the AOA meters are designed is only 7.5 l/min, which is less than a domestic garden hose can deliver. The report did not recommend that pilots be provided with AOA information or with a clear warning when the AOA instruments provide inconsistent information. The need to provide this information to improve the pilot's situational awareness was not mentioned.
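The outvoting failure described above is easy to demonstrate with a generic median voter, the common pattern for reconciling three redundant sensors. The sketch below is illustrative only; it is not the actual Airbus air-data logic, which is not public. It shows why two sensors frozen at the same wrong value defeat a majority scheme: the wrong value wins the vote, and the one healthy sensor is the one rejected as faulty.

```python
def vote(a, b, c):
    """Median-of-three voter, a common triplex-redundancy pattern
    (illustrative; not the actual Airbus implementation)."""
    return sorted([a, b, c])[1]

def flagged_sensors(readings, tol=2.0):
    """Flag any sensor deviating from the voted value by more than tol degrees."""
    voted = vote(*readings)
    return [i for i, r in enumerate(readings) if abs(r - voted) > tol]

# Two sensors frozen at the same (wrong) 4.0 deg; the third reads the true 12.0 deg.
readings = [4.0, 4.0, 12.0]
print(vote(*readings))          # the frozen value wins the vote: 4.0
print(flagged_sensors(readings))  # only the healthy sensor is rejected: [2]
```

The failure is silent precisely because the monitoring logic sees two sensors in perfect agreement, which is what a healthy system would also look like.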

Whether the causes and contributing factors were due to environmental effects, sensor defects, or sensor washing, formal compliance with JAR 25.1309 required the aircraft designers to address the consequences of blockage of one, two or three sensors resulting in a false stall warning. However, the official accident report did not address this shortcoming, which the Perpignan accident pilots unwittingly demonstrated at the cost of their own lives. The official accident report also provided an opportunity for the designers to consider the case of an aircraft climbing with frozen AOA sensors, as occurred in the next angle of attack incident, the Bilbao incident. Regrettably this opportunity was not taken.

The Bilbao incident. [3] On 5 November 2014, an Airbus A321 departed from Bilbao, Spain. Partway through the climb, the pilots experienced an uncommanded, aggressive nose-down maneuver which could only be managed with full backward side stick. The pilots communicated with their engineering technicians, who, through the ACARS system, were able to determine that there was a problem with inconsistent Angle of Attack information and advised the pilots to turn off one of the Air Data Reference Units. This placed the aircraft in Alternate Law and enabled the pilots to regain improved control. The investigation concluded that two of the AOA sensors had frozen or jammed at a lower altitude earlier in the climb, eventually activating the stall protection as the aircraft climbed and lowering the nose of the aircraft into an aggressive descent. After this incident the manufacturer promulgated recovery guidelines similar to what the pilots implemented during the incident, but the cause of the AOA instruments freezing or jamming was not determined.
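The delayed onset in this sequence, sensors jammed early in the climb but protection triggering only later, is consistent with a protection threshold that tightens as Mach increases: the frozen reading stays constant while the threshold descends to meet it. The sketch below uses invented numbers purely to illustrate that mechanism; the real alpha-protection schedule is type-specific and not public.

```python
def alpha_prot_threshold(mach):
    """Illustrative alpha-protection threshold (degrees) that shrinks as
    Mach rises. Numbers are invented for demonstration; the actual Airbus
    schedule is proprietary and type-specific."""
    return 10.0 - 8.0 * mach

FROZEN_AOA = 4.5  # deg: value at which the sensors jammed early in the climb

for mach in (0.3, 0.5, 0.7):
    active = FROZEN_AOA > alpha_prot_threshold(mach)
    print(f"Mach {mach}: threshold {alpha_prot_threshold(mach):.1f} deg, "
          f"protection {'ACTIVE' if active else 'inactive'}")
```

At low Mach the frozen value sits safely below the threshold; only as the aircraft accelerates in the climb does the shrinking threshold drop below the stuck reading, which would explain a protection activation well after the sensors actually failed.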


Intensification and unpredictability of future weather systems

IATA [4] has expressed concern about the unpredictability of tropical weather systems at convergence zones and their potential to subject aircraft instrumentation to conditions more severe than their design case. They have no solutions. As global warming intensifies, these environmental challenges are likely to worsen, potentially exposing air data sensors to more severe conditions than the effects of a domestic garden hose.
The earlier incidents demonstrated complex AOA failure scenarios which, while different in some respects from what is currently known about the Lion Air crash, also have some similarities. The earlier incidents and accidents show that the recent additional guidelines issued after the Lion Air accident may not be sufficient to address every possible AOA-related situation that a pilot could be faced with as weather systems intensify. A clear aid that would improve a pilot's situational awareness and decision making in nearly all near-stall and AOA malfunction situations is the supply of AOA information to the pilot.

Formulation of the necessary changes needs to be based on information from less severe incidents, which need to be tracked and collated. The resulting findings need to be made more available, in the same way that reports of single accidents are open and transparent. The feedback from this new process is needed to correct the illusion of validity: the illusion that compels us to believe that the present system, which doesn't give the pilot access to critical AOA sensor status information, is adequate. Does all that feel a little too ambitious? Welcome to the world of thinking illusions.

* Andrew McGregor is a forensic engineer, commercial pilot and experienced air accident investigator. He owns and directs Prosolve Ltd, a forensic engineering practice based in Auckland, New Zealand. He may be contacted at amcgreg@prosolve.co.nz
