Illusions of safety
16 April 2015
'Copy and paste' of written technical material does not equate to a safe system.
Dr ROB HUNTER, Head of Flight Safety, British Airline Pilots Association (BALPA), comments on the rise and fall of safety management systems.
One of the most significant changes in the management of risk in the aviation industry is the increasing reliance on safety management systems (SMS). In their elemental form, these ‘systems’ consist of a tailored risk assessment undertaken by the organisation that generates the risk. This assessment relies on the identification of hazards and then the gathering and interpretation of risk data. Mitigations for the risks identified are put in place so that a more or less defined level of safety is maintained. Hence, SMS is a kind of over-engineered common sense. The regulatory oversight of an SMS generally involves the inspection of the practices and documents held by the organisation, which are taken as evidence that the procedures are being applied in practice. SMS — in so far as they are tailored to particular hazards — are generally contrasted with rule sets that determine what may be allowed (prescription) or may not (proscription).
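One way to picture the elemental form described above is as a risk register: identified hazards are scored, mitigations are recorded and anything above a tolerability threshold is flagged. The sketch below is a minimal, hypothetical illustration only; the hazards, the 1–5 scoring scale and the threshold are assumptions for the purpose of the example, not taken from any real operator’s SMS.

```python
# Minimal sketch of an SMS-style risk register (illustrative only).
# The hazards, the 1-5 scoring scale and the acceptability threshold are
# assumptions chosen for illustration, not drawn from any real operator's SMS.
from dataclasses import dataclass, field

@dataclass
class Hazard:
    description: str
    likelihood: int                      # 1 (rare) .. 5 (frequent) -- assumed scale
    severity: int                        # 1 (negligible) .. 5 (catastrophic) -- assumed scale
    mitigations: list = field(default_factory=list)

    def risk_score(self) -> int:
        # A simple likelihood x severity score, a common SMS convention
        return self.likelihood * self.severity

ACCEPTABLE_RISK = 6                      # assumed tolerability threshold, for illustration

register = [
    Hazard("Crew fatigue on overnight rotations", likelihood=4, severity=4,
           mitigations=["revised rostering", "controlled-rest procedure"]),
    Hazard("Runway excursion on a contaminated runway", likelihood=2, severity=5,
           mitigations=["updated landing-performance policy"]),
]

for hazard in register:
    status = "tolerable" if hazard.risk_score() <= ACCEPTABLE_RISK else "requires further mitigation"
    print(f"{hazard.description}: score {hazard.risk_score()} ({status})")
```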
Prescriptive and proscriptive rules alike are often misleadingly referred to as ‘prescriptive’ regulation, or even more misleadingly as ‘one-size-fits-all’ regulation, as, in practice, these rules are rather more discriminating. There may be different rules for different levels of risk, such as commercial versus private aviation, and so, in practice, the rules are typically ‘a-number-of-sizes-fits-all’. The number-of-sizes-fits-all approach generally has a desired level of safety that is prescribed by a body that is independent of the operator. The SMS approach, however, may have a desired level of safety that is, in effect, determined by the operator; an example is the risk assessment for the overflight of conflict zones. There are also regulations that appear to have an independently determined level of safety but are written in a way so open to interpretation that they are, in effect, also determined by the operator. Examples are fatigue risk management rules, where key terms have no precise meaning and, fundamentally, there is no definition of ‘how tired is too tired to fly’. It is possible that the vague language of such regulations is by intention rather than accident. Regulators may be fearful of producing rules that leave operators hamstrung for years, yet regulators nonetheless have to regulate; writing rules that place a firm requirement to actively do something nebulous can seem like a good compromise.

Lowest common safety denominator
Contrary to CAA statistics of two reported instances of pilots falling asleep in the cockpit over 30 years, BALPA believes that such incidents may be happening at least once every day. (Austrian Cockpit Association)
As part of the growing adoption of the SMS method, levels of safety are commonly, whether openly or covertly, at the discretion of the operator. One of the drivers for the move towards this concept of self-determination of risk is the bluntness of independently-described levels of safety as a safety instrument. For example, the motorway speed limit does not mean that all cars travelling at the maximum speed limit have an equivalent level of safety, because, among many other factors that determine safety at speed, cars with modern braking systems have shorter stopping distances. In this regard, a better level-of-safety-based maximum speed limit might be the maximum speed at which it has been demonstrated that the vehicle can stop within, say, 300 m. However, despite the fettering limitations of the independently-described safety limit, this approach, taken in setting speed limits, blood alcohol limits, aircraft weight limits and so on, can be a pragmatic, cost-effective approach to safety assurance.
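To make the stopping-distance idea concrete, the sketch below works out the highest speed at which a vehicle could be shown to stop within 300 m. The reaction time and the braking decelerations are assumed values chosen purely for illustration; the point is only that a level-of-safety-based limit would permit different speeds for different braking performance.

```python
import math

# What maximum speed keeps the total stopping distance within 300 m?
# The reaction time and the braking decelerations are assumed values for illustration.

def max_speed_ms(stop_within_m: float, reaction_s: float, decel_ms2: float) -> float:
    # Stopping distance = reaction distance + braking distance:
    #   v * t_r + v**2 / (2 * a) <= d
    # Solving v**2 + 2*a*t_r*v - 2*a*d = 0 for the positive root:
    a, t, d = decel_ms2, reaction_s, stop_within_m
    return -a * t + math.sqrt((a * t) ** 2 + 2 * a * d)

for label, decel in (("older braking system (6 m/s^2)", 6.0),
                     ("modern braking system (8 m/s^2)", 8.0)):
    v = max_speed_ms(300.0, 1.5, decel)
    print(f"{label}: about {v * 3.6:.0f} km/h")   # roughly 186 vs 210 km/h
```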
Moreover, having the level of safety determined by the operator is not without its problems. In assessing overflight risks on 17 July last year, some airlines considered it safe to fly over Eastern Ukraine; others did not. The shooting down of MH17 has thrown into stark relief the variable output of the SMS method, yet there are many more features of the SMS method that deserve our critical attention.

Critical evaluation
A European Cockpit Association survey showing percentages of pilots stating that they have either fallen asleep without planning (grey) or experienced ‘micro-sleep’ episodes while on duty (red). (European Cockpit Association).
In this article I preferentially focus on some of the problems of SMS, as elsewhere these systems are heavily, and largely unquestioningly, promoted. SMS are here to stay, and I believe that it does not serve the flight safety agenda to have the SMS arena filled with too many cheerleaders and not enough critics. To make SMS work, participants need to be able to critically evaluate the design and operation of their own systems.
In principle, the SMS method is sound, in so far as the system has the ambition of identifying and managing all hazards appropriately. However, in practice, SMS do not generally consider that the SMS itself could be a hazard. The factors that may turn an SMS into a ‘house of cards’ generally arise from conflicting interests in the human designers of the SMS. Such human factors can act at individual and organisational levels, in both the operator and the regulator. An individual, such as a manager, can contrive the design of the system to serve their own needs, or the design can be contrived to suppress the reports of individuals who may be fearful of the consequences of reporting. For example, some pilots say that they are fearful of reporting fatigue because they will become embroiled in company investigations that have a quasi-disciplinary tone. It is less fatiguing to put up with fatigue than to report it.

An example of the likely scale of under-reporting was illustrated by a Freedom of Information (FOI) request to the Civil Aviation Authority (CAA) in 2012. The request asked for the number of occasions on which pilots had reported involuntarily falling asleep in the cockpit; such occurrences are required in law to be reported to the CAA. The response revealed that there had been two such reports in a 30-year period. Working from models of sleepiness and knowing pilot rosters, it is likely that this actually occurs at least every day, if not every hour; indeed, in the window of circadian low, in the early hours of the UK morning, it could be happening more or less continuously. Notwithstanding the socio-political disincentives to fatigue reporting, micro-sleeps of less than two minutes generally occur without awareness, and drowsiness with its associated performance decrement can also go without subjective awareness.

At the organisational level, the fundamental conflict is between productivity and safety. Statements such as ‘safety is our number one priority’ and ‘if you think safety is expensive, try having an accident’ are aimed at having us think that this conflict is unlikely to be anything more than a theoretical possibility. However, these statements warrant closer consideration, because ‘trying to have an accident’, in so far as it can mean running a greater risk of having an accident, has a different meaning to ‘having an accident’. For a small airline, at current fatal accident rates, if the airline were to maintain an industry-average level of safety it may not see a fatal accident for 80 years or so (a rough calculation illustrating this is sketched below). Hence, if the airline CEO did think that safety was expensive and decided, so to speak, to try having an accident by halving the safety budget, the CEO could well find that the airline would still not see the attributable accident for decades, by which time the CEO would be long gone. If you think safety is expensive, you could well find that it was true and that, from the point of view of the financial survival of the airline, trying to have an accident was a great idea, because the accident was still unlikely to actually happen, yet you get all the immediate benefits of the cost saving. The management guru Drucker famously stated: ‘The first duty of an organisation is to survive’. In this regard, claims by some operators that ‘safety is our number one priority’ may be disingenuous.
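As a sanity check on the ‘80 years or so’ figure, here is a back-of-the-envelope sketch. The fatal accident rate and the annual flight volume are assumed, order-of-magnitude values chosen purely for illustration; they are not BALPA or CAA statistics.

```python
# Back-of-the-envelope check of the "decades between fatal accidents" argument.
# Both inputs are assumed, order-of-magnitude figures for illustration only.

fatal_accidents_per_flight = 0.2 / 1_000_000   # assumed: ~0.2 fatal accidents per million flights
flights_per_year = 60_000                      # assumed: a small airline's annual flights

expected_accidents_per_year = fatal_accidents_per_flight * flights_per_year
mean_years_between_accidents = 1 / expected_accidents_per_year

print(f"Expected interval at the assumed rate: about {mean_years_between_accidents:.0f} years")
# Even if a halved safety budget doubled the accident rate, the expected interval
# would still be roughly 40 years -- far longer than a typical CEO's tenure.
print(f"Expected interval at double the rate: about {mean_years_between_accidents / 2:.0f} years")
```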
If spending on safety would put an airline out of business, it is generally better to save the money today, so that tomorrow you can think about being safe.

Beyond prescriptive regulations
‘You’ve got to draw the line somewhere’ — a memorial to Samuel Plimsoll who campaigned in the 19th century for load lines on ships to enhance safety, against the interests of the commercial shipping industry. (Wikipedia)
So-called prescriptive regulation is frequently portrayed as the first form of safety assurance, with the ‘new’ systems of safety management presented as a superior evolution in safety assurance. The part-truth of this is that safety management in the aviation industry has concentrated on accidents that have occurred and on making recommendations to ensure that they do not happen again. Now that accident rates are so low, it is reasonable, in order to seek further safety improvement, to concentrate on safety process, which is a forte of the SMS method.
However, the effectiveness of this approach is difficult to measure and there is plenty of evidence of safety failures in SMS-rich environments. In this regard, the shift in regulatory strategy towards SMS is much more experimental than is commonly portrayed. Notwithstanding this, there are many cases in which pre-existing forms of self-managed risk assessment and mitigation, an SMS by any other name, failed, often in some very public and catastrophic way, and were then replaced by a number-of-sizes-fits-all regulation at the behest of government. In this way the trend towards SMS may be not an evolution but a reversion.

An illustration of this is the Plimsoll load line on ships. Prior to the 1876 Merchant Shipping Act, ship owners were judged to be best placed to determine how heavily loaded their ships would be. Seamen and ships’ captains who attempted to refuse to go to sea in overloaded ships were coerced into doing so. Despite the losses of overloaded ships at sea, it was argued that safety was the paramount interest of ship owners and, on this basis, that regulation was unwarranted interference. The MP Samuel Plimsoll campaigned against fierce commercial interest to obtain a load line on ships. At first this load line, known as the Norwood line, was to be determined by the ship owners. This self-determination of risk, which could so obviously be biased by the commercial interests of the ship owners, was ridiculed at the time; one ship’s captain famously sniped that he would paint the line on the funnel of his ship! It was the combination of the sustained efforts of Plimsoll, the continuing loss of merchant seamen’s lives at sea and the political pressure of public sentiment that led to the position of the load line being determined by an independent body. The expression “You’ve got to draw the line somewhere” was coined during the Plimsoll parliamentary debates, which were extensively covered in the media of the day.

Who knows best?
Who is the best person to evaluate the risk in the cockpit? (Airbus)
SMS that identify the wrong expert to design the system and to populate its hazards, risks and mitigations are vulnerable. Although managers are commonly held to know the risk best, this may not be the case in reality; it may be the worker in the field who has the best appreciation of a particular risk. Sometimes the person who is well placed to assess the risk may not be best placed to manage that risk. For example, in the moments before their death, drivers killed by falling asleep at the wheel generally know that they are sleepy but still continue to drive, because their fatigue impairs their ability to appreciate the risk. It can also be the case that the person who best knows the risk is also the most able to conceal the risk, should they be so minded.
SMS have a component of board-level accountability and this can be a good thing. The board are seen as the owners of the risk because they generate the risk and because they carry some jeopardy for it. However, the board does not have as much jeopardy as the occupants of the aircraft, who may be killed if the aircraft were to crash. The problem with the risk owner (the airline board) being someone different from those who have the substantive jeopardy for the risk (the crew and passengers) is that it facilitates the creation of a system which is, in effect, not an SMS but a ‘BMS’ – a blame management system. This is because the principal risk for a board is not that they are killed in one of their aircraft, but whether they are blamed for someone else being killed in their aircraft. A blame management system may not have safety as its primary goal, because its primary goal is the prevention of blame.

Owned science
The SMS method is vulnerable to the problem of ‘owned science’. Earlier I likened SMS to ‘over-engineered common sense’. The ‘engineering’ is largely the application of scientific method to the gathering and interpretation of data. A principle of scientific work is that of peer review. This is a system which exposes conclusions to greater scrutiny and, through careful description of the methods involved, allows reproduction of the experiment and verification of findings. Where organisations commission science to support an industrial practice of high commercial value, because they own the data, they can conceal, or decline to study, findings that are not in their interest to expose, and promote those that are.
SMS may reasonably allow operators to take into account their ‘operational experience’ to support new safety practices or to amend old safety practices of no proven value. However, ‘operational experience’, where regulation allows it to be relied upon, is generally not defined. Rather than having some firm statistical basis, it may amount to little more than anecdote: a feeling that something has been got away with so far, so it must be safe; worse still, a feeling that something has been got away with so far, so it must be too safe (a simple statistical illustration of how little such incident-free experience actually proves is sketched below).

The SMS method is also vulnerable to a form of reverse-engineering in which the SMS designer, having already decided on a set of desired outcomes, contrives a process that apparently leads to an unbiased finding of the desired outcome. For example, managers who are required to provide metrics of their own performance will generally know which metrics will make them look good and which will make them look bad.

SMS are strongly promoted by regulators. The regulators stand to gain from the SMS approach, because the approach transfers some responsibility from the regulator to the airlines. This is potentially an important regulatory human factor. Regulators that mandate an explicit quantifiable level of safety are potentially liable if that level proves insufficient to prevent an accident. SMS can appeal to regulators because an SMS, as a blame management system, puts regulators at arm’s length from accidents. Further regulatory self-interest is met in so far as there may be an overall cost reduction to the regulator if the regulated bodies are moved towards taking ownership of more of the risk. In practice, the regulatory strategy for oversight can be to audit the airlines’ SMS. If this is taken to be a more process-based task, then the auditors can be administrative staff rather than more expensive technical staff. This is not to say that regulators should not seek the most economical method of regulating; rather, it is to argue that there is a potential vulnerability that this economic interest may compromise the quality of the regulatory practice.
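The statistical weakness of ‘we have got away with it so far’ can be shown with the standard rule of three: if an event has not occurred in n independent operations, the approximate 95% upper confidence bound on its per-operation probability is still about 3/n. The operation counts below are assumed purely for illustration.

```python
# How much does incident-free "operational experience" actually prove?
# Rule of three: zero events in n independent operations gives an approximate
# 95% upper confidence bound of 3/n on the per-operation event probability.
# The operation counts below are assumed purely for illustration.

for n_operations in (100, 1_000, 10_000):
    upper_bound = 3 / n_operations
    print(f"{n_operations:>6} incident-free operations: "
          f"true event rate could still be as high as ~1 in {round(n_operations / 3)}")

# Even 10,000 clean operations cannot distinguish "safe" from an event rate of
# roughly 1 in 3,000 -- far coarser than the accident rates the industry targets.
```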
Diminishing technical resources

Even established, legacy airlines are not immune to accidents. (Canadian Transportation Safety Board)
A potential disadvantage of a shift in the balance of administrative and technical capability is that the technical resource of the regulator, as an asset for the industry, may diminish, and the airlines may then have greater potential to mislead a less expert regulator. Additionally, SMS, if properly executed, may place less economic burden on the regulator and more on the industry. The vulnerability is that, if an operator is financially challenged, it may produce an ‘economical’ SMS that amounts to no more than a copy-and-paste of written material that talks the talk but does not walk the walk of any substantive safety practice.
The uncertainty of interpretation of regulation and the ‘system’ part of safety management can work together to belie the common sense that an SMS really is, and to turn it into something of such impenetrable techno-bureaucratic complexity that it becomes an area of specialisation requiring an expert. Airlines can outsource this expertise to a commercial SMS consultancy. In this regard, marketable features of such a product, such as the protection of the board (the customer) from blame and the claim that the SMS can allow a greater level of productivity for a given level of safety compliance, become potentially biasing factors that undermine the intent of the SMS.
A further disadvantage is the formation of commercial bandwagons. Here the vulnerability is that the commercial providers overemphasise the need for their service, such that safety resource is misappropriated within the industry because airline managers have been persuaded that their greatest risk lies in the area promoted by the commercial bandwagoneers.
Because the effectiveness of an SMS depends so much on the will of the operator, we can see how an SMS may make safe operators safer and other operators less safe. Conflicting interest is the fly in the ointment of SMS. The control of such conflicts is too often assumed to be sufficiently safeguarded by vague, easily coerced, aspirational factors such as ‘trust’ and ‘safety culture’. In general, not only might trust-based SMS not work if there are conflicting interests, they might make things much worse. If, instead of policing traffic speeds, we relied on drivers’ self-reports of their speeding violations, we might expect not only that drivers would fail to report their speeding but also that they would speed more often. SMS, if not sufficiently safeguarded against conflicting interest, can be a naïve approach that may undermine flight safety.