Aviation safety: Is ‘human error’ a symptom of system failure, not the cause?

Introduction – Human Factors and Human Error

Dan Maurino’s statement that ‘Human error is the symptom of system failure, not the cause’, whilst simplistic and succinct in its nature, conveys a pervasively strong meaning in relation to Human Factors. In trying to understand why humans react the way they do and why accidents occur, it has been all too common in the past to assume pre-emptively that the cause is solely attributable to human error. From this perspective alone, one risks losing sight of the fact that human error cannot be analysed and addressed in isolation; it is important to take into consideration the other factors which contribute to accidents occurring. It is the ‘system’ Maurino refers to which plays a key role in coming to terms with these issues.

At the core of Human Factors and human error is the issue of human fallibility. Braithwaite (2007) states that ‘humans make errors; it is not enough to simply ban them from doing so’. It is important to realise that humans will make mistakes and that this is inevitable. Human Factors allows one to understand how humans cope and react in specific situations, and it reveals the limitations of human capability. An awareness of this allows organisations and systems designers, for example, to create environments and allocate resources in ways that reduce the potential for errors.

The purpose of this paper is to argue that treating human error as the sole cause of an accident limits the understanding of Human Factors. It is important to address the actual ‘system’, which can provide critical indicators in understanding why accidents occur. Dekker (2002) also evaluates Maurino’s statement and elaborates that in the past systems have been considered safe and human error has been blamed for most accidents; he then presents a new perspective in which it is the system that should be considered unsafe, and human errors are ‘symptoms of contradictions, pressures and resource limitations deeper inside the system’. Dekker and Maurino thus provide a new form of thinking: the starting point should be to address what failures exist in the system. An understanding of the system, its function and its application is therefore essential.

 

Understanding the ‘system’ and its implications on Human Factors

The system Maurino alludes to refers, in a broad sense, to the environment in which we live and work. The environment itself can be further divided into the natural environment and the work environment. It is therefore important, when studying human error, to place the ‘human’ in the environment in order to gain an understanding of how we interact and function. By addressing the deficiencies that exist in the system in relation to how we operate within it, we can hope to achieve a more concrete understanding of how we function. This understanding in turn increases the potential to minimise human error.

The study of Human Factors allows one to understand how humans react and behave in specific environments and workplaces. In aviation this takes on an even more critical role. With safety as the objective, and the aim of minimising the potential for accidents and risks, an understanding of how humans behave and react aids on both an operational and an organisational level.

The operational efficiency and performance of an organisation can be largely attributed to how it treats the issues of safety and Human Factors. Organisations should place importance on taking Human Factors into consideration when designing a work environment for their employees. In the example of an airline, where Human Factors are of critical importance, this mindset should permeate the whole of the airline’s organisational and operational structure. Adopting strategies which have safety and the effective management of risk as their core focus will be a major contributor to a carrier’s overall performance and will minimise the potential for accidents.

Unfamiliarity with the operating environment can also contribute to system failures. This is clearly highlighted in the Mt Erebus crash of an Air New Zealand DC-10 in 1979 (NZ History.net). Apart from the pilots’ lack of familiarity with Antarctica, they also experienced ‘white out’, which caused further disorientation in their environment. An analysis in this light, while acknowledging that the pilots’ lack of familiarity with flying in Antarctica contributed to the accident, also highlights a major organisational failure and deficiency. At an organisational level, Air New Zealand failed to adequately train and familiarise the crew with operations in unfamiliar conditions such as Antarctica, and failed to address the issue of ‘white out’ in such operating conditions.

Automation and technology and their limitations for the human operator

At an operational level it is important to create environments and products which enable the user to perform the required tasks at their optimum level, free from factors which limit their capabilities; for example, by avoiding unnecessary obstacles and stress for the human operator. This approach recognises that human error results from organisational or workplace conditions that give rise to such errors. Breakdowns in safety, be they major or minor, should be viewed as indicators of the overall ‘safety health’ of an organisation rather than as the failure of individuals.

In relation to automation, it should be viewed not only as facilitating the human element, but also as a means of minimising the possibility of errors. Wiener (1989) warns of the danger of ‘a design in which the benefits of the automation occur during light workload times and the burdens associated with automation occur at periods of peak workload or during safety- or time-critical operations’. Environments, systems and workplaces whose automation and technology do not take the interaction of Human Factors into consideration have the potential to nurture and foster unsafe practices and procedures.

It is equally important to consider the technical aptitude and ability of the human when designing systems, procedures and products with a high reliance on automation and technology. Airlines have responded by placing increased emphasis on technical ability when selecting pilots (Helmreich, 1997). It is of no benefit to implement these technologies if the end user lacks the ability to utilise them.

The Aeroflot Airbus A310 crash in 1994 highlights what happens when there is a lack of understanding of automation and its operational function. Aside from the fact that the pilot’s children should not have been on the flight deck, one of the contributing factors of the crash was the flight crew’s lack of familiarity with the A310’s automation. The technology and procedures of the A310 were largely incongruous with those of the Soviet aircraft the crew were accustomed to. The cause cannot simply and conveniently be attributed to human/pilot error; rather, it is a consequence of many factors, particularly at an organisational level, since the crew were not adequately trained on stall procedures for the A310. The ICAO Human Factors Training Manual (1998) states that the ‘human must be informed about the conduct of the operation’ so as to make the right operational decisions.

Sarter and Woods (1994/1995) suggest that design-related factors may contribute to difficulties which can affect pilots’ situation awareness. In the case of the Aeroflot accident, the crew were unaware of key design differences between Western-built and Soviet-built aircraft and their implications in an emergency situation.

 Design and Human Factors

In designing or enhancing products, designers may at times lose focus on the ‘end user’. It may prove pointless to design a highly sophisticated and technologically advanced product if the human is unable to adequately understand or correctly operate it. It is important to comprehend the limitations of human capability and to design products and systems commensurate with those limitations. Sarter and Woods (1994/1995) elaborate on this issue, using cockpit design as an example, arguing that Human Factors are overlooked and not properly addressed: a design may be optimum from an engineering perspective yet operationally unsatisfactory from an ‘end user’ perspective.

Chircu and Kauffman (2001), on the other hand, suggest that technology and automation themselves have limitations and require the input of the human operator. They use the example of an airline reservation system to show that such systems may ultimately reach their limits and at some stage require ‘human’ intervention in order to effectively process and manage the booking process.

The challenge for designers is to strike a balance in their design between automation and how the human will operate it. The level of automation needs to be commensurate with the technological aptitude of the user, and designed in such a manner as to minimise the potential for accidents occurring as a consequence of human error in its use.

 Aircraft accident investigations and human factors

A key function of the aircraft accident investigator is to understand why humans behave the way they do. An investigator needs to move beyond just the evidence at hand and take into consideration the interactions and complexities which played a part in the accident. The investigator should use the evidence as a starting platform and examine the reason(s) why the accident occurred through the perspective of Human Factors and its functions. These functions need to be addressed with an eye on automation and technology, combined with an understanding of the limits of human capability when confronted with these technologies.

The investigation of the de Havilland Comet crashes (Stewart 1986/1989) highlights how a lack of understanding of new technologies can have fatal consequences. With the new technology of the pressurised cabin, it was not appreciated that design modifications to the windows would need to be made. The stresses concentrated at the corners of the sharp, ‘square-like’ windows were not compatible with the pressurised cabin, compromising the integrity of the aircraft’s ‘skin’. Aircraft which appeared at a similar time, such as the 707 and DC-8, featured oval-shaped windows.

The ICAO Human Factors Training Manual (1998) states that a large number of accidents result from deficiencies in the application of Human Factors. The data and evidence obtained from an accident investigation may therefore prove valuable in understanding these deficiencies, and should be considered important and relevant to the aircraft accident investigator.

  Determinations and Conclusion

The evidence from many accidents suggests that human error has frequently been identified as the causal factor; however, the research points to the cause being a failure in the environment in which the human operates, such as an inability to use a technological application (ICAO, 1998). This clearly highlights that the investigator needs to be able to effectively distinguish between Human Factors and what is considered human error. Such errors can be attributed to failures in communication, teamwork and decision making rather than to technical shortcomings (Helmreich, 1997).

One is now in a position to appreciate that Human Factors has a significant dependence on systems, technological functions, organisational structure and the general environment in which they exist. These features do not function in isolation; rather, there is a level of co-dependency in their nature and within the context of Human Factors. They have a great influence on Human Factors, and the investigator needs to take them into consideration when reviewing and analysing the evidence.

Returning to Dan Maurino’s statement, it is now evident that the causes of human error are complex. In trying to gain an understanding of these causes, it is imperative that they are studied and analysed within the context of the ‘system’.

 

 
