Abstract
Automation surprise (AS) has often been associated with aviation safety incidents. Although numerous laboratory studies have been conducted, few data are available from routine flight operations. A survey among a representative sample of 200 Dutch airline pilots was used to determine the prevalence of AS and the severity of its consequences, and to test some of the factors leading to AS. Results show that AS is a relatively widespread phenomenon, occurring on average three times per year per pilot, but one that rarely has serious consequences. In fewer than 10% of the AS cases that were reviewed, an undesired aircraft state was induced. Reportable occurrences are estimated to occur only once every 1–3 years per pilot. Factors associated with a higher prevalence of AS include less flying experience, increasing complexity of the flight control mode, and flight duty periods of over 8 hr. It is concluded that AS is a manifestation of system and interface complexity rather than of cognitive errors.