The denouement of the recent Hollywood blockbuster Sully is a courtroom drama with a difference. On trial is the eponymous Captain Chesley Sullenberger (Sully), based on the real pilot who saved the lives of all 155 passengers and crew by successfully landing his plane on the Hudson River in January 2009 after a bird strike disabled both engines.
Sully (played by Tom Hanks) faces an air crash investigation committee. But why was a man widely recognised as an all-American hero, whose quick thinking prevented a major airline disaster, facing punishment for his actions?
Every major aviation incident is followed by a full investigation, with the aim of fixing responsibility and making recommendations for prevention in the future. Research shows that in more than 80% of cases, human error by pilots is found to have played at least some part in the mishap.
This narrative of finding blame in the actions of individual pilots is portrayed quite well in Sully. But it has also been challenged in recent times by thinkers who argue we should look at the world as a series of complex systems.
In the film, the investigators argue that Sully had risked lives by not turning the aeroplane around and making a potentially safer and far more comfortable landing at LaGuardia airport. But Captain Sully destroys their arguments by asking the crash investigators to place themselves in the reality of the situation.
He successfully demonstrates that, given the stress under which he operated and the reaction time of the human brain in an emergency, there was no possibility that he could have glided the plane back to the airport. The computer simulations of the incident had failed to take these factors into account and so they, and not the human pilot, were wrong.
There has been controversy over the veracity of the story depicted by the film. Nevertheless, the director did a good job of popularising key concepts in air crash investigations that have now been recognised widely by complex systems theorists as basic flaws in the process. Research by academics such as Sidney Dekker, Erik Hollnagel and Jens Rasmussen has pointed to the inability of investigators to fully appreciate the conditions in which airline pilots take critical decisions during emergencies.
Nor do investigations account for the stress under which pilots operate and the pressure upon them to think creatively and deviate from standard procedures in order to save lives.
Research has also shown that crash investigators suffer from two kinds of biases that stem from judging an incident that has already occurred. The facts of the case can seem far simpler with the passage of time than they were to the pilot in the heat of the moment.
The “outcome bias” is the tendency of investigators to label a pilot’s actions as errors when there is a mishap, even though the same actions would have been passed off as normal had the flight ended safely. Similarly, the “hindsight bias” makes investigators attribute far more mental and physical capacity to pilots to handle an emergency than is humanly possible at the time.
The argument isn’t that pilots don’t make mistakes, or that their failure to follow procedure isn’t a factor in subsequent accidents. But in complex systems we should recognise that pilots involved in accidents aren’t necessarily bad apples. It is similar to the way we often blame crimes solely on individuals when there are also situational factors at work that encourage rule-breaking.
Pilots who are involved in mishaps are not necessarily bad at their jobs. Instead, they often draw on past experience to bypass rules, applying a rational (however flawed) decision-making process to an emergency that is a sudden, novel and stressful situation for them.
The way forward for air crash investigators is to study how airline pilots operate in normal working conditions. This approach is currently being considered in the study of standard operating procedures used by train drivers.
Investigators need to appreciate the stress pilots find themselves under, and how they regularly deviate from standard procedures to fly the plane safely in a dynamic work environment. A more realistic understanding of how pilots work will help develop more representative computer simulations and more effective recommendations to prevent accidents in the future.
An interesting question to raise at this point would be: what would have happened if the landing on the Hudson River had ended in disaster for Captain Sully? Probably the investigation would have concluded that he had enough time to return to the airport, and that his derring-do in going for the river had cost 155 lives. Clint Eastwood wouldn’t have made it into a movie, that’s for sure.