“Ladies and gentlemen, this is your captain speaking. I doubt any of you noticed, but when we departed a few moments ago, we actually started the take-off from the wrong point on the runway. It was a twitchy moment for us both up here, but we’re now back on track.”
If a pilot were required to inform you of near-miss events, do you think it would instil more confidence in passengers? Probably not. Yet the government is to introduce legislation forcing doctors and nurses in the NHS to do just this, under a new “duty of candour” brought in following the Mid-Staffordshire hospital scandal and modelled on the airline industry.
The duty would require staff to disclose information about where poor care has resulted in death or serious injury. The government also announced further consultation over whether lack of disclosure could lead to legal action.
Whistle-blowing is a sore point for the NHS, not only because of recent scandals like Mid-Staffs, but also because of claims of victimisation against those who have reported safety concerns. More transparency is welcome, but a willingness to report depends as much on the culture within the NHS as on any legal duty.
In addition, a new criminal offence of “wilful neglect”, under which doctors and nurses could face up to five years in prison, could deter staff from reporting what they see as a colleague’s honest mistake, for fear it could be treated as neglect.
Improving safety, aviation style
Health Secretary Jeremy Hunt wants to model this approach on the airline industry, which he also name-checked in June. Accidents in the airline industry were thoroughly investigated and treated as things that should never happen, he said.
It’s disappointing that Hunt hasn’t taken the opportunity to go further by leap-frogging aviation to create a state of the art safety culture for the NHS. Aviation safety is good, but certainly not perfect and is itself held back by the same threat of legal action that the government is now also proposing for the NHS.
Commercial aviation in the UK has an excellent safety record. This doesn’t happen by accident. Far from it. Airlines go to huge efforts to ensure their passengers arrive safely and are all too aware that there’s no room for complacency.
Safety has improved vastly over the years by learning from accidents, which were far more frequent in the 60s, 70s and 80s than they are today. Each accident was an opportunity to make the aviation system safer, but with fewer accidents today, how do we keep learning?
The answer is that we look at near-misses and the smaller things that go wrong. In some cases near-misses are investigated as thoroughly as a catastrophic accident would be. The aim is to wring out every last drop of information that could help us prevent bigger problems.
In many cases we simply rely on people telling us when these things have happened. Take the example of the two pilots above: an investigation may recommend airport signs are improved, maps are drawn more clearly and, perhaps most importantly, other pilots are warned of the hazard.
Whether the pilots choose to file a report is very much dependent on the culture within the airline, even though reporting might be compulsory. If pilots think they’ll be sacked, demoted or even prosecuted, they probably won’t. The airline industry has gone to great lengths to create a culture where people feel comfortable reporting their own errors, but disclaimers that say “the company does not condone wilful misconduct or acts involving gross negligence and reserves the right to take action in these cases” hold the industry back.
Damned if they do, damned if they don’t
Most people go to work to do a good job and that includes pilots, nurses and doctors. Sabotage is rare. But everyone makes mistakes and it’s likely that the ones we hear about are just the tip of the iceberg. Sadly, the potential to learn from all these mistakes is curtailed by that threat of consequences lurking in the background.
The next big step change in safety will come about if we can bring ourselves to remove that threat. It might seem unpalatable to let people off when they have performed way below expectations, but it might just unlock a treasure trove of information that we can learn from to make things safer. This applies not just to aviation, but also the NHS and any other organisation involved in activities where safety is critical. Let the existing criminal system deal with those extreme acts of sabotage, manslaughter or bodily harm.
The label of “neglect” will be all too easy to apply with the benefit of hindsight. When you are looking back at an event that you know had a bad outcome, your judgement of people’s actions in the run-up is coloured by the fact you know the outcome was bad. So it can be irresistible to label an act as neglectful, rather than digging deeper and trying to understand in more detail why the event happened. Were they trained properly? Were enough staff on duty? Were they well rested? Did they have the right equipment? Are we recruiting the right people?
It’s from questions like these that we can learn the most. Prosecuting people might satisfy our desire for vengeance, but it often limits our potential to learn and improve safety. Here was an opportunity to craft a culture where all hazards, mistakes and errors are reported without fear of recrimination; an opportunity to turn the NHS into an organisation that learns from all of its mistakes. Surely that would be in everyone’s best interest.