Black Box Thinking: How Aviation and Healthcare Handle Mistakes

Are mistakes and failures really covered up in healthcare?

I just read the book Black Box Thinking (Matthew Syed, 2015) on the recommendation of a friend, and it has me thinking a lot about the differences between healthcare and aviation when it comes to safety culture.

The premise of the book is that there is a fundamental difference in attitude towards failure between aviation and healthcare: aviation has a culture that embraces failure as an opportunity to learn, whereas healthcare has a culture of covering up failure and thus does not learn from its mistakes.

I come from aviation, so I am quite comfortable evaluating the merits of the author’s analysis and commentary on what we do in aviation, but not so much with healthcare. So I began to wonder, are healthcare and aviation still on opposite ends of the safety spectrum when it comes to admitting mistakes and embracing failure as an opportunity to learn and improve?

For the record, it’s more a question of time than a challenge to the author’s credibility. I assume he had good intentions and was not dramatizing the situation, but the book was published in 2015 and a lot can change in a decade.

The book posits that healthcare has a culture that hides and covers up mistakes to avoid litigation and to spare authoritative doctors embarrassment and humiliation, and thus prevents learning from failure. Aviation, by contrast, thoroughly investigates failures and publicly shares any lessons learned so the rest of the industry benefits from the new knowledge gleaned.

A few key points from the book:

  • Perhaps the difference in behavior between pilots and doctors can be explained by a difference in motivation – pilots’ mistakes kill them, doctors’ mistakes kill someone else.

  • Is the difference due to differences in the rate of information adoption? According to the book, it is near real-time in aviation but can take years in medicine.

  • Is it the format of the information? Aviation benefits from standardized, distilled accident investigation reports, versus the dense and verbose medical journals of healthcare.

  • The autopsy is the medical equivalent of an aircraft’s black box, but it is not systematically and universally used. In fact, at the time of publication, autopsies were performed less than 10% of the time in the US. The book attributed this to doctors’ attitude towards failure – i.e., why look for a mistake?

In Part 2: Cognitive Dissonance, Syed explores other industries and their cultural norms regarding failure.

It was enlightening to consider the legal system. Do we use wrongful convictions as an opportunity to learn? Are police or prosecutors likely to be open to the possibility that they were wrong and put an innocent person in jail? The brief history of using DNA testing to overturn wrongful convictions was certainly eye-opening as to how resistant an entire ecosystem can be to admitting it was wrong.

What about politics, economics, and business? Syed observes that when the first court of appeals was established in the UK in the 19th century, the strongest opponents were judges. 😲. He points to the work of social psychologist Leon Festinger on how end-of-world conspiracy theorists have a habit of shifting the dates when doomsday fails to arrive on time. And while some businesses have rapidly adopted randomized controlled trials to test their hypotheses, many politicians resist similar efforts to validate policy changes.

A final note for the football fans out there – you can look forward to a rather extensive exploration of the talents of David Beckham and how his approach to failure was the main driver of his career success.
