Transforming healthcare outcomes takes more than simulation & training


Written by Andy Smith, MTM publisher

Please pass this on to your colleagues with a safety, quality or risk job title, and anyone you happen to know in the C-suite!

Human Factors and Healthcare

A couple of weeks ago I was privileged to attend a meeting of experts in surgical simulation. It was of excellent quality and represented the best of healthcare; until the subject of learning from the aviation experience was raised, for the third or fourth time.

At that point an eminently qualified surgeon made the usual healthcare excuse for ignoring outside expertise, saying, “the human body is far more complex than an airplane.” I suspect that many of you have heard the same comment. It is simply designed to close the conversation. After all, we may learn something that is inconvenient for us; namely, that working safely requires an admission that we are all human, that humans do make errors and need support, and, above all, that we all need to change our ways.

Three days later I was also privileged to attend a CAE Onboard Healthcare event demonstrating how airline procedures and human factors training (CRM, or crew resource management) are used to mitigate the effects of human error in the airline sector.

After a study of the Tenerife air disaster of 1977, we were asked to work out what went wrong and why. We then had a short briefing, and, armed with a laminated reminder sheet, three non-pilot healthcare innocents were put under some potentially error-inducing stress in a Full Flight Simulator.

We were scheduled to land at San Francisco, but that airport was closed as we approached, requiring us to divert. While managing that, a cabin fire with injuries was announced, requiring us to divert again; yet despite us all being in a totally alien and new environment we flew and safely landed an Airbus A320 at LAX.

I doubt our Flight Instructors/Air Traffic Controllers were overly impressed with the process we took to get there, but it was close enough to the instruction we had received for us to manage the situation and get the aircraft and its imaginary passengers safely onto the tarmac.

We all made errors during the ‘flight’ but, acting as a true team, we overcame them. The key difference is that the airlines expect errors to be made (to err is human, it’s what we all do) and have designed a safety system to nullify their effect. Healthcare still relies on everyone getting it right all the time.

Small wonder, then, that the frequency of airline incidents is 1 per million whilst healthcare runs at 1 in 300.

Two additional things struck me at the time. First, when things require a decision to be made in the cockpit there is a procedure to ensure the decision is made using all the resources available, and the most junior member of the team speaks first. That is to ensure that the authority gradient is managed and more than one voice is heard, i.e. not only the Captain's/Surgeon's. Second, the looks of horror on the faces of the clinicians in the room when our two airline pilot instructors mentioned that all their actions and communication in the cockpit are recorded.

That, of course, is to protect them by proving that they did all that they were required to do (whatever the outcome) and had followed carefully thought-out and tested procedures. There is no equivalent in healthcare, but there easily could be, and should be, to protect operating room personnel should anything go wrong. If pilots perform ‘to the standard’ then they are legally protected. If not, then of course they are culpable.

The law, when it comes to healthcare, is currently a major barrier to improvement; it prevents promulgation of best practice and stops us from learning from each other.

Finally, some days ago, in another conference room during a superb briefing on IPE (Inter-Professional Education), James Reason’s Swiss Cheese Theory was raised. Only two of the audience of twenty-plus had heard of it. If you have not, please go here.

It lays out the role of procedures, and the training that supports them, in ensuring the ‘holes in the Swiss cheese’ do not align. As you may have guessed, the holes represent potential errors during a complex procedure, and if we allow them to line up the end result is catastrophe. This too was featured in our pilots’ briefing.

Recently I visited the Clinical Human Factors Group (CHFG) website, something I have been meaning to do since my last visit in December.

CHFG is a U.K. organization formed by airline Captain Martin Bromiley after the avoidable death of his wife Elaine during routine surgery. I had seen the video at the bottom of that page before, but it tied my experiences of the last weeks together.

To close, two other pieces of airline folklore that are regularly discussed when we all meet. Both were mentioned during the surgical meeting; the first verbatim, the other by inference on the role of surgeons vis-à-vis pilots.

“If you think training is expensive, try paying for an accident.”

“There are old pilots and there are bold pilots, but there are no old bold pilots.”
