Lately I’ve been on a kick of nurturing my inner Canadian by reading a few books by notable figures from the Great White North. The first was the latest autobiography of Dr. David Suzuki (any respectable Canadian should need no explanation of who he is), and the second was An Astronaut’s Guide to Life on Earth by Col. Chris Hadfield, about his journey to becoming the first Canadian to perform a spacewalk and the first Canadian to command the International Space Station. During his time as commander, he used something we’re all familiar with in the FOAMed universe, Twitter and other forms of social media, to engage others’ interest in space exploration.
While it was, no doubt, an inspiring read, one particular point of his astronaut training at NASA stood out to me as a pharmacist. To understand that point, you have to look at how adverse drug events, medication errors and other medical errors are reported in hospitals. If you’ve ever sat in on, or been a member of, a pharmacy and therapeutics committee or a medication safety committee, you’ll be familiar with the analogy between the open reporting of errors in the aviation industry and ADR/med error reporting within a hospital. The thought is that we (as a hospital) should encourage all errors and near misses to be reported in a non-punitive fashion, in order to identify sentinel events before they happen and, in the event of a serious error, to identify the failure in the system rather than point fingers at individuals. While this system is nice in theory, in health care it is vastly over-simplified and doesn’t work as well as the prototypical aviation-industry version. It is also inherently flawed: you have to wait for an error or near miss to occur before action is taken.
Consider now how an astronaut trains (if you need to cue up the Apollo 13 scene where Tom Hanks is sitting in a simulated Lunar Module, be my guest). Astronauts spend months, if not years, simulating every task they are likely to perform while in space. Importantly, they don’t just simulate the standard operating procedure (and do so repeatedly until they can perform the task flawlessly); they simulate such procedures in every conceivable situation/condition/malfunctioning equipment/pending alien invasion/etc. Again they sim and sim and sim and sim until they get it perfect. Col. Hadfield used the example of a situation where, if x happened during launch, he had 5 seconds to determine whether to abort and save the crew’s lives or continue the mission. That’s 5 seconds to interpret a given alarm or data point, consider all options and execute the necessary procedure. There is no way this could be done if you hadn’t already simulated that (or a similar) scenario.
Now back to pharmacy. Most hospitals in the USA employ some sort of computerized physician order entry system, as well as various computerized charting and documentation systems for nurses and pharmacists. However, these systems are often fraught with incomplete order screens, where the potential for error increases dramatically. But rather than anticipating errors and bugs and perfecting the system before implementation, we’re constantly playing pretend aviators, trying to catch up after an error occurs. It is certainly a tremendous amount of work to perform an up-front check, but it is inherently our job and responsibility to prevent errors from occurring and to avoid setting others up for such errors.
It may be an ideological dream, but certainly an astronaut’s guide to life on earth makes more sense than our current practice.