Who’s at Fault When It Fails?
This year, a Tennessee nurse was put on trial for reckless homicide in a widely publicized case involving a fatal medication error. The nurse admitted to the error, which she had immediately reported to the hospital. But there was more to this story, and it is an object lesson in accountability that highlights the problems created by the increasing reliance on technology in patient care.
The nurse in question was using an automated dispensing cabinet (ADC) programmed to dispense medications accurately. When the nurse searched for the medication by typing the first letters of its brand name, the cabinet did not offer the correct one. Then, as was reportedly common practice to work around what was recognized as a frequent issue, the nurse performed an override and selected the wrong medication. What the nurse did not realize was that the medication in question could be found only under its generic name. While human error was involved, technology limitations compounded the problem.
In another instance also involving an ADC, a hospital pharmacist was interrupted while entering a medication order to be administered that evening. The software was designed to produce a list of matches based on the first several letters of the medication’s name, and a slip of the cursor was all it took to order the wrong one. Although the correct medication was indicated in the patient’s record, the floor nurse deferred to the information in the dispensing cabinet, with negative health consequences for the patient.
In these times of healthcare challenges and overworked staff, the idea of automating certain functions makes sense. In theory, technology should reduce the chance of human error. But ultimately, software is designed and written by humans, and technology itself is not infallible. As one article from the National Library of Medicine puts it, “Information technology that supports clinical decision-making doesn’t replace human activity but changes it, often in unintended or unanticipated ways. Instances of misuse and disuse – often to work around technology issues – and new sources of errors after technology implementation have been well-documented.”
Compounding the issue of the technology itself are the factors of “automation bias” and “automation complacency.” Automation bias was demonstrated by the nurse who gave more credibility to the digitally entered information from the dispensing cabinet than to the handwritten patient medication record. Automation complacency occurs when we trust a technology solution so completely that we no longer feel the need to question or vigilantly monitor what it is telling us.
But when mistakes involving technology happen, who bears responsibility? In the case of the Tennessee nurse tried for reckless homicide, the nurse alone was held accountable, although investigators determined the hospital also bore significant responsibility. Delays and problems with the medication cabinets were reportedly prevalent at the hospital. As one nurse stated, “Overriding was something we did as part of our practice every day.” And if it had not been for an anonymous tip to the Centers for Medicare and Medicaid Services and the Tennessee Department of Health, the true cause of the patient’s death might never have been revealed: the hospital failed to report the error and listed the patient’s cause of death as “natural.”
Greater awareness and accountability must accompany the healthcare industry’s increasing reliance on technology to meet the challenges of patient care. At the heart of any system are the humans who design and use it, and it is crucial that we learn from events like these to make medical care safer and more effective for all. If you feel you or a loved one has been negatively impacted by a medical error, contact us.