The Overdose - Harm in a Wired Hospital
April 3, 2015 7:24 AM   Subscribe


The nurses and doctors summoned to the hospital room of 16-year-old Pablo Garcia early on the morning of July 27, 2013, knew something was terribly wrong. Just past midnight, Pablo had complained of numbness and tingling all over his body. Two hours later, the tingling had grown worse.

A five-part series from Backchannel at Medium.

Part One - How Medical Tech Gave a Patient a Massive Overdose
Pablo Garcia went to the hospital feeling fine. Then the hospital made him very sick.

Part Two - Beware of the Robot Pharmacist
In tech-driven medicine, alerts are so common that doctors and pharmacists learn to ignore them — at the patient’s risk.

Part Three - Why Clinicians Let Their Computers Make Mistakes
We tend to trust our computers a lot. Perhaps too much, as one hospital nurse learned the hard way.

Part Four - Should Hospitals Be More Like Airplanes?
“Alarm fatigue” at Pablo Garcia’s hospital sent him into a medical crisis. The aviation industry has faced the same problem—and solved it.

Part Five - How to Make Hospital Tech Much, Much Safer
We identified the root causes of Pablo Garcia’s 39-fold overdose — and ways to avoid them next time.

Excerpted from The Digital Doctor: Hope, Hype, and Harm at the Dawn of Medicine’s Computer Age, by Robert Wachter. McGraw-Hill, 2015.
posted by ellieBOA (54 comments total) 37 users marked this as a favorite
 
Cool. I teach a medication errors discussion. I can add these in.
posted by dances_with_sneetches at 7:27 AM on April 3, 2015 [5 favorites]


Really fantastic article. I'm surprised about the default to mg/kg in the ordering system though. Shouldn't the choice between mg and mg/kg be deliberate, each and every time?


Since doses can be ordered in either milligrams or milligrams per kilogram, the computer program needs to decide which one to use as the default setting. (Of course, it could leave the unit [mg versus mg/kg] box blank, forcing the doctor to make a choice every time, which would actually require that the physician stop and think about it, but few systems do that because of the large number of additional clicks it would generate.)


Yes, it would be annoying, but it's that one extra step that would force the person placing the order to think, which is one of the problems of automation the article goes on to discuss: when a process is highly automated, we stop thinking.
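To make it concrete, here's a toy sketch of what "no default unit" could look like in the order-entry code. Everything here (names, structure) is invented for illustration and isn't modeled on any real EHR:

```python
# Hypothetical order-entry check that refuses to accept a dose until the
# prescriber deliberately picks a unit. A blank unit is an error, not a
# default -- the extra click is the point.

VALID_UNITS = {"mg", "mg/kg"}

def validate_order(dose, unit):
    """Reject orders whose dose unit was never explicitly chosen."""
    if unit is None:
        raise ValueError("No dose unit selected: choose 'mg' or 'mg/kg'")
    if unit not in VALID_UNITS:
        raise ValueError(f"Unknown dose unit: {unit!r}")
    return (dose, unit)
```

So an order saved with the unit box untouched would bounce back to the physician instead of silently inheriting mg/kg.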
posted by longdaysjourney at 7:43 AM on April 3, 2015 [1 favorite]


TLDR: someone left mg/kg selected instead of mg(total), various automated alerts were then ignored because too many alerts all the time everywhere. Interesting and important to discuss how to avoid similar things but they seem to have padded it out somewhat to make it a 5 part series.
posted by memebake at 7:48 AM on April 3, 2015 [11 favorites]


If it weren't for those pesky patient privacy laws, Epic would be very well situated to gather and analyze the data necessary to make bang-on alerts.
posted by Jpfed at 7:51 AM on April 3, 2015


There's nothing in the privacy laws to prevent Epic from analyzing de-identified data. The problem is that since the system is somewhat customizable, every customer's data would be vastly different.

The problem in this case is the nurse. I worked in pharmacy both pre and post computerization (pharmacies were among the first departments to be computerized, long before EMRs) and there was a hard rule taught to nurses and med techs - never give a ten-fold (or more) dose without questioning it. Pediatrics and research make it a bit trickier, yes.

This is similar to the Ebola story a while ago, when the Ebola patient was discharged without being asked about travel to Africa. The hospital tried to blame it on Epic not being configured correctly, but really, did that excuse the doctor for not *looking* for the information, which the nurse had entered correctly (Epic did not place the information on the *front page*)?
posted by diane47 at 8:11 AM on April 3, 2015 [3 favorites]


Reading this is giving me traumatic flashbacks to when a nurse increased my wife's pitocin drip from 14 to 76 (instead of 16), and my wife had like one long, continuous contraction until I ran to get the nurse and they stepped the dosage back to a sane amount, and then the hospital staff tried to blame the whole thing on our doula's cell phone.
posted by rbellon at 8:34 AM on April 3, 2015 [8 favorites]


The problem is that since the system is somewhat customizable, every customer's data would be vastly different.

Admittedly, I haven't worked at Epic for years, but the codebase is so big that the underlying data store probably hasn't changed much. As far as I know, the particular data I'm thinking of doesn't vary that much in structure from one installation to the next.
posted by Jpfed at 8:38 AM on April 3, 2015 [1 favorite]


Gah, the padding that makes this a five parter with cliffhangers.

The problem is: the ordering nurse, the pharmacist, the software, the robot, and the tech.

Or it's the system. Here we have every human in the sequence operating under load, multitasking like a mofo, and in the end, stupid shit happens because every piece of this exquisite machine has been optimized for efficiency rather than safety, and every human has little to no time to understand context and be inquisitive so that they can spot the needle in a haystack.

But that hospital's never going to reduce each nurse, doctor, tech, and pharmacist's load by 10%, because not only would that cost money, but by costing money it would itself lead to patient harm: fewer patients treated, or less money to spend on other areas.

It's a complex problem.
posted by zippy at 8:43 AM on April 3, 2015 [6 favorites]


I was thinking specifically about the customizable alert settings and "alert fatigue".
posted by diane47 at 8:43 AM on April 3, 2015


I really have a hard time believing that even an untrained nurse could administer 38 pills to somebody without wondering about it.

Yes, pharmacy robot and automation, but 38 tablets is unusual. This seems much more likely to be a neglect issue passed off as an automation issue.

Article is unnecessarily long-winded too - a 5 part story that could be summed up in a few paragraphs? Why?

TL;DR - medication software has bad UI, with the same screen for assigning meds by total dose or by mg-per-kg dose. Lady enters it in mg per kg when it's actually total dose, but the pharmacy robot dishes it out anyway because of poor automation.

Nurse and patient get 38 pills instead of one, yet still eat them. The end.
posted by GreyboxHero at 8:43 AM on April 3, 2015 [5 favorites]


Complex system failure is always super interesting. My favorite learning is that you can either place blame OR gain useful knowledge about how to prevent future failures, but you can't do both. Of course, setting up a culture of acknowledging failures and desiring to improve the system instead of pointing blame is probably a complex system in itself.

Anyhoo, if you like this stuff, I recommend reading about nuclear criticality events. Also super fascinating, and usually involves lots of ignoring frequent alarms, too.
posted by Phredward at 8:54 AM on April 3, 2015 [13 favorites]


Computers are pedantic little bastards. They tend to do what you tell them.

This is interesting for the intersection of humanity and our blind spots. There are, roughly, 100k deaths a year where medical errors are at least contributing (at 100M procedures per year). That's about 100 times more than air traffic crashes for all causes (at 735M passengers per year). Perhaps not a great comparison, but the author started it.

So what's the difference? Every air traffic crash gets reported. Every crash gets investigated by central authorities. Those authorities can issue rules that bind all practitioners. Oh, and the practitioner is likely to die if they crash.

Given this, I recommend requiring investigation into every medical death. By a central authority with legal power to investigate, rule, and enforce. Patients should explode on death. And medical professionals should be shot when they kill someone.

Okay, okay, more seriously. Everyone in these articles gets caught up in "How do we fix our process?", when they should be asking "How do we fix our culture?". Having a pair of nurses instead of one would limit the likelihood of this type of overdose occurring. Doubling the staff of the pharmacy would help, too. Likewise, doubling the number of doctors available, and forbidding residents from being awake for 48 hours at a time.

Trying to solve this by saying "We're going to fix the computer system!" is silly. What I'm hearing is "We like how much money we make right now, how privileged doctors are, how much administrators make, so we're blaming the electronics."
posted by underflow at 9:02 AM on April 3, 2015 [20 favorites]


I'm just a baby nurse, but goddamn would I ABSOLUTELY stop and call the MD if I saw an order for 38.5 of the same pill. Particularly since that would mean the dose adds up to OVER 6 GRAMS of meds. That is simply an absurd number.

Not to mention that simply scanning in each pill, then giving them to the patient would be a waste of my time. Even if it were otherwise correct (which it would not be, in any situation) I might call the MD just to ask them why they were writing orders designed to annoy the nursing staff.
posted by Panjandrum at 9:11 AM on April 3, 2015 [8 favorites]


Of roughly 350,000 medication orders per month, pharmacists were receiving pop-up alerts on nearly half of them. Yes, you read that right: nearly half. The physicians were alerted less frequently — in the course of a month, they received only 17,000 alerts.

Okay, I just brought up at a staff meeting how ridiculous it is that I got over 3000 pages in January. And I monitor computers, not people.
posted by pwnguin at 9:14 AM on April 3, 2015 [2 favorites]


In some respects, process is culture. The article references Toyota's process multiple times, explicitly with their "Stop the Line" ethos that supports people who do that when they "think it isn't right". Yes, the computer can only do so much, but alarm fatigue is a very real thing and there is a UI design component that does need to be addressed.

The thing I found interesting with the comparison to the airline industry was the call for better integration of technology with the human process. And that the medical industry, which really should be at the forefront of "safety focused" industries, apparently has a blind spot where it doesn't necessarily promote a culture that's as safety-focused as it should be.

The damning thing to me was that the attending nurse didn't feel like she could double-check with someone on the unfamiliar floor without looking like an idiot. That's the cultural problem that absolutely needs to be addressed. Of all people, medical professionals should be the ones not afraid to ask questions.

The case, at least to me, is interesting because it sits at an intersection of edge cases that seems implausible, but when you take every step into account individually (and I'm apparently in the minority, because I liked the in-depth "padding" that went into detail about each step), suddenly it becomes "oh, yes, I can see how that minor thing happened", and then when these propagate down the line, you end up with a catastrophe.
posted by ultranos at 9:16 AM on April 3, 2015 [17 favorites]


zippy: But that hospital's never going to reduce each nurse, doctor, tech, and pharmacist's load by 10%, because not only would that cost money, but by costing money it would itself lead to patient harm: fewer patients treated, or less money to spend on other areas.

This is a good point - you have to balance this (horrible) incident against the (presumably real) benefits that the automation brings in terms of number of patients treated, speed of treatment, etc. Whatever the system, mistakes can happen; trying to eliminate every possible mistake will just result in ever more complex systems. In this case, perhaps changes should be made, or perhaps you could conclude that various safeguards (especially the final nurse) failed for reasons that may never occur again.

I'd argue that our media/political systems tend towards a noisy strategy of 'change the system after every bad incident to make sure said bad incident can't happen again' ... but that isn't necessarily the right way to improve things. Sometimes you have freak accidents/occurrences.
posted by memebake at 9:20 AM on April 3, 2015 [3 favorites]


The thing about the pop-ups: it's lazy design, and it doesn't take the user's context into consideration. On the other hand, you can design something to death, but you can't change a very old and deeply ingrained culture via a UI. I know we like to just throw tech around as a solution to everything, but this is one thing you can't just design & automate away. The culture of medicine needs to change, and the culture of penny-pinching and piling inhuman quantities of work onto a few people needs to change.
posted by bleep at 9:24 AM on April 3, 2015 [2 favorites]


The article specifically addresses organizational culture. The reason the doctors and nurses don't "own up" to their mistakes-- or at least not in this article-- is that the article is about how humans gonna human and the best way to manage this isn't to introduce computers, or hunt down spurious neglect, but to reduce systemic oversights. To reduce places where the holes in the swiss cheese line up. It's a pragmatic solution. It doesn't "make sense" in the sense that we want someone to blame for this ridiculous mistake, but it makes sense when our goal is to reduce patient harm. People are always going to make dumb oversights, fail to ask questions, and doubt themselves. By analyzing the system and editing the culture we can try to nudge people into more constructive responses.

The motivations of the nurse are clearly outlined-- she's used to giving unusual doses on a research floor, the barcode approved the dosage, she didn't want to bother other nurses whose performance and non-neglect depend on not being interrupted, there were no doctors around, the patient seemed to think it was all OK, and the organizational culture didn't encourage a "Stop the Line" response to a possible error. Plus buried deep in the miasma of false positive alerts and monotony of their alertiness and an environment that trusts powerfully in expensive technology, there was no signal that anything was wrong, and powerful computerized signals that everything was right. Firing or blaming the nurse wouldn't particularly help, because people are fallible and sometimes trust the wrong cues. But changing the cues could.
posted by stoneandstar at 9:31 AM on April 3, 2015 [13 favorites]


Seems to me that the main system failure here was allowing a highly unusual prescription without requiring the doctor to explain herself.

With bad dosages caused by messy handwriting, at least someone has access to the doctor's handwriting and can see that yeah, that 72 mg might instead be a 12.

Here, all indicators of ambiguity and oversight were stripped away (e.g. the fact that there had been a dosage correction, and that 2 different people dismissed alerts related to the new dosage, without even scrolling through the alert text to read it.)

The great thing about computers is that they can store a lot of information and make it easily available for retrieval later. They don't necessarily need to pop up immediate alerts for everything. Reminds me of an overeager child who hasn't yet learned that sometimes the best way to get what you want is to chill out a bit. Time your requests wisely. Or in this system's case, save all the facts and make them available to future interested parties. If the doctor wasn't so busy dismissing alerts and grappling with a confusing computer system, she'd be more available to do a final review of medication alerts prior to their administration to the patient.

Even if the hospital has since made some marginal improvements, it sounds like this Epic system is far too sterile, in the worst sense of the word.
posted by mantecol at 9:33 AM on April 3, 2015 [1 favorite]


At around noon on a cool July day in San Francisco

Whoever teaches journalists to write irrelevant crap like this, please stop.
posted by ethnomethodologist at 9:35 AM on April 3, 2015 [6 favorites]


@zippy: "Here we have every human in the sequence operating under load, multitasking like a mofo, and in the end, stupid shit happens because every piece of this exquisite machine has been optimized for efficiency rather than safety"

I didn't get that impression at all. It mentions several times in the article that the computerized system has made dosing and delivery safer because it adds additional checks into the process (which have been useful). How do you conclude safety was sacrificed for efficiency?

What shocked me about the article is the overwhelming number of alerts. That's crazy, and whatever committee approved that software design should be fired. As a professional software developer I just have to shake my head.
posted by sbutler at 9:46 AM on April 3, 2015 [4 favorites]


At around noon on a cool July day in San Francisco

Whoever teaches journalists to write irrelevant crap like this, please stop.


Yeah, they're really gunning for the Pulitzer on that one ;)

And, about the "38 pills"... I mean, I'm sorry I actually read the article, but it seems like the pharmacy just cooked up an extra strong pill or two containing 38x the dosage he was supposed to have. He didn't get a cup full of 38 pills and swallow them all, duh.
posted by ReeMonster at 9:48 AM on April 3, 2015 [1 favorite]


It's quite clear in the article that they actually, physically gave him 38 pills.
posted by sbutler at 9:51 AM on April 3, 2015 [17 favorites]


How was he dumb enough to swallow them? I thought it said they cooked up a massive 170mg thing.
posted by ReeMonster at 9:52 AM on April 3, 2015


Not to be snarky, but maybe you should (?re-?)read the article. The kid was 16, chronically ill, on a lot of other daily medications, and doing some additional stuff for a procedure.
posted by sbutler at 9:54 AM on April 3, 2015 [2 favorites]


Ah my bad yeah they mention it in the 2nd part. Idiocy all around! Including me.
posted by ReeMonster at 9:57 AM on April 3, 2015


What shocked me about the article is the overwhelming number of alerts. That's crazy, and whatever committee approved that software design should be fired. As a professional software developer I just have to shake my head.

The design is not necessarily a problem, but the specific thresholds chosen for the alerts are too low (this is why I wish Epic could do some data mining to optimize its alert thresholds). Also, note that when the article talks about alert fatigue, it's not just talking about Epic; it's also talking about hardware devices.
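The core of that threshold-tuning idea is simple enough to sketch. This is a toy illustration only: the data model and the 5% target are invented, and real de-identified override logs would be far messier:

```python
# Toy illustration of raising an alert threshold based on override logs.
# Each alert record notes how many multiples of the usual dose were
# ordered, and whether the clinician overrode the alert.

def override_rate(alerts):
    """Fraction of fired alerts that clinicians overrode."""
    overridden = sum(1 for a in alerts if a["overridden"])
    return overridden / len(alerts)

def raise_threshold(alerts, thresholds, target=0.05):
    """Pick the lowest dose-multiple threshold whose alerts get
    overridden no more than `target` of the time."""
    for t in sorted(thresholds):
        fired = [a for a in alerts if a["dose_multiple"] >= t]
        if fired and override_rate(fired) <= target:
            return t
    return max(thresholds)
```

The point being: if 83% of alerts at a given threshold get overridden, that threshold is mostly generating noise, and the logs already tell you so.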
posted by Jpfed at 10:04 AM on April 3, 2015 [1 favorite]


I've recently come out the other side of implementing an enterprise system at a university where we spent a lot of time grappling with questions about alerts, hard stops, default options, etc. The level of criticality was certainly less than in this case - no one's going to die if we screw up some information about a research grant. But not without importance, either - you could lose out on several million dollars if you screw up that research grant.

Having been through that process I'm a lot more sympathetic to the design choices (or assumptions) that led to some of these mistakes than I might have been two years ago. With hindsight it's easy to say "of course the system should stop 38x doses" (and it should!) but I can easily imagine how that question becomes a series of 10-person hours-long meetings about exactly where you draw the line, and wherever you draw it, someone is going to come down on the other side of it and get screwed. Especially if there's no hard data or best practices telling you where a sensible place to draw that line is depending on the clinical research being done at that hospital that makes high doses routine. What if it's 40x? What if it's 35x? Maybe there's a real answer, but maybe everyone's just flailing and using numbers that feel good in a fuzzy way, until something terrible like this happens and then you got your data the really hard and costly way. And maybe all the alerts individually made sense, until you're live and you actually understand that taken as a whole they're disastrous.

Six months into using our own system it's very clear to me that we made some good choices and some not-so-good ones, and I have a wishlist in to our developers and consultants of things I want us to tweak or overhaul - but they're on to the next phase of the system, and I can get their ear for small fixes but the nontrivial stuff? They'll have some time to work with me on that in the fall after the next phase rolls out but before they really dive into the phase after that. They're great people and really good at what they do but they have only so many hours in a day, and our team made some suboptimal choices that we didn't fully realize the implications of at the time, and so now we're just gonna live with those for a while until the stars align just right to get them fixed. Which is okay because it just means my inbox is overloaded with notifications, not "a child is going to go into seizures", but I can see how this stuff happens even in far more critical systems.

Which is to say this I loved this post, and I think I might have to circulate it to the rest of the team both as a point of interest, and to see if I can scare anyone into taking some time to fix some stuff sooner before any of our little interface issues combine in just the right way to add up to a giant grant-losing fiasco.
posted by Stacey at 10:18 AM on April 3, 2015 [10 favorites]


underflow: I recommend requiring investigation into every medical death. By a central authority with legal power to investigate, rule, and enforce. Patients should explode on death. And medical professionals should be shot when they kill someone.

Happy to endorse this pony. Could we get someone to code it up?
posted by RedOrGreen at 10:22 AM on April 3, 2015


Overall, I think it's incontrovertible that computerized systems have reduced errors and saved lives. (An Epic alert caught a data entry error in my son's vital statistics just this Tuesday; the nurse accidentally entered his height as 15 cm instead of 105 cm.) There are three things that really leap out at me about this story:

1) Epic has only one overdose alert, and particularly on a research floor or when dealing with critically ill patients, physicians are going to be overriding that sucker CONSTANTLY. I know new alerts have a real cost to them that has to be considered, but it sounds like one option would be to have a dual-level overdose alert system, one that says "hey just FYI that dose is larger than is common" and one that says "Uh, for realsies, that dose is WAY FUCKING HIGHER THAN IT MAYBE SHOULD BE." Or at LEAST have a different background color for when the dosing is by weight.

2) As previously mentioned, the nurse administering the medication knew something was wrong, but didn't feel confident that stopping to get confirmation was the right thing to do. The reasons she cites for her feelings are good ones, too; the only people she could have easily tagged for help were people who were themselves performing critical, time-sensitive, error-prone tasks, and there was a real cost to interrupting them.

3) This whole industry is so understaffed that everybody is working right up to their theoretical limits. The nurse couldn't find anyone to ask for confirmation because there is no slack built into the system; the pharmacist was so interrupted writing the order that he didn't have the cognitive capacity to oversee the whole thing from stem to stern and realize that something wasn't right; the resident was trying to keep up with a huge case load and didn't have the time to carefully read the entire screen. I really feel like this is the bottom line behind this error and countless others, honestly. Executive function just gets so compromised when people are overworked and overstressed -- if someone can only get their job done by focusing on taking it one step at a time, looking at what's immediately in front of them, then you lose the opportunity to take a step back and get a big-picture view.
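The two-tier idea in (1) might look something like this. Purely illustrative: the 5x/20x cutoffs are numbers I made up, not clinical guidance:

```python
# Illustrative two-level overdose alert: a soft advisory for unusual
# doses, a hard stop for doses way beyond plausible. Cutoffs invented.

def overdose_alert(ordered_mg, usual_max_mg):
    ratio = ordered_mg / usual_max_mg
    if ratio >= 20:
        return "HARD_STOP"  # require explicit sign-off from a second person
    if ratio >= 5:
        return "WARN"       # unusual but conceivable on a research floor
    return "OK"
```

Under a scheme like that, a 38.5x dose would never come through on the same pop-up that fires for every trivial deviation.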
posted by KathrynT at 10:26 AM on April 3, 2015 [15 favorites]


diane47: The problem in this case is the nurse.

Did you not read the article? There were multiple failure points in this process. It's not helpful to blame the last link in the chain and write off the other failures as NBD.

From the article:

On July 26, 2013, Levitt was assigned a night shift, not in her usual ICU, but on a unit that was short-staffed, the general pediatrics floor. In the parlance of the hospital, she was a “floater,” and it was only the second time she had floated outside the PICU since starting her job.

The system of floating is governed by a kind of lottery — every nurse, except the most senior, is eligible. “I don’t want to float,” Levitt later told me, “because I don’t know the unit; I don’t know the nurses. Most people don’t like it.”


So you have the hospital administration's failure to staff appropriately, and the hospital administration's decision to practice unsafe floating.

To blame the individual nurse in this situation is a political decision that excuses management from accountability while scapegoating workers for management's failures.

One hopes that the nurses in such situations make management aware of their objections, thereby assigning liability to the management in the event of such a fiasco.
posted by univac at 10:33 AM on April 3, 2015 [15 favorites]


@univac -

Yes, I did read the article.

I am afraid we will have to disagree. The point where the nurse separately scanned in and separately opened 38 packets containing one pill and then gave all of them to the patient, despite her misgivings, is where the FINAL mistake was made and I certainly never indicated that the rest of the problems were no big deal. And the fact that she was floating does not change that she ignored a basic principle of medication administration. Nurses have been floating for decades and although nobody likes it, it is not considered unsafe. Certainly less unsafe than having floors understaffed.

Nurses, physicians and pharmacists are relatively well paid - far better than, say, a fast food worker making change. Yet if the way they perform their job, by doing exactly what a computer tells them to do, is the same, why do they deserve the higher wages or require the higher education? We expect medical professionals to think.
posted by diane47 at 11:04 AM on April 3, 2015 [2 favorites]


I didn't get that impression at all. It mentions several times in the article that the computerized system has made dosing and delivery safer because it adds additional checks into the process (which have been useful). How do you conclude safety was sacrifice for efficiency?

Yes, I was thinking here of staffing efficiency. Like the bit with only four people in the pharmacy, getting interrupted by something like seven unrelated inquiries while filling or signing off on a single prescription. Or the floating nurse who's chosen from a different department to work at the pediatric ICU.

The optimization here is that each person work at 100%. No redundancy, no backup, just go go go.
posted by zippy at 11:07 AM on April 3, 2015


If you have a problem like the nurse had, you call in a witness to the order and the dosing. Call in a senior staff nurse or the charge; you always have to remember the mission and values statement of the organization you work for, and first do no harm.

I used to work as a unit secretary and had to transcribe all orders handwritten by docs into the computer ordering system. If that was not a lame way to go about things.
posted by Oyéah at 11:12 AM on April 3, 2015 [1 favorite]


We use up all of our capacity where I work in software, and leave no slack for dealing with unexpected issues. The result is sometimes missed deadlines, and sometimes deliverables that are not top-quality.

The difference is that I work in a non-critical field where mistakes are not going to kill anyone or ruin their health.
posted by mantecol at 11:16 AM on April 3, 2015 [3 favorites]


It's incredibly hard to staff a hospital ward perfectly. Even if you start a shift with the correct patient to nurse ratio, one of the patients could suddenly require a lot more care. And what if you have an admission during your shift? Even if you are staffed to handle a new admission, they are often a lot less stable and require a lot more time and a lot closer attention. Some shifts are super busy and stressful, some are less.

Actually, using a float nurse is a response to the difficulty in staffing correctly. When a nurse calls in sick, he or she needs to be replaced or you compromise patient care. You can't just send the patients home or ask them to wait until the next shift to have a nurse. Do you require someone from the previous shift to work a double to replace the sick nurse? That leads to tired nurses not at their best. Do you call in an agency nurse? That nurse will be even less familiar with the unit and hospital. Floating is incredibly common and nurses in hospitals have to be able to deal with it. If you feel uncertain because you are not familiar with the unit, you should ask MORE questions. A float would be given more slack for that, not less.
posted by diane47 at 11:34 AM on April 3, 2015 [3 favorites]


Our electronic prescription system defaults to "intra-ocular route" for all meds - how we laughed when we noticed that one. So far none of our nurses have actually ploughed ahead and tried to inject ramipril tablets into anyone's eyeball, but presumably one day somebody will have a go...
posted by tinkletown at 12:11 PM on April 3, 2015 [6 favorites]


That was an incredible article, and I liked what others have called "padding." I call it "how the sausage gets made" and I love learning those nitty-gritty details. What a tragedy of errors at every step. Systems, technology, culture, human psychology - all the holes in the swiss cheese lined up.
posted by arcticwoman at 12:13 PM on April 3, 2015 [1 favorite]


The article reminded me a lot of Atul Gawande's book The Checklist Manifesto, it touches on many of the same points with hospital organisational struggles and the comparison to airlines.
posted by ellieBOA at 12:23 PM on April 3, 2015 [6 favorites]


I know golytely is nasty stuff, but the article really over-sells it.
posted by the uncomplicated soups of my childhood at 12:40 PM on April 3, 2015


I write software for an electronic prescription and pharmacy system in the Netherlands, and errors like this, well, preventing them is our number one priority. And our numbers two through 13. After that it's the less serious errors that are still errors: in dosage, timing, interaction, contraindications, etc... And we were damn proud when we got a certification as a medical support system - the first software in the Netherlands to get it.

This stuff is not easy - don't just blame one step in the entire chain of events, and if you manage or buy stuff that only has to do with one link of the chain, do everybody a favor and always look at all the links in the chain anyway; never say "well... that part isn't my responsibility." Medication safety is foremost on my mind for every line of code I write.

That said, it's damn satisfying to hear customers telling you war stories that turned out good because of what you wrote.
posted by DreamerFi at 12:51 PM on April 3, 2015 [1 favorite]


I'm a baby nurse, and I just finished a clinical rotation at a tertiary pediatric hospital where all meds are prescribed in mg/kg, regardless of patient weight. The hospital has a computerized medical records and orders system (probably similar to Epic, but not the same system) and medications are delivered in one-pill blister packs as described in the story.

We were taught to manually calculate safe low and high doses based on our individual patient's weight, and verify that the prescribed dose is safe, before giving any medications. Nurses on the floor we were on didn't do calculations every time they gave meds like we did, but there is a sheet in each patient's paper chart where the first nurse to give a new prescription calculates safe low and high doses per patient weight and verifies that the prescribed dose is ok to give. This system made sense to me to begin with, but it makes EXTRA sense after reading this series.
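The safe-dose check described above amounts to a simple range test against the patient's weight. A minimal sketch (the function name, drug amounts, and mg/kg limits are illustrative, not taken from the article):

```python
def dose_is_safe(ordered_mg, weight_kg, low_mg_per_kg, high_mg_per_kg):
    """Check an ordered dose against weight-based safe limits for one patient."""
    low_mg = low_mg_per_kg * weight_kg    # lowest safe dose at this weight
    high_mg = high_mg_per_kg * weight_kg  # highest safe dose at this weight
    return low_mg <= ordered_mg <= high_mg

# 160 mg ordered for a 38.6 kg patient, against illustrative limits of 2.5-5 mg/kg
print(dose_is_safe(160, 38.6, 2.5, 5.0))   # inside the 96.5-193 mg window
print(dose_is_safe(6160, 38.6, 2.5, 5.0))  # a massive overdose fails the check
```

The point of writing the low and high bounds on a per-patient sheet, as described above, is that the arithmetic is done once, deliberately, instead of being trusted to whatever default the ordering system picked.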
posted by snorkmaiden at 2:01 PM on April 3, 2015 [4 favorites]


I really have a hard time believing that even an untrained nurse could administer 38 pills to somebody without wondering about it.

Some environments punish people so harshly for questioning, or just stonewall them, that they stop. Even environments where they should really know better. And by they I mean everyone involved.

This happens both in office situations where the copy machine burns out, and like... Aircraft maintenance, and this.
posted by emptythought at 2:15 PM on April 3, 2015 [8 favorites]


The major problem with EPIC is that it is provider driven. When a provider makes an error, it is a pain to fix without just starting over. Providers put orders in and do not actually read them; then a nurse or tech has to track them down to get it corrected, unless they want to take responsibility for changing an order and not having it cosigned within the time limit. Transcription of written orders is even more of a problem.
posted by bjgeiger at 2:16 PM on April 3, 2015


Oh, and expanding on the above - if you're pharmacist, or a nurse, or a doctor, and you come to me saying "your system could be better if..." you FIRST get a hug. Next, I carefully listen to you. Your suggestion may not work, but anything that might improve the medication safety gets a "thank you so much for thinking about this!" from me...
posted by DreamerFi at 2:27 PM on April 3, 2015 [4 favorites]


It's incredibly hard to staff a hospital ward perfectly.

It seems like it should be possible to staff it at a level that doesn't require everyone working there to be going absolutely 100% flat out for the entirety of their shift, though.
posted by KathrynT at 2:59 PM on April 3, 2015 [3 favorites]


That would cost money, KathrynT.
posted by tinkletown at 3:10 PM on April 3, 2015 [2 favorites]


This was super interesting to me as last semester every student in health sciences at my school had to take a class that can be summed up as "working with other health care professionals" and "patient safety". Many parts of this class were dumb but one thing we really talked about was how to advocate for your patient and get your concerns heard.
If your concerns are not addressed the first time, that doesn't mean you should just drop it. This isn't one person's fault, but one person can be the difference. Every piece (including the software) plays a part, and you can't get too comfortable with your task or the software or whatever. I think every piece of the puzzle needs some reform, as it's a complicated, massive problem that's not 100% a technical issue or a culture issue but the confluence of both.
posted by Aranquis at 3:12 PM on April 3, 2015 [1 favorite]


A 5-part story and it still only scratches the surface w/r/t the problems facing healthcare today. After reading the comments here, I feel I should make one thing very clear: maybe the most significant problem the industry is facing is the severe shortage of workers. I'll do my best to boil it down to bullets:
- Physician shortages: It's getting harder to earn a buck. Reimbursement models (i.e. pay) for healthcare providers are changing from fee-for-service to performance- or outcome-based. Which means it's not about how many patients you treat; pay will be tied to whether a provider can demonstrate improvement (or at least management) in health outcomes. The fact that I forget to eat well and exercise is gradually becoming a financial concern for my primary care physician. This is having an effect on the number of medical students who choose primary care or family medicine at all. In my state, 1/3 of PCPs are expected to retire in the next decade.
- Increased demand: The Boomers are getting older but living longer. The shortage of physicians is only going to get worse.

The reaction is to create more lower-level healthcare workers, which is why there is a boom in for-profit schools, but this has its own problems - one of which is the additional gap in trust within the workforce. As roles and responsibilities continue to get more distinct, it just creates more holes and overlaps, which means it is that much more difficult to design a technological system to suit a workflow.
posted by krippledkonscious at 4:52 PM on April 3, 2015


As I read it:

1. The system rounds to the nearest pill
2. Policy is you can't round more than 5%
3. Weight-based dosing means that you're going to round a lot more than 5%
4. If you want to put in a dose that rounds by more than 5%, the only way to release it is to void the dose and manually round, tricking the system into thinking that you're not rounding at all.

That's just begging for someone to screw up the dosing, since you're requiring people to go behind the system's back (the system whose job is to round correctly!).
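The four steps above can be sketched as a small rounding check. A rough illustration (the function name and the 160 mg pill size are assumptions for the example, not details from the article):

```python
def round_to_pills(dose_mg, pill_mg, max_round_frac=0.05):
    """Round a weight-based dose to whole pills and check the rounding policy.

    Returns the rounded dose in mg and whether the rounding stayed within
    policy (here, the 5% limit described in the thread).
    """
    pills = max(1, round(dose_mg / pill_mg))      # nearest whole pill, at least one
    rounded_mg = pills * pill_mg
    off_by = abs(rounded_mg - dose_mg) / dose_mg  # fractional rounding error
    return rounded_mg, off_by <= max_round_frac

# A weight-based dose of 193 mg rounds to one 160 mg pill: a 17% change,
# far over the 5% policy, so the order is blocked and invites a workaround.
print(round_to_pills(193, 160))
```

The conflict is visible right in the arithmetic: with whole pills as the only unit, weight-based doses will routinely round by more than 5%, so the policy check fails constantly and pushes users toward the void-and-reenter trick.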
posted by BungaDunga at 6:23 PM on April 3, 2015 [5 favorites]


Yea, the #1 thing i've learned in years of doing IT, is that any user workflow that involves canceling or bypassing something to "go around" the normal workflow will inevitably result in people doing stupid shit.

If people need to bypass the system to get work done, the system is broken. You need to set it up in such a way that everyone can do what they have to do without any of that, and then say "you aren't allowed to bypass this without approval from a second person/manager/me"

It seems like it should be possible to staff it at a level that doesn't require everyone working there to be going absolutely 100% flat out for the entirety of their shift, though.

A friend of mine for a while was an RN, and eventually a charge nurse. It really seems to be one of those bullshit macho "if you're leaning, you could be cleaning" professions, like a lot of foodservice and bartending. It's not just about money or labor hours; it's that everyone who's already in the system has been in it for a while - years, maybe even decades - with it being that way... and anyone who wants it to change is obviously just a lazy ass who doesn't want to work as hard as them, and you gotta earn your stripes dammit!

I'm so fucking happy i don't work one of those jobs where "i've worked 14 days straight without a day off!" or "i've worked 6 10s in a row!" is something people think is some macho cool thing to say. It DID seem to be one of those jobs though. My friend worked ridiculous hours, back to back. And she worked her way up by seemingly just working more ridiculous hours than everyone else.

Similar to medical residency, it seems like a huge pile of macho bullshit that's basically continuous hazing to see who has the biggest "balls".
posted by emptythought at 9:22 PM on April 3, 2015 [8 favorites]


Thank god, I don't work in a field that affects people's physical health or has lives at stake. But I have dealt with the problem of asking for slack in things... asking for things not to be run at 100% for 100% or 125% of the damn time. It's incredibly stressful. I don't ask for this so we can spend time reading the paper or wandering around the office, but so that when things go wrong we can have time to fix them without throwing all our systems and schedules out of whack, and when things go right we'll move to lower-priority tasks or work on getting ahead. And there's constant dismissal of this from higher-ups, and the nagging feeling or corporate aura that only lazy and incompetent people ask to put more time in the schedule.

The worst thing is, I haven't even worked in an (ostensibly) profit-driven system; for people exposed to management consultants who are always looking for "inefficiencies", or management theories that are all about "more with less" or extracting more value or labor from the same amount of employees, this must be even worse.
posted by Hypatia at 9:29 PM on April 3, 2015


Hypatia: "The worst thing is, I haven't even worked in an (ostensibly) profit-driven system; for people exposed to management consultants who are always looking for "inefficiencies", or management theories that are all about "more with less" or extracting more value or labor from the same amount of employees, this must be even worse."

Most hospitals are also (ostensibly) non-profits. They have plenty of management consultants and such.

Judging by the amount of text on Wikipedia dedicated to fraud by the big three for-profit hospital firms, I suspect for-profit executives might be too busy defrauding customers and shareholders to bother with low-return strategies like efficient service.

Which is an important point to consider; when the median hospital operates at a margin of -0.7 percent, greater efficiency is the only way non-profit hospitals can hope to reverse that and still have a non-profit hospital ten years from now.
posted by pwnguin at 1:59 AM on April 4, 2015 [1 favorite]


The level of alert fatigue some EMRs induce is really hard to overstate in the context of a high workload. I click through dozens to hundreds of pop-up warnings in a day, on the high side when in the ICU. I would guesstimate the yield of those warnings at less than 1%.
  • "Warning! You have already prescribed a narcotic; this second narcotic may be a duplicate!"
  • "Warning! This patient had itching to morphine; fentanyl may cross-react!"
  • "Warning! This patient had nausea to bactrim; lasix is also a sulfonamide!"
  • "Warning! This patient is taking an antihypertensive; multiple antihypertensives may cause hypotension!"
  • "Warning! The maximum daily dose of PRN potassium is higher than the daily recommended maximum!"
It's no surprise that when clicking through this constant noise people sometimes miss the rare useful medication alert. I wish that the system had a checkbox for "this alert was useful." Most of my colleagues think that the alerts are bureaucratic ass covering by higher-ups who do not have to deal with them and can pass blame for errors from the expensive EMR they bought to the reckless physicians and nurses.

The culture of safety and stop-the-line is also hard to maintain at high workloads, and relies on shared understanding of what's ok. When you are cross covering (the primary physician has gone home, you answer for her until the morning) a hundred patients (some of whom are making life interesting), having your pager constantly go off for routine med questions makes you crazy. I'm pretty even keeled, but I've snapped at nurses at my limits, and I've seen much worse. Of course they're in a tough spot too, since MDs will get mad at them for bugging them over nothing (yes, the patient may have Boost instead of Ensure; please never call me with this question again), for taking too much initiative (no, I will not cosign that benadryl order), or for not alerting them to clinical changes of unclear significance.
posted by a robot made out of meat at 8:19 AM on April 5, 2015 [6 favorites]


« Older spoilers!   |   They Were Our Sisters Newer »


This thread has been archived and is closed to new comments