It's always a race condition.
May 20, 2008 7:55 AM

When programmers kill. [pdf] In 1982, Atomic Energy of Canada Limited (AECL) introduced the now-infamous Therac-25, a solely software-controlled successor to its earlier medical linear accelerators. Six patients received massive radiation overdoses, and three died, before AECL was compelled to supplement the (faulty) software-only error checking with hardware interlocks to prevent overexposure.

"Since the time of the incidents related in this paper, AECL Medical, a division of AECL, was privatized and is now called Theratronics International, Ltd. Currently, the primary business of AECL is the design and installation of nuclear reactors."

More from the sci.engr.* FAQ on Failures, and from Wikipedia.
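
One concrete failure mode from the linked paper: a one-byte flag variable (Class3) was incremented on each pass through the setup code instead of being set to a fixed nonzero value, so every 256th pass it rolled over to zero, and zero told the software that the safety check could be skipped. A minimal sketch of that overflow (in Go, not the original PDP-11 assembly):

    package main

    import "fmt"

    func main() {
        var class3 uint8 // one byte, as in the original
        for pass := 1; pass <= 600; pass++ {
            class3++ // bug: should have been class3 = 1
            if class3 == 0 {
                fmt.Printf("pass %d: flag wrapped to zero; safety check skipped\n", pass)
            }
        }
    }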
posted by enn (16 comments total) 15 users marked this as a favorite
 
This is why we always mount a scratch monkey.
posted by ikkyu2 at 8:11 AM on May 20, 2008 [4 favorites]


This incident was the basis for the title of this book, and is recounted there. I work with a lot of complex medical technology, and this sort of failure always scares me.
posted by TedW at 8:18 AM on May 20, 2008


ikkyu2, your scratch monkey story reminded me of the potential for electrocuting patients by plugging their ECG electrodes into mains voltage.
posted by TedW at 8:33 AM on May 20, 2008


Therac-25 was among the examples in our college Ethics class in computer science. Other favorites were Mariner 1, the rocket NASA sent off course due to a transcription error, and Project Mercury, whose orbit computation code contained the famous "DO 10 I=1.10" Fortran error -- a period typed in place of a comma, which (since Fortran ignores blanks) turns the loop header into an assignment to a variable named DO10I. Fortunately, the latter was caught before it did any serious harm. More recently, NASA lost the Mars Global Surveyor to a computer error as well. It's scary how much can be wrecked by a couple of incorrect memory addresses!

There's a good book on these topics: Safeware: System Safety and Computers. The author is supposedly working on a new book on the subject, too.
posted by vorfeed at 9:11 AM on May 20, 2008 [1 favorite]


One of the big lessons of the Therac-25 incident, beyond the simple "software-controlled systems can hurt people," is the way in which concerns such as "Hey, this thing is burning me!" were dismissed because the computer said everything was fine. Maybe the spread of Microsoft products has helped ease this problem, but people still have this gut reaction that data coming from a computer must somehow be blessed and cannot possibly be incorrect.

Lesson for software engineers: People will trust your stuff far more than you would.
posted by LastOfHisKind at 9:35 AM on May 20, 2008 [2 favorites]


This series of incidents contains practically every element needed to scare me. If I had known about this at the peak of my radiation obsession (6th-7th grade) I probably would have cried about it at night.

Thanks for the great post, it really takes me back ;]
posted by fiercecupcake at 9:38 AM on May 20, 2008


We don't know how to program properly. Hell, we don't even know what software really is, much less how to build and test it and have confidence in it, at least not without going to extreme lengths.

Hell, we're still finding Big, Important bugs 25 years later.
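
A famous specimen of the species, whether or not it's the one linked above: the midpoint computation in the textbook binary search silently overflows on very large arrays, and that bug sat in reviewed, widely copied code for roughly two decades before being reported in 2006. A sketch in Go, using 32-bit ints so the overflow is easy to see:

    package main

    import "fmt"

    func main() {
        // Indices near the top of the int32 range, as with a huge array.
        var low, high int32 = 1_500_000_000, 2_000_000_000

        buggy := (low + high) / 2  // low+high wraps around: a negative "midpoint"
        safe := low + (high-low)/2 // algebraically the same, but cannot overflow

        fmt.Println(buggy, safe) // prints -397483648 and 1750000000
    }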
posted by Skorgu at 10:20 AM on May 20, 2008 [2 favorites]


My mother was treated in the same room as one of the victims. I'm glad she didn't draw the black Queen.
posted by Megafly at 10:48 AM on May 20, 2008


This is one of the reasons I'm deeply afraid of nanotechnology. The idea is that the firmware will keep the nanobots under control. Lovely idea, but I've met too many programmers to believe that trick will work.

I simply cannot trust software alone in safety systems, and yet, we keep heading in that direction. There are guys who try to get it right -- but good developers cost money, testing costs money, auditing costs money, and the best way for a product to succeed is to cost less.
posted by eriko at 12:00 PM on May 20, 2008


One of the dirty little secrets of software is that programmers have been getting a free ride on performance thanks to Moore's law. Now that we're reaching the point where clock rates are so high that a signal can't cross the chip in a single cycle, we have to embrace parallel algorithms on a mass scale in every piece of software that requires high performance. The side effects of race conditions, deadlocks, scheduling inefficiencies, etc. should make for an interesting next decade or two in coding.
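
For anyone who has never been bitten by one, here's a minimal sketch of a data race (in Go; the shared counter is purely illustrative). Two goroutines do unsynchronized read-modify-write on the same variable, so increments are silently lost:

    package main

    import (
        "fmt"
        "sync"
    )

    func main() {
        counter := 0
        var wg sync.WaitGroup
        for i := 0; i < 2; i++ {
            wg.Add(1)
            go func() {
                defer wg.Done()
                for j := 0; j < 100000; j++ {
                    counter++ // unsynchronized read-modify-write: a data race
                }
            }()
        }
        wg.Wait()
        fmt.Println(counter) // usually well under 200000; `go run -race` flags it
    }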

I predict the rise of functional languages, formal math techniques for logic validation, and function libraries with tightly locked functions where the source is hashed, assuring the code has not changed since the last formal review. Not that I'm up on coding anymore.
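
A toy sketch of that last idea (the file name and recorded digest are hypothetical): store a digest of the source at review time, then refuse to trust the code when the current digest differs:

    package main

    import (
        "crypto/sha256"
        "encoding/hex"
        "fmt"
        "os"
    )

    func main() {
        // Hypothetical: the digest recorded at the last formal review.
        const reviewedDigest = "0000000000000000000000000000000000000000000000000000000000000000"

        src, err := os.ReadFile("dose_table.go") // hypothetical source file
        if err != nil {
            fmt.Fprintln(os.Stderr, err)
            os.Exit(1)
        }
        sum := sha256.Sum256(src)
        if hex.EncodeToString(sum[:]) != reviewedDigest {
            fmt.Println("WARNING: source has changed since its last formal review")
        }
    }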
posted by BrotherCaine at 4:58 PM on May 20, 2008


I've always been a little shocked, given how many life-threatening (even incidentally life-threatening) things rely on software, that there is no professional guild of programmers, no coder's code of ethics. Isn't it time the government set up a legally binding, self-regulating software profession modeled on doctors, lawyers, and accountants? Why can I sue a professional engineer if he designs a building that collapses, but can't sue a software company in comparable circumstances, because their contract terms are able to magically absolve them of any liability or moral concern?
posted by Popular Ethics at 8:18 PM on May 20, 2008 [1 favorite]


Hey, are you in CS 295 at Stanford? I just read a paper on this last month for class.

My study group and I all came to group meeting pretty depressed that week. And then in lecture someone dropped the fact that these fools were still making nuclear reactors and everyone's eyes went like O_O.

The thing that upset me most about it was that it seemed clear that the programmer had no training in writing software and was never publicly held accountable. Talk about making coders feel like they can get away with anything. I want to believe software engineering has changed since then...
posted by crinklebat at 9:15 PM on May 20, 2008


Well, that or spend a bit more time with the data structures and algorithms books.

True. The key point I wanted to make is that a lazy programmer's code will no longer get a two-fold speed increase every one or two years just from faster processors.
posted by BrotherCaine at 1:32 AM on May 21, 2008


As I remember it, the doctors running these machines were supposed to do the dosage calculations themselves, then check that they agreed with the numbers from the machine. I think several lost their medical licenses or were otherwise reprimanded for failure to do so.
posted by sindark at 11:45 AM on May 26, 2008


Actually, the incident I was thinking of is described in this document (PDF). Scan down to "November 2000 -- National Cancer Institute, Panama City."

"At least eight patients die, while another 20 receive overdoses likely to cause significant health problems. The physicians, who were legally required to double-check the computer's calculations by hand, are indicted for murder."
posted by sindark at 11:50 AM on May 26, 2008


This thread has been archived and is closed to new comments