Wednesday, October 31, 2012

The Worst Series of Radiation Accidents in the History of Medical Accelerators

Article here:
http://courses.cs.vt.edu/cs3604/lib/Therac_25/Therac_5.html

This article investigates the six known radiation overexposure accidents involving the Therac-25, a medical linear accelerator that used an electron beam or x-ray photons to treat patients with tumors. The Therac-25 was created by Atomic Energy of Canada Limited (AECL), and there were 11 of them in the US and Canada. Compared to its predecessor, the Therac-20, the Therac-25 was more compact and versatile.

Between 1985 and 1987, there were six known incidents of radiation overexposure involving the Therac-25. The overdoses were caused by programming bugs. The Therac-20, which had no cases of accidental overexposure, contained many of the same bugs responsible for the Therac-25 accidents. One of the design changes in developing the Therac-25 was to remove the hardware safety interlocks and place safety entirely under software control to save money, instead of securing it with both hardware and software as on the Therac-20. It was assumed that the Therac-20's safety software worked fine, but it was later realized that its hardware interlocks had been silently preventing an unknown number of injuries. Extensive investigation of the Therac-25 didn't occur until after the fifth incident; because AECL couldn't reproduce the first four accidents at the time, it didn't inform the doctors operating the machines that incidents of radiation overexposure had been reported.
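One of the bugs later documented in the Therac-25 software was a one-byte shared flag (called Class3) that was incremented on each pass through a setup loop instead of being set to a fixed nonzero value; every 256th pass it wrapped around to zero, and the collimator-position safety check guarded by that flag was silently skipped. The sketch below is a simplified Python illustration of that overflow mechanism, not the actual PDP-11 code:

```python
# Simplified illustration (NOT the actual Therac-25 code) of how an
# 8-bit "nonzero means run the check" flag can wrap to zero and
# silently bypass a software-only safety interlock.

def increment_flag(flag: int) -> int:
    """Increment an unsigned 8-bit value, wrapping at 256."""
    return (flag + 1) & 0xFF

def safety_check_runs(flag: int) -> bool:
    """The position check is only performed when the flag is nonzero."""
    return flag != 0

flag = 0
skipped = 0
for _ in range(512):            # the setup loop runs continuously
    flag = increment_flag(flag)
    if not safety_check_runs(flag):
        skipped += 1            # every 256th pass, the check is skipped

print(skipped)  # 2 skipped checks in 512 passes
```

Setting the flag to a constant nonzero value instead of incrementing it would have avoided the overflow, which is how AECL eventually patched it; a hardware interlock, as on the Therac-20, would have caught the unsafe state regardless.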

When I originally read about the Therac-25 accidents (a few years ago, in a different article), I had to Google the incident halfway through because I couldn't tell whether the story was fictional. As bioengineers, we know that people's lives can depend on the products we make, and that bugs and flaws in those products can compromise the health of the people who use them. This story doesn't have explicit 'good guys' and 'bad guys', but it does offer insight into how to (or how not to) handle situations where you might be the only person with a reasonable doubt about whether a device has a critical flaw. I think this story is powerful because its lessons apply to patients, doctors, lawyers, programmers, engineers, and so on. As the devices we design become more complex, it becomes more likely that things will go wrong for seemingly impossible reasons.
