CT Scan "Reset Error" Gives 206 Patients Radiation Overdose 383

jeffb (2.718) writes "As the LA Times reports, 206 patients receiving CT scans at Cedars-Sinai hospital received up to eight times the X-ray exposure doctors intended. (The FDA alert gives details about the doses involved.) A misunderstanding over an 'embedded default setting' appears to have led to the error, which occurred when the hospital 'began using a new protocol for a specialized type of scan used to diagnose strokes. Doctors believed it would provide them more useful data to analyze disruptions in the flow of blood to brain tissue.' Human-computer interaction classes from the late 1980s onward have pounded home the lesson of the Therac-25, the usability issues of which led to multiple deaths. Will we ever learn enough to make these errors truly uncommittable?"
  • by HNS-I ( 1119771 ) on Wednesday October 14, 2009 @12:50PM (#29746531)
    While the hospital shouldn't have gone and reprogrammed the instructions, this should have been prevented at the hardware level. The machine should register a patient checking in and the amount of radiation emitted.
  • by e2d2 ( 115622 ) on Wednesday October 14, 2009 @12:54PM (#29746577)
    When I witness this constant chase to remove risk from the world, it makes me wonder whether it's delusion or just plain stupidity. No matter how hard you try, there will always be risk involved in almost every action. Accept it and treat it rationally. I'm not saying to ignore it, just to accept it as part of life. Life is brutal.
  • by CheddarHead ( 811916 ) on Wednesday October 14, 2009 @12:58PM (#29746617)

    Along with the usability issues in the design of the Therac-25, it's obvious that the attitude of the medical staff contributed greatly to the problem. Patients complained of being burned, but their complaints were essentially ignored. Meanwhile, they were sent back for multiple treatments. Overwhelming evidence of radiation burns was ignored or given only cursory investigation because medical personnel or manufacturer reps claimed that it was impossible for the Therac-25 to be responsible for the burns.

  • Film badges? (Score:3, Interesting)

    by johnny cashed ( 590023 ) on Wednesday October 14, 2009 @01:02PM (#29746673) Homepage
    Would a film badge provide a "check" to determine if the dosage is correct? One x-ray overdose is bad enough, over 200 is really uncool.
  • Feedback? (Score:5, Interesting)

    by TopSpin ( 753 ) * on Wednesday October 14, 2009 @01:10PM (#29746789) Journal

    "Will we ever learn enough to make these errors truly uncommittable?"

    No. As long as correctness can't be proven and operators are permitted to create unanalyzed conditions by altering protocols there will always be risk. There are probably other mis-configured CT scanners out there in use right now that have been overdosing patients for years.

    CT scans use X-rays, an easily detected band of electromagnetic radiation. Why not require that scanners incorporate an independent detector that measures the amount of X-ray energy actually delivered? If that is possible, then create an interlock that can shut down the emitter when the net energy goes out of bounds, and require that any such incident be NRC-reportable. If the detector is excluded from alteration by the operators, then software bugs, misunderstandings, etc. can be detected even years after the last engineer had contact with the system, either before harm is done or at least before hundreds of patients are literally burned.
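
    A rough sketch of what such an independent monitor might look like in software terms; the detector read-out, dose limit, and shutdown hook are all invented for illustration and don't reflect any real scanner's API:

        # Hypothetical sketch only: an independent dose-monitoring interlock.
        # The detector read-out, the dose limit, and the shutdown hook are
        # placeholders; nothing here reflects a real scanner's API.
        import time

        PROTOCOL_LIMIT_MGY = 60.0   # assumed per-scan dose ceiling for this protocol
        SAMPLE_PERIOD_S = 0.05      # how often the independent detector is polled

        def read_dose_rate_mgy_per_s() -> float:
            """Placeholder: read the independent (operator-inaccessible) detector."""
            raise NotImplementedError

        def trip_emitter_shutdown() -> None:
            """Placeholder: drive the hard-wired line that disables the emitter."""
            raise NotImplementedError

        def scan_in_progress() -> bool:
            """Placeholder: true while the emitter is energized."""
            raise NotImplementedError

        def monitor_scan(incident_log: list) -> float:
            """Integrate measured dose; trip the interlock and log if it overshoots."""
            cumulative_mgy = 0.0
            while scan_in_progress():
                cumulative_mgy += read_dose_rate_mgy_per_s() * SAMPLE_PERIOD_S
                if cumulative_mgy > PROTOCOL_LIMIT_MGY:
                    trip_emitter_shutdown()
                    incident_log.append(("reportable_overdose", cumulative_mgy))
                    break
                time.sleep(SAMPLE_PERIOD_S)
            return cumulative_mgy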

  • by NonSequor ( 230139 ) on Wednesday October 14, 2009 @01:11PM (#29746807) Journal

    Couldn't disagree more. Unfortunately, enforcing training and requiring people to read manuals would probably have little effect. In my 10+ years doing usability for missile systems, I've learned that you have to build in the mechanisms that keep users from doing bad things. Even if you force the user to read the *entire manual* before each use, people still have bad days, hangovers, and fights with significant others. Safety has to be designed in.

    The story behind Murphy's Law [wikipedia.org] is pretty interesting and it ties in with this design philosophy.

    Basically the story is that a technician incorrectly installed force sensors and in response, Murphy got pissed off and said "If that guy has any way of making a mistake, he will."

    However, other people adapted that statement into "If anything can go wrong, it will," expressing the idea that if a system does not mechanically exclude the possibility of human error, human error can be expected to occur. This makes accounting for human error a design constraint.
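
    Put in code, treating human error as a design constraint means the system rejects the bad input outright instead of assuming training will prevent it. A minimal sketch, with the protocol name and bounds made up for illustration:

        # Illustrative only: make the dangerous setting impossible to commit,
        # rather than relying on the operator to know better. Bounds are made up.
        class DoseSettingError(ValueError):
            """Raised when a requested dose falls outside the protocol's bounds."""

        PROTOCOL_BOUNDS_MGY = {
            "brain_perfusion": (30.0, 80.0),   # assumed acceptable range, mGy
        }

        def set_scan_dose(protocol: str, requested_mgy: float) -> float:
            low, high = PROTOCOL_BOUNDS_MGY[protocol]
            if not (low <= requested_mgy <= high):
                # The mistake is caught at entry time, not discovered in a patient.
                raise DoseSettingError(
                    f"{requested_mgy} mGy is outside {low}-{high} mGy for {protocol}"
                )
            return requested_mgy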

  • Programmer Oopsie! (Score:1, Interesting)

    by Anonymous Coward on Wednesday October 14, 2009 @01:45PM (#29747249)

    It's just a programming bug, it will be fixed in the next release.

    Software is licensed and may include 'defects', and customers have no choice but to accept them.

    Maybe if these "software engineers" could be held liable for any defects, things would change.

    But hey, they are just programmers, put the bug into twiki/bugtracker, and try and fix it in the next release.

    Make sure all your easter eggs are working though.

  • by digitig ( 1056110 ) on Wednesday October 14, 2009 @02:02PM (#29747467)

    This particular error is the kind that occurs when you simplify complex procedures in the interest of widespread use. It is the fault of specialization, which we typically embrace because it allows us to leverage human labor into increasingly complex areas of inquiry. It's more than just "human oversight" or "machine failure," it's the kind of problem that typically arises when people are trained to use machines without being trained to fully understand those machines.

    A certain segment of society--that's mostly us geeks--strives against this tendency; we become technicians in various fields. But most people, including medical people, get trained by vendors to use a particular piece of software or hardware without reference to its underlying principles or inner workings. This is normal and usually beneficial for various reasons an economist could doubtless relate.

    But one of the things that we geeks should be doing is looking at equipment like this in its overall system context, which includes the operator and the training the operator has received. That's mandatory in the aviation industry pretty much worldwide (my field); I don't know what the situation is for medical equipment in the USA. No, we will never make such mistakes "uncommittable" -- perfect safety is a myth. But we should be considering possible failure modes, and the likelihood and consequences of those failure modes, to ensure that the risk is tolerable.

  • by Anonymous Coward on Wednesday October 14, 2009 @02:02PM (#29747477)

    I work on a medical device that works similarly. There is a robot, and the robot motors are connected to a high-power voltage source separate from the electronics. The high-voltage transformer is physically hard-wired to the door latch and to a big red button. So in order for the software to move the robot:

    - door latch must be physically closed
    - red button must physically not be pressed
    - software logic must turn on the high-voltage transformer

    It isn't software that decides if the door is closed. There's no logic that says:
          if door is closed then MoveRobot
    It has to be physically and electrically wired into the door latch.
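
    Roughly, the software side ends up looking like this: it can request motion and read back the interlock state for logging, but nothing it does can energize the motors, because the power path sits in the wired chain. (The function names below are made up for illustration.)

        # Hypothetical sketch: the door latch and red button gate the high-voltage
        # transformer in hardware. Software only requests motion and observes the
        # interlock chain; no branch here can supply motor power on its own.
        def interlock_chain_closed() -> bool:
            """Placeholder: read back the state of the hard-wired interlock chain."""
            raise NotImplementedError

        def send_move_command(target) -> None:
            """Placeholder: hand a motion request to the motion controller."""
            raise NotImplementedError

        def request_robot_move(target) -> bool:
            if not interlock_chain_closed():
                # Informational only: even without this check the motors stay dead,
                # because the transformer is physically gated by the latch and button.
                print("Interlock open: move request ignored; power is already cut.")
                return False
            send_move_command(target)
            return True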

  • by jimicus ( 737525 ) on Wednesday October 14, 2009 @05:28PM (#29750265)

    I don't know about the US, but in the UK the qualification you take to give CT scans these days is usually a degree - you'd be a diagnostic radiographer. How much more training do you want?

    The problem isn't the qualification, it's the change in protocol. Someone thought it would be a good idea to override the machine's inbuilt safety cutout by resetting it part-way through the scan, proving that being highly qualified is no barrier to making dangerous decisions.

  • by digitig ( 1056110 ) on Wednesday October 14, 2009 @06:44PM (#29751013)
    Only if you accept that it will never be reached, and that there is a tradeoff in aiming for it. I'm all too used to the media and government calling for punishment for those who failed to do what could not be done, or processes getting bogged down with "protections" that will probably never protect in the lifetime of the systems they "protect". Safety is best served by realism and honesty, not by a "something must be done" attitude.
  • by GrpA ( 691294 ) on Wednesday October 14, 2009 @09:06PM (#29752161)

    Or maybe just a simple display that tells you the amount of radiation exposure that the machine is currently set for?

    Then the radiologist could take responsibility for noting it.

    This is simple, and features like this often exist in development versions but are taken out later by marketing. Why?

    I once worked for an international company that had a billing system. It wasn't very user friendly and was often wrong.

    On the other hand, we had a local billing system that was accurate and helpful. At some point, bills were issued centrally and, needless to say, were all wrong (usually overbilling the customer; this was a now-disgraced US company...). When we started to complain internally that the bills were wrong, they investigated and found we had a duplicate system that worked correctly. We were instructed to decommission it.

    The reason? Because the company didn't want the legal hassle if someone sued them for grossly inaccurate invoices and used our records against them.

    To his credit, my manager stood by us and insisted they fix the billing, saying we weren't going to take down our system even when they threatened to fire him over the issue. It was a standoff for months, and in the end we agreed we wouldn't monitor any other company clients that didn't know about our local billing system, and we would bill legacy clients locally. Not really a satisfactory solution. The corporation won, the consumer lost, and they never even knew we had a battle.

    But if you had a little radiation readout that might highlight bugs or errors in a multimillion-dollar piece of medical equipment, wouldn't you ask the developers to remove it? After all, it's just going to be used against you if someone is killed or injured while using the equipment.

    GrpA
