
Algorithm Brings Speedier, Safer CT Scans

kenekaplan writes "Standard CT scanners can generate images of a patient's body in less than five minutes today, but the radiation dose can be equal to about 70 chest X-rays. Lower-powered CT scans can be used in non-emergency situations, but producing those images can take more than four days. Intel and GE created an algorithm that speeds up a computer's ability to process low-radiation-dose scans by 100x, from 100 hours per image to one hour."

  • by ColdWetDog ( 752185 ) on Monday March 12, 2012 @04:20PM (#39330943) Homepage

    1. I hate 'news articles' that are chock full of hyperbole and misinformation. TFA implies that most CT scanning is done in the ER for life-or-death reasons, which is hardly true. It overstates the current radiation dose of modern 16+ slice scanners and pitches lowering the dose for all CTs.
    2. Current gen CT scanners cut the dose of most tests by at least half from the second and third gen scanners. Of course, TFA doesn't mention how good the new dose regimens are in terms of decreasing dose.
    3. It appears that this new tech has a significant price tag. TFA quoted $1.5 million for a 128-slice scanner with the "new algorithm". More slices = faster and more resolution, but mostly faster. The current 'top of the line' is 64 slice. "Standard" CTs are 16 slice and cost anywhere from $150K to $250K.
    4. At least the GE scanners run Linux!

    • And if this linked article [royalgazette.com] doesn't remind you of a certain Monty Python sketch, then you've ceased to be, kicked the bucket and shuffled off this mortal coil.

      I don't see how they did that blurb with a straight face.

    • by Anonymous Coward

      It was so cool back in the day to see DEC/OSF booting on an MRI tech console. Diversity of both software and hardware. SGI workstations (Indigo, perhaps, I don't remember) in another room, with funky 4Dwm windows & icons. Now almost every piece of medical equipment I run is either Windows or Linux (like that new GE CareStation we got last week). Linux, Linux, Linux. Or Windows. Or VxWorks. It's starting to get boring.

    • by meza ( 414214 )

      Did you read a different article than the one I read? This is an honest question.

      1. The article in fact mentions that the new technique is not meant for life-or-death situations, where a high radiation dose is acceptable, but rather for routine tests where it is important to limit the dose.
      2. The article doesn't mention different generations of CT scanners.
      3. No price is mentioned that I can see. I've searched for "1.5" and "million"; neither word is used anywhere.
      4. No mention of Linux.

      Maybe the link has changed from an earlier

      • by ColdWetDog ( 752185 ) on Monday March 12, 2012 @05:03PM (#39331505) Homepage

        No, I added a few things -

        1. The article makes breathless claims about "emergency" CT scans and gives a decidedly FUD picture of the issue of radiation exposure via medical devices. It's there, just not as dramatically as mentioned.
        2. I added the different generations of CT scanners to point out that manufacturers have been cutting down dosage systematically and significantly over the past couple of decades. Again, it's really just progress.....
        3. The cost of the 128-slice "new algorithm" scanner is almost an order of magnitude more than a base gen 3 CT scanner. It does things that the cheaper scanner doesn't, but that's a pretty high price to pay. The info comes from a linked article in TFA (see my post below the first one).
        4. This is Slashdot. I thought somebody would appreciate this bit of technical trivia. Of course, if it ran OS X or if Google developed it, the thread would get 10 times the comments this one will get.

        Mostly I'm just grumping about stupid press releases. If they toned down the rhetoric and added some technical detail, it might be an interesting Slashdot post. As it is, it's just fluffy techno pony drivel.

        Now, if you don't mind, it's time for my nap....

      • 4. No mention of Linux.

        My guess is (s)he works in the medical imaging field where it's pretty common knowledge. It doesn't need to be mentioned in the article. GE scanners run Linux and ran Solaris before that.

        Maybe the link has changed from an earlier version. The article is on intel.com, so I assume its main purpose was PR, but I still thought it was pretty OK, and it was clear about the improvement that was made (computation time was reduced by a factor of 100).

      I have a friend who works for Sapheneia [alpha-imaging.com]. They have been doing this for several years now, and they work with almost any vendor's scanner.

    • "Top of the line" is now 320 rows for Toshiba, or dual-source 128 rows for Siemens. I think there are 256 detector row scanners as well.
  • Also, "faster" has a material bearing on dosage... For example, imagine doing a vascular run-off with a four-slice scanner. The narrow detector array means the patient will be bombarded with ionizing radiation for far longer than with a wider detector array (higher Z-axis efficiency with more slices); rough numbers are sketched below. And if you're doing dynamic (4D) scans (e.g., coronary functional CT scans), then getting a whole volume in one rotation is also huge and saves on radiation.
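    To put rough numbers on that (purely illustrative figures, not from TFA or any vendor spec sheet): the tube-on time for a helical scan scales inversely with the Z-axis coverage per rotation.

    ```python
    # Back-of-the-envelope helical scan time; illustrative numbers only.
    def scan_time_s(scan_length_mm, coverage_mm, rotation_s=0.5, pitch=1.0):
        """Seconds of tube-on time to cover scan_length_mm of anatomy.
        coverage_mm = detector rows * row width (Z-axis coverage per rotation)."""
        table_feed_per_rotation_mm = coverage_mm * pitch
        rotations = scan_length_mm / table_feed_per_rotation_mm
        return rotations * rotation_s

    # 1000 mm run-off, 0.625 mm rows, same rotation time and pitch:
    four_slice = scan_time_s(1000, coverage_mm=4 * 0.625)    # ~200 s of exposure
    sixty_four = scan_time_s(1000, coverage_mm=64 * 0.625)   # ~12.5 s of exposure
    print(four_slice / sixty_four)                           # 16x less tube-on time
    ```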
  • Technology has been making some huge leaps and bounds over the past few years; it's almost hard to believe that a few years ago I was playing Ultima Online, lol. This is huge for the medical field: treating and diagnosing people without adding another problem to the list. Keep these algorithms coming! Save mankind, computers, and prove those Terminator movies wrong!
  • OpenCL || Intel add (Score:4, Interesting)

    by Massacrifice ( 249974 ) on Monday March 12, 2012 @04:36PM (#39331153)

    Sounds like a job for OpenCL. A GPU cluster would be much more scalable than using expensive Xeons. Which also makes this article sound like an add for Intel CPUs.

    • by Anonymous Coward

      Minor nit: "ad" = advertisement, "add" = addition.

    • by ndykman ( 659315 )

      Not every problem lends itself to GPGPU solutions. I'm no expert, but looking at a paper on a similar idea for optical reconstruction, I'd bet that the creation and update of the model via analysis (comparison between predicted and actual results) is really hard to make parallel, and that process has a lot of non-localized memory access.

      I'm sure you could use GPU acceleration for the CG calculations in the reconstruction phases, but I'm not sure that's the limiting factor here.

      In short, there are really goo
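      For reference, here is the split being described above, sketched as a generic textbook-style iterative loop. This is not GE/Intel's algorithm; `forward_project` and `back_project` are assumed, hypothetical callables (the toy usage swaps in a random matrix).

      ```python
      # Generic iterative (model-based) reconstruction sketch -- NOT the GE/Intel
      # method, just the plain gradient/SIRT flavour, to show which pieces are
      # embarrassingly parallel and which are not.
      import numpy as np

      def reconstruct(sinogram, forward_project, back_project, n_iters=500, step=0.001):
          image = np.zeros_like(back_project(sinogram))
          for _ in range(n_iters):
              predicted = forward_project(image)      # parallel over rays -> GPU-friendly
              residual = sinogram - predicted         # compare model prediction vs. data
              image += step * back_project(residual)  # parallel over voxels
              # With a statistical/regularized model this update becomes a CG or
              # coordinate-descent solve with non-local memory access -- the part
              # that is much harder to parallelize.
          return image

      # Toy usage with a random matrix standing in for the real projector:
      rng = np.random.default_rng(0)
      A = rng.normal(size=(64, 32))
      x_hat = reconstruct(A @ rng.normal(size=32), lambda x: A @ x, lambda y: A.T @ y)
      ```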

    • by Macman408 ( 1308925 ) on Monday March 12, 2012 @07:24PM (#39333253)

      I think this very much is an Intel ad. I was curious, because this sounded familiar, so I looked it up. From the press release and GE's white paper [gehealthcare.com], it looks like their system:
      Uses 25 mAs dose (75% less than standard, they say)
      Is ready in an hour, 100 times faster than when they started in 2006 (so 6-10x of that speedup is Moore's Law, the other 10-16x is algorithm improvement)
      Uses 28 quad-core Xeons

      On the other hand, a GPU solution [physorg.com] from 2 years ago:
      Gives a 2-4 mAs dose (97-99% less than standard, they say)
      Is ready in 1-2 minutes, 100 times faster than contemporary CPU algorithms
      Uses a single GPU

      Better, faster, cheaper... Pick three.
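      (The 6-10x vs. 10-16x split depends entirely on the CPU doubling period you assume; a quick bracket, with the doubling periods being my assumption rather than anything from the sources:)

      ```python
      # Bracketing the claimed 100x: hardware (Moore's Law) vs. algorithm, 2006-2012.
      years = 2012 - 2006
      for doubling_years in (1.8, 2.3):              # assumed doubling periods
          hardware = 2 ** (years / doubling_years)   # ~10x .. ~6x from faster CPUs
          algorithm = 100 / hardware                 # ~10x .. ~16x left for the code
          print(f"doubling every {doubling_years} yr: ~{hardware:.0f}x hardware, ~{algorithm:.0f}x algorithm")
      ```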

    • You're right. In fact, it has already been done [acceleware.com].

  • by Anonymous Coward

    How to obtain a 100x speedup: consider an architecture with 112 cores, wait for 2 CPU generations to pass, and put 3 engineers on the task for several years to parallelize the algorithm by hand. Of course, giving it to the research community would have been impossible because, you know, it may have worked faster with more general solutions.

  • by sjames ( 1099 ) on Monday March 12, 2012 @04:45PM (#39331263) Homepage Journal

    Inquiring minds want to know, since this will substantially reduce the needed resources for a scan, how much cheaper will they be?

    You can stop laughing now!

  • by djbckr ( 673156 ) on Monday March 12, 2012 @05:16PM (#39331657)
    I had a CT scan two weeks ago. I didn't fully realize until after I was done how much radiation I was exposed to. My arm where the IV was injected with radiation hurt like hell for about 18 hours and of course I now have a higher risk of cancer. There's enough radiation in the injection that it makes you feel like you're generating heat from the inside. It's quite a weird feeling. I guess it was needed for the procedure I had to have done, but here's hoping for improvements with lowered radiation exposure.
    • by Anonymous Coward

      From what I have read, you basically get a 1/400 chance of cancer over 10 years.
      Sounds like a lot or a little, depending on your view. My mother had 6 CTs.

    • by Anonymous Coward on Monday March 12, 2012 @05:51PM (#39332177)

      The heat you felt is really because the iodine gets to the thyroid provoking a thermal regulation change.

    • They don't inject radiation into you.

      Yes, I am a biologist.

      • by Anonymous Coward

        Perhaps the procedure described was a PET/CT scan. In that case, they do indeed "inject radiation into you":

        PET imaging captures features CT doesn't, but CT provides much better spatial resolution, which is why it's diagnostically advantageous to have simultaneous and coregistered subject imaging. The obvious way to achieve this is to build the two scanning apparatuses into the same device to provide both spatial and temporal locality. Whereas CT imaging provides its own signal (the emitter as well as the detector),

    • First, the "radiation" didn't cause the burning. If you were getting enough radiation to feel a 'burn', you would be losing your hair right now. To put the numbers in perspective, you received a maximum of 1-1.5 Rem (10-15 mSv) of radiation. The average yearly background exposure not including medical is ~320 mRem/year (3.2 mSv). Including medical: ~620 mRem (6.2 mSv). For a nuclear power plant worker the maximum allowed per year is 5 Rem (50 mSv). On average nuclear power plant workers get an additional 100mR
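      For perspective, the same figures lined up in one place (values as quoted above, converted to mSv):

      ```python
      # Dose comparison using the numbers quoted in the parent comment (mSv).
      doses_mSv = {
          "CT scan in question (upper bound quoted)": 15.0,
          "average annual background, non-medical":   3.2,
          "average annual background incl. medical":  6.2,
          "annual limit for nuclear plant workers":   50.0,
      }
      ct = doses_mSv["CT scan in question (upper bound quoted)"]
      for label, dose in doses_mSv.items():
          print(f"{label}: {dose:g} mSv (CT dose is {ct / dose:.1f}x this)")
      ```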
    • You seriously have no idea what the fuck you're talking about, do you. "My arm where the IV was injected with radiation..." ?! Your arm was not injected with radiation. Your arm was injected with Contrast. Were you listening to the Rad Techs at all? It went something like this "OK, I'm going to inject you with contrast now. You'll feel it spreading warmly through you. Take a deep breath...and hold it... (as the scan goes), OK, breathe."

      Oh, my qualification? I work on GE Lightspeed and Toshiba Aquil

    • A number of misunderstandings in this post and the comments:

      1. The IV injection is an iodine-containing contrast agent; it does not contain radioisotopes or emit radiation. Iodine is a heavy, and thus radiopaque, element and is used to show blood flow in the CT scan.
      2. The warmth you felt is due to a histamine and vasodilatory reaction to the IV contrast; it's got nothing to do with the thyroid. It's similar to the warming sensation you get when you have a couple of shots of alcohol, which actually causes you to cool down.

  • Seismology uses similar model-based reconstruction algorithms. The danger is that you can force the data into something that looks a lot like the input model if you are not careful. Technically, this gets stuck at a false local minimum of an optimization problem.
    • On the other hand, there's quite a lot of difference between a hand, a head, a foot, a chest...
      If the initial model used for the reconstruction is just some generic approximate shape, it will still provide some speed-up, but won't be affected that much by the actual pathology being imaged.
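      A toy version of that failure mode (purely illustrative, nothing to do with the actual GE method): as the weight on the prior/model term grows, the "reconstruction" just reproduces the model instead of the data.

      ```python
      # Toy prior-bias demo: minimize ||Ax - y||^2 + lam * ||x - x_model||^2.
      # Real MBIR priors are edge-preserving and far subtler; this is just the idea.
      import numpy as np

      rng = np.random.default_rng(0)
      A = rng.normal(size=(30, 10))      # toy system matrix
      x_true = rng.normal(size=10)       # what was actually "scanned"
      x_model = np.zeros(10)             # generic prior shape
      y = A @ x_true                     # noiseless measurements

      for lam in (0.0, 1.0, 100.0):
          x_hat = np.linalg.solve(A.T @ A + lam * np.eye(10),
                                  A.T @ y + lam * x_model)   # closed-form minimizer
          print(f"lam={lam:>5}: error vs. truth = {np.linalg.norm(x_hat - x_true):.3f}")
      ```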

  • A while back I had several full-body CT scans on an emergency basis. They found what they were looking for in my liver and it was treated. But I was forbidden to have any X-rays of any kind for two years after that. So when I came down with bronchitis and pneumonia, the doctor had to play it by ear (literally, he just listened to my chest). All is well, now. But lowering the X-ray dosage of CT scans is very worthwhile.

  • Is it really a new algorithm, or is it just that they hand-tuned the code to run iterative reconstruction quickly? There's a world of difference. There are some great algorithms out there to speed up calculation of large images where you expect them to be compressible in some basis, but from this article it looks like they didn't invent a new fancy algorithm, they just heavily optimized an existing one. Anybody have a link to a technical paper so we can find out for sure?
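    (For the curious: the "compressible in some basis" family the parent mentions is basically sparsity-regularized reconstruction. A minimal, generic sketch of that idea -- plain ISTA on a toy problem, unrelated to whatever GE actually shipped:)

    ```python
    # Minimal ISTA (iterative soft-thresholding) for min 0.5*||Ax - y||^2 + lam*||x||_1.
    import numpy as np

    def ista(A, y, lam=0.05, n_iters=500):
        step = 1.0 / np.linalg.norm(A, 2) ** 2          # 1 / Lipschitz constant of the data term
        x = np.zeros(A.shape[1])
        for _ in range(n_iters):
            z = x - step * (A.T @ (A @ x - y))          # gradient step on the data term
            x = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # shrinkage
        return x

    rng = np.random.default_rng(1)
    A = rng.normal(size=(80, 200)) / np.sqrt(80)        # underdetermined toy system
    x_true = np.zeros(200)
    x_true[rng.choice(200, size=5, replace=False)] = 1.0
    x_hat = ista(A, A @ x_true)
    print(np.linalg.norm(x_hat - x_true))               # small: the 5-sparse signal is roughly recovered
    ```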

  • This is old news: http://www.genewscenter.com/Press-Releases/GE-Healthcare-Unveils-Ultra-Low-Dose-CT-Technology-with-Profound-Image-Clarity-3367.aspx [genewscenter.com] And if you read through the Intel and GE press releases, you'll find numbers all over the map as to how much this actually decreases radiation exposure. It might be a 4x reduction (GE scientist quoted in Intel article), it might be a 10x reduction (Intel article), or it might be a 100x reduction (GE article). It might just depend on the specific scan being
  • Except for some vague references to parallelizing (allegedly) essentially single-threaded code and peephole optimization, what have they done? Where's the great advance in software that the headline promises? How are the new algorithms different from the old ones? Where's the statement of the underlying difficulties? TFA is inexcusable puffery.
