Medicine

GPUs Helping To Lower CT Scan Radiation

Posted by kdawson
from the healthy-green-glow dept.
Gwmaw writes with news out of the University of California, San Diego, on the use of GPUs to process CT scan data. Faster processing of noisy data allows doctors to lower the total radiation dose needed for a scan. "A new approach to processing X-ray data could lower by a factor of ten or more the amount of radiation patients receive during cone beam CT scans... With only 20 to 40 total number of X-ray projections and 0.1 mAs per projection, the team achieved images clear enough for image-guided radiation therapy. The reconstruction time ranged from 77 to 130 seconds on an NVIDIA Tesla C1060 GPU card, depending on the number of projections — an estimated 100 times faster than similar iterative reconstruction approaches... Compared to the currently widely used scanning protocol of about 360 projections with 0.4 mAs per projection, [the researcher] says the new processing method resulted in 36 to 72 times less radiation exposure for patients."
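The quoted reduction factor checks out with simple dose arithmetic. A rough sketch, approximating total tube output as projections × mAs per projection and ignoring any other protocol differences:

```python
# Rough dose arithmetic behind the "36 to 72 times less radiation" claim.
# Total tube output is approximated as (number of projections) x (mAs each).

conventional = 360 * 0.4          # current protocol: ~144 mAs total
new_low = 20 * 0.1                # new protocol, fewest projections: 2 mAs
new_high = 40 * 0.1               # new protocol, most projections: 4 mAs

# 144 / 4 = 36 and 144 / 2 = 72 -- the range quoted in the summary
print(round(conventional / new_high), round(conventional / new_low))  # → 36 72
```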
Comments Filter:
  • by jgtg32a (1173373) on Friday July 23, 2010 @11:03AM (#33003192)

    They said we'd use these processors for video games, not medical technology

    http://www.youtube.com/watch?v=o66twmBEMs0 [youtube.com]

    • by onionman (975962) on Friday July 23, 2010 @11:11AM (#33003290)

      It's remarkable that high performance computing is driven by video games. So, legions of PC enthusiasts and uber-gamers, I salute you for your contributions to technology! P0wn on.

      • I translated your post to "Battle on, Heroes"
      • How is that remarkable? Just about every major function of a typical computer can be done with a low-end Celeron and 2 GB of RAM. Games are about the only thing that low-spec system can't do. And gamers enjoy paying lots and lots of money for the best. If I can do everything that I do now on a $300 computer, why would I pay $800 and get a quad-core unless I'm a gamer? Yes, there are a few other areas such as CAD and the like that need high-powered systems, but in the cost-conscious world, it is gamers that
        • Re: (Score:3, Insightful)

          by toastar (573882)

          Games are about the only thing that low-spec system can't do

          I take it you've never met someone whose job it is to solve the wave equation on very large datasets.

        • by poetmatt (793785)

          wow, obvious troll much?

          Ever tried to do anything graphically intensive on a celeron with 2gb of ram? Here's a hint: it won't work.

          • by Jeng (926980)

            Graphically intensive? I think that was his point: if you do not need a graphically intensive computer, then a Celeron with 2 GB of RAM will do you.

            He wasn't trolling, but you I'm not so sure of.

            • Re: (Score:2, Informative)

              by Andy Dodd (701)

              Matlab is rarely ever graphically intensive...

              • by Jeng (926980) on Friday July 23, 2010 @12:03PM (#33003936)

                Neither is Email or internet usage.

                I'm pretty sure the comment was about general usage, which is normally just email and internet use with some office apps thrown in. That is what a Celeron with 2 GB of RAM would be sufficient for.

                Yes, there are many many programs that are used in many fields that would not fit into the celeron with 2 gigs comment. I work in an office environment, we don't need massive processors, we don't need massive video cards, all we need is a low end processor with a good amount of ram.

                That is what I got from reading his comment, but apparently I am in the minority.

              • Re: (Score:3, Interesting)

                by Score Whore (32328)

                You and a couple of others in this sub-thread are defining the problem backwards. As near as I can tell, your approach is to look at computer A and computer B and then say "B is five times faster than A, therefore I need B." The correct way is to lay out your requirements: technical, financial, and SLAs for delivery of your "product." Then identify the system you need.

                While it's nice to be able to cache gigabytes of data, the reality is that 2 GB is a fuckload of memory. Say you have a 21 MP camera

            • 2 GHz Celerons don't cut it for heavy web surfing, either. You'll probably have problems on YouTube. A low-end machine won't cut it for photo editing, or video editing, or audio editing.
              • by ooshna (1654125)
                Most people don't do heavy video, audio, or photo editing. The most that people usually do is crop and resize photos for Facebook or MySpace, and make small edits to video for YouTube. I used to do video editing on my old 466 MHz Celeron with 64 MB of RAM. Sure, it was slow, but most people don't need to edit and preview in real time.
            • by poetmatt (793785)

              maybe you want to look at what happens if you try to dual or tri-screen on an integrated graphics card. Hint: doesn't go well even at moderate resolutions.

              My point was that what people think of as general work isn't always CPU-focused. Some of the time? Sure. Most of the time? I wouldn't say so.

      • by jandrese (485)
        Well, the market for actual high performance computers is way too small to fund the R&D necessary to build those crazy GPUs. The high performance computing folks should thank their lucky stars that games went in a direction that required more and more processing power (well in excess of what CPUs can provide) and that the GPU companies didn't decide to just leave them out in the cold. There have already been stories of some jagoff putting a few GPUs in a box and outperforming million dollar supercompt
        • I'm about to get a pair of nVidia C2050s, so I'm really getting a kick out of being one of those jagoffs.

          Suck it, Poisson equation. Suck all 16 million cells in under 90 seconds (under 10 once the 2050s arrive).
    • I miss 3DFX. Definitely my life's second technology love crush, after I got over Sony. My first video card setup was a trident and 2 Voodoo IIs in TRUE SLI.
  • by Chas (5144)

    But I want that mutation in the "rage center" of my brain!

    The idea is I turn into this huge green, angry thing (currently all I'm lacking is the green pigmentation).

    Then it's BULK SMASH!

    • by maxume (22995)

      If you promise to do a bunch of hilarious stuff and also wear it at trial, I'll buy you some paint.

    • by ByOhTek (1181381)

      Well, would you accept an external supplement instead?

      ATI RAGE (appropriately named) cards had a similar effect on me about 8-10 years ago.

  • And start paying developers to make things in OpenCL instead of CUDA, or they're going to be quickly left behind.

    • by etherway (1842902)

      They've already been left behind, or else they would not have to pay developers to not use CUDA. Also, nVidia has better OpenCL support than ATI in terms of performance and stability despite the fact that it's not their first choice language for GPU development (obviously).

      What ATI actually needs to do is stop treating software development like some minor aspect of their GPU production that can be haphazardly tossed together. They have much, much better hardware than nVidia on paper and yet they are merel

  • by iPhr0stByt3 (1278060) on Friday July 23, 2010 @11:17AM (#33003360)
    So, they pump in all that radiation because the processor is too slow? Doesn't seem right to me. I would think if they could have simply put another $10,000 into the machine (adding CPU cycles) to lower the required radiation, they would have done that a long time ago. So is the use of a GPU just a side effect of some new technology that allows the machine to estimate or predict the image with a lower radiation dose? That GPUs are more efficient for some operations is nothing new, so what's the real breakthrough here?
    • Pretty much.

      The reconstruction time ranged from 77 to 130 seconds on an NVIDIA Tesla C1060 GPU card, depending on the number of projections, an estimated 100 times faster than similar iterative reconstruction approaches, says Jia.

      So in essence they have built a parallel-optimised calculation system rather than an iterative one, and we all know the one thing CUDA and OpenCL do VERY well is parallel processing.

      It seems the real win here is the new code: it could run on a TI-82 calculator and still require only that level of radiation. It's just that it's very well suited to GPUs to crunch.

    • by FTWinston (1332785) on Friday July 23, 2010 @11:26AM (#33003482) Homepage
      The TFA says that this tech is usually used prior to treatment, while the patient is in the treatment position.
      Because processing a limited number of scans into a useful model previously took several hours, they were forced to perform many more scans to get a more accurate picture with which to build their model - because they don't want to leave the patient lying in the scanner for 6 hours prior to treatment.
      With this improvement in processing power, they can produce the model from limited data in a feasible time.

      So the summary does actually describe the breakthrough quite well: it's not a new image processing technique for working with limited data, it's just new hardware allowing that process to run much more quickly. Yes, they're using a slightly new algorithm, but I doubt that is a massive breakthrough in itself.
      • The TFA

        :(

      • by john83 (923470)
        I think it's being driven by recent work which suggests risks associated with the scans are a bit higher than previously thought. There's a perceived medical need to reduce the radiation. I'm afraid I can't put my finger on a citation though.
        • I have to imagine that there are all kinds of people working on software and hardware upgrades all over medical science/engineering. Decreasing the risk to patients might be a nice reason to upgrade these scanners in particular, but you sorta sound like 'if it wasn't for the risk to the patients, this upgrade wouldn't be needed anytime soon.'

          Engineers want to make better products, both to contribute and to make sales. Doctors want better products, both to decrease risk and to make their work easier and mo

      • by Achra (846023)

        Because processing a limited number of scans into a useful model previously took several hours, they were forced to perform many more scans to get a more accurate picture with which to build their model - because they don't want to leave the patient lying in the scanner for 6 hours prior to treatment. With this improvement in processing power, they can produce the model from limited data in a feasible time.

        Good lord. Am I the only one who is terrified by the idea that they take several scans, try to come up with a vague model of how your organs tend to move, and then fire a rather large dose of ionizing radiation at their best guess? I was under the misunderstanding that image-guided radiation therapy was somewhat real-time up until now.

    • Re: (Score:2, Interesting)

      by Anonymous Coward

      The real breakthrough is the development of Compressed Sensing/Compressive Sampling algorithms; this is just an application.

    • by guruevi (827432)

      At some point the number of processors becomes insignificant because of the overhead and costs it introduces. A Tesla C1060 costs ~$700 for these types of projects and has 240 processors designed to efficiently process this type of data; compare that to the cost and maintenance of the half-rack cluster of generic processors this would otherwise take.

    • Re: (Score:3, Interesting)

      by jandrese (485)
      My guess is that each scan requires a considerable amount of processing to render into something we can read on the screen. Probably billions of FFTs or something. You can make a tradeoff between more radiation (cleaner signal) and more math, but previously you would have needed a million dollar supercomputer to do what you can do with $10k worth of GPUs these days, which is how they're saving on radiation.
    • Re: (Score:3, Interesting)

      by Zironic (1112127)

      What's going on is that instead of taking a clear picture, they take a crappy picture and have the ludicrously fast GPU clean it up for them. While you could have done that by just putting 50 CPUs in parallel, the GPU makes it quite simple.

      The speed is important because their imaging is iterative: with the GPU they're apparently waiting 1-2 minutes, while without it the same job takes 2-3 hours, which is a rather long time to wait between scans.

      • Re: (Score:2, Informative)

        by Anonymous Coward

        The technique is called iterative backprojection. The reconstruction process assumes an array of pixels which, at the beginning, are of some uniform value. It then looks at a ray of attenuation data from the CT projection (along this ray, the tissues in the target result in this degree of attenuation of the xray beam), and asks "how must the pixels along this ray be adjusted, so that their attenuation along the ray matches the data from the CT beam?". It does this for every measured ray taken during the acq
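        The adjust-along-each-ray loop described above can be sketched in a few lines. This is a toy illustration of the cyclic update scheme (often called ART or Kaczmarz iteration), not the paper's actual code; the 2x2 image, the four axis-aligned rays, and all values are invented:

        ```python
        import numpy as np

        # Toy iterative reconstruction (ART / cyclic Kaczmarz).
        # A[i, j] = intersection of ray i with pixel j; here each of the
        # four rays simply passes through one row or column of a 2x2 image.
        A = np.array([
            [1.0, 1.0, 0.0, 0.0],   # ray through the top row
            [0.0, 0.0, 1.0, 1.0],   # ray through the bottom row
            [1.0, 0.0, 1.0, 0.0],   # ray through the left column
            [0.0, 1.0, 0.0, 1.0],   # ray through the right column
        ])
        true_image = np.array([1.0, 2.0, 3.0, 4.0])
        b = A @ true_image           # simulated attenuation measurements

        x = np.zeros(4)              # start from a uniform image
        for _ in range(200):         # sweep over all measured rays, repeatedly
            for i in range(len(b)):
                a = A[i]
                # adjust the pixels along ray i so their sum matches b[i]
                x += a * (b[i] - a @ x) / (a @ a)

        print(np.round(x, 3))        # → [1. 2. 3. 4.]
        ```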

    • by alen (225700)

      This is expensive medical equipment. The costs are in the approval process and sales commissions, not in the cost of the hardware.

  • Neat. Does this also reduce the running costs of the machines, or would that be a negligible benefit compared to not irradiating your patients?

    • by Zironic (1112127)

      Well, fewer scans should translate into less power usage, less doctor time, and less machine time, which should mean a lower cost per patient. It's hard to tell how significant that is, though.

    • by russotto (537200)

      Neat. Does this also reduce the running costs of the machines, or would that be a negligible benefit compared to not irradiating your patients?

      From the point of view of the hospital? It's the other way around; increasing the lifetime of the expensive X-ray tube (which this will indeed do) is the important benefit, and not irradiating your patients as much is just a side effect.

      • Neat. Does this also reduce the running costs of the machines, or would that be a negligible benefit compared to not irradiating your patients?

        From the point of view of the hospital? It's the other way around; increasing the lifetime of the expensive X-ray tube (which this will indeed do) is the important benefit, and not irradiating your patients as much is just a side effect.

        Certainly not from the perspective of a physician. I continually bear in mind the cancer risk for CT scans that I order....the problem is that what I'm scanning for is an immediate threat to life, so I have to take a long term potential risk to offset a more immediate, more probable, and higher risk.

        As for saving time... it is negligible... most new scanners (64 slice and up) process the images as quickly as the machine can scan. And even if there is a delay (e.g. 16 slice machines) most scans are put into

  • lower rad dose (Score:4, Informative)

    by SemperUbi (673908) on Friday July 23, 2010 @11:37AM (#33003624)
    CT scanning is associated with an increased risk of cancer in children [nih.gov]. This development will significantly lower that risk.
    • by jabuzz (182671)

      Any X-ray imaging protocol is associated with an increased risk of cancer in everyone. From memory, I believe it is around 1 extra death per 1.3 million chest X-rays, for example.

    • This development will significantly lower that risk.

      Eventually it might. The exact technique they are using is for planning a radiation _treatment_ (cone beam CT), not a _diagnostic_ (helical scan) CT. The researchers are quoted at the bottom saying that it _might_ be applicable. There are probably 100 to 1000 diagnostic scans for every treatment protocol.

      CT dose has become a major concern of the medical community. For each year's use of today's scanning technology, the resulting cancers could cause about 14,500 deaths. "Our work, when extended from cancer radiotherapy to general diagnostic imaging, may provide a unique solution to solve this problem by reducing the CT dose per scan by a factor of 10 or more," says Jiang.

      There are currently protocols that are used to lower the radiation dose for pediatric patients... the problem is that not all hospitals use them. Except in a life-threatening emergency, the parents should ask before a routine/

    • CT scanning is associated with an increased risk of cancer in children [nih.gov]. This development will significantly lower that risk.

      As a physics engineer experienced in the field of radiotherapy, familiar with the techniques mentioned in the /. article, and certified in radiation safety, I am sorry to say that although the radiation dose is reduced, it is only reduced in very specific cases, where it is actually not a real benefit. This technique is not used for the normal CT scanning used for diagnosis in your average hospital.
      This technique is used for radiotherapy (and mainly for position verification of the organ to be irradiate

  • context (Score:1, Insightful)

    by Anonymous Coward

    These patients are about to get RADIATION THERAPY. This CT scan will be delivered immediately before they are to receive a lethal radiation dose at the same location to kill their tumor. Reduction of dose in diagnostic CT (not cone-beam) is a much more valuable accomplishment.

    • These patients are about to get RADIATION THERAPY. This CT scan will be delivered immediately before they are to receive a lethal radiation dose at the same location to kill their tumor. Reduction of dose in diagnostic CT (not cone-beam) is a much more valuable accomplishment.

      LOL...if it is a _lethal_ dose, why treat the patient?

      They are going to get a _therapeutic_ dose of directed radiation to target a specific tumor bed. The reduction in the imaging scan portion will lower _total_body_ dosing.

      Not all body tissues deal with radiation the same way. Thyroid and small bowel mucosa are the most radio-sensitive tissues, while areas like bone and muscle are much more tolerant...If you can avoid thyroid cancer or radiation enteritis, you'll have or be a much happier patient.

    • Re: (Score:3, Informative)

      by budgenator (254554)

      "Our work, when extended from cancer radiotherapy to general diagnostic imaging, may provide a unique solution to solve this problem by reducing the CT dose per scan by a factor of 10 or more," says Jiang.
      It's probably applicable to diagnostic cone beam scans, which are the hot item in implant dentistry. The reason it's first applied to therapy scans is that the tissue surrounding the tumor suffers radiation from scattering of the therapeutic beam, making dosage reduction highly desirable.

  • This must be the first time anything associated with Tesla reduced radiation exposure....

  • Yeah yeah.. that's all fine and dandy, but how many FPS does it get in Crysis on ultra settings? (heh-heh)
  • Does this now mean we can get more points on BOINC for finding ET?
  • by HuguesT (84078) on Saturday July 24, 2010 @11:06AM (#33013516)

    As has been said elsewhere in this thread, the real breakthrough here is due to compressed sensing, but here is some extra information:

    1- Compressed sensing basically uses the idea that it is not necessary to sample an image (or a projection, in this case) everywhere, because natural data is fairly redundant. This is why you can capture a 10 Mpixel image in a digital camera and have it compressed to a 2 MByte JPEG file without losing much visible information. Compressed sensing basically does the compression *before* the sampling and not after. Researchers at Rice University, for instance, built a working one-pixel camera [rice.edu] using this brilliant principle.

    2- Compressed (or compressive) sensing was proposed by Emmanuel Candes [stanford.edu] and Terence Tao [ucla.edu] respectively at Stanford and UCLA. Tao is a recent Fields medalist. I recommend reading his blog if you like mathematics.

    3- This field is really less than 10 years old, and it has completely turned classical ideas about sampling-limited signal processing (Nyquist, Shannon, etc.) on their head. It is a brilliant combination of signal and image processing with recent advances in combinatorial and convex optimization.

    4- However, this is only the beginning. Because compression happens before sampling, you need to make so-called sparsity assumptions about the signal; in other words, you need to know a great deal about what you are going to try to image. In interventional therapy, precise imaging of the patient is made beforehand in a classical way (CT or MRI), and this kind of technique is only used to make fine adjustments as therapy is ongoing. This is extremely useful and safe because of lower radiation output and because the physicians know what to expect.

    5- Here the GPU is useful because it makes the processing fast enough to actually be used. It is an essential building block in the application, but of course not in the theory.
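    To make point 1 concrete, here is a minimal sparse-recovery sketch: iterative soft-thresholding (ISTA) on random Gaussian measurements. The dimensions, seed, step size, and threshold are all invented for illustration and have nothing to do with the paper's actual solver:

    ```python
    import numpy as np

    # Compressed sensing toy: recover a 3-sparse, length-100 signal from
    # only 40 random linear measurements by solving a lasso problem with ISTA.
    rng = np.random.default_rng(0)
    n, m, k = 100, 40, 3

    x_true = np.zeros(n)
    x_true[rng.choice(n, k, replace=False)] = [3.0, -2.0, 1.5]

    A = rng.standard_normal((m, n)) / np.sqrt(m)   # random sensing matrix
    y = A @ x_true                                 # m << n measurements

    x = np.zeros(n)
    step = 1.0 / np.linalg.norm(A, 2) ** 2         # safe gradient step size
    lam = 1e-3                                     # sparsity (L1) weight
    for _ in range(5000):
        x = x - step * (A.T @ (A @ x - y))         # gradient step on ||Ax-y||^2
        x = np.sign(x) * np.maximum(np.abs(x) - step * lam, 0.0)  # soft-threshold

    print(np.linalg.norm(x - x_true))              # small: signal recovered
    ```

    The point of the toy is exactly the thread's point: recovery is only possible because we assumed the signal is sparse, which is far fewer unknowns than its raw length suggests.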

    Best.
