GPUs Helping To Lower CT Scan Radiation 77
Gwmaw writes with news out of the University of California, San Diego, on the use of GPUs to process CT scan data. Faster processing of noisy data allows doctors to lower the total radiation dose needed for a scan. "A new approach to processing X-ray data could lower by a factor of ten or more the amount of radiation patients receive during cone beam CT scans... With only 20 to 40 total X-ray projections and 0.1 mAs per projection, the team achieved images clear enough for image-guided radiation therapy. The reconstruction time ranged from 77 to 130 seconds on an NVIDIA Tesla C1060 GPU card, depending on the number of projections — an estimated 100 times faster than similar iterative reconstruction approaches... Compared to the currently widely used scanning protocol of about 360 projections with 0.4 mAs per projection, [the researcher] says the new processing method resulted in 36 to 72 times less radiation exposure for patients."
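The quoted reduction factors check out arithmetically if total tube output (projections × mAs per projection) is taken as a rough proxy for patient exposure — an assumption for this back-of-the-envelope sketch, not the paper's actual dosimetry:

```python
# Back-of-the-envelope check of the dose figures quoted above, using
# total tube output (projections x mAs per projection) as a rough proxy
# for patient exposure.
conventional = 360 * 0.4   # standard protocol: 360 projections at 0.4 mAs
new_low = 20 * 0.1         # new protocol, fewest projections
new_high = 40 * 0.1        # new protocol, most projections

print(conventional / new_high)  # 36.0 -> "36 times less exposure"
print(conventional / new_low)   # 72.0 -> "72 times less exposure"
```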
Voodoo lied to us (Score:3, Funny)
They said we'd use these processors for video games, not medical technology
http://www.youtube.com/watch?v=o66twmBEMs0 [youtube.com]
Funny what drives the HPC market... (Score:4, Insightful)
It's remarkable that high performance computing is driven by video games. So, legions of PC enthusiasts and uber-gamers, I salute you for your contributions to technology! P0wn on.
Re: (Score:2)
Re: (Score:1)
Re: (Score:3, Insightful)
Games are about the only thing that low-spec system can't do
I take it you've never met someone whose job it is to solve the wave equation on very large datasets.
Re:Funny what drives the HPC market... (Score:5, Funny)
Re: (Score:3, Insightful)
That sounds fun. Is it available on Steam?
Re: (Score:2)
Yup,
store.steampowered.com/app/8500
Re: (Score:2)
wow, obvious troll much?
Ever tried to do anything graphically intensive on a celeron with 2gb of ram? Here's a hint: it won't work.
Re: (Score:2)
Graphically intensive? I think that was his point: if you don't need a graphically intensive computer, then a Celeron with 2 GB of RAM will do you.
He wasn't trolling, but you I'm not so sure of.
Re: (Score:2, Informative)
Matlab is rarely ever graphically intensive...
Re:Funny what drives the HPC market... (Score:5, Insightful)
Neither is email or internet usage.
I'm pretty sure the comment was about general usage, which is normally just email and internet with some office apps thrown in. That is what a Celeron with 2 GB of RAM would be sufficient for.
Yes, there are many, many programs used in many fields that would not fit into the Celeron-with-2-GB comment. I work in an office environment; we don't need massive processors, we don't need massive video cards, all we need is a low-end processor with a good amount of RAM.
That is what I got from reading his comment, but apparently I am in the minority.
Re: (Score:3, Interesting)
You and a couple of others in this sub-thread are defining the problem backwards. As near as I can tell, your approach is to look at computer A and computer B and then to say "B is five times faster than A, therefore I need B." The correct way is to lay out your requirements: technical, financial, and SLAs for delivery of your "product." Then identify the system you need.
While it's nice to be able to cache gigabytes of data, the reality is that 2 GB is a fuckload of memory. Say you have a 21 MP camera
Re: (Score:2)
I've done plenty of graphic design, I just don't use crappy tools. If your tool requires a full-size copy of what the image was before every single change, then your tool is hopelessly naive in its implementation.
If you're going to refute my claim, then refute my claim. All you did was say "In my opinion
Re: (Score:2)
Re: (Score:1)
Re: (Score:2)
maybe you want to look at what happens if you try to dual or tri-screen on an integrated graphics card. Hint: doesn't go well even at moderate resolutions.
My point was that what people think of as general work isn't always CPU-focused. Some of the time? Sure. Most of the time? I wouldn't say so.
Re: (Score:2)
Re: (Score:2)
Suck it, Poisson equation. Suck all 16 million cells in under 90 seconds (under 10 once the 2050s arrive).
Re: (Score:2)
NOOOOOO! (Score:2)
But I want that mutation in the "rage center" of my brain!
The idea is I turn into this huge green, angry thing (currently all I'm lacking is the green pigmentation).
Then it's BULK SMASH!
Re: (Score:1)
If you promise to do a bunch of hilarious stuff and also wear it at trial, I'll buy you some paint.
Re: (Score:2)
Well, would you accept an external supplement instead?
ATI RAGE (appropriately named) cards had a similar effect on me about 8-10 years ago.
ATI needs to get off its ass (Score:2)
And start paying developers to make things in OpenCL instead of CUDA, or they're going to be quickly left behind.
Re: (Score:1)
They've already been left behind, or else they would not have to pay developers to not use CUDA. Also, nVidia has better OpenCL support than ATI in terms of performance and stability despite the fact that it's not their first choice language for GPU development (obviously).
What ATI actually needs to do is stop treating software development like some minor aspect of their GPU production that can be haphazardly tossed together. They have much, much better hardware than nVidia on paper and yet they are merel
CPU speed determines req. radiation amount? (Score:5, Insightful)
Re:CPU speed determines req. radiation amount? (Score:4, Interesting)
Pretty much.
So in essence they have built a parallel optimised calculation system rather than an iterative one, and we all know the one thing CUDA and OpenCL do VERY well is parallel processing.
It seems the real win here is the new code; it could run on a TI-82 calculator and still require only that level of radiation. It's just that it's very well suited for a GPU to crunch.
Re:CPU speed determines req. radiation amount? (Score:5, Informative)
Because processing a limited number of scans into a useful model previously took several hours, they were forced to perform many more scans to get a more accurate picture with which to build their model - because they don't want to leave the patient lying in the scanner for 6 hours prior to treatment.
With this improvement in processing power, they can produce the model from limited data in a feasible time.
So the summary does actually describe the breakthrough quite well: It's not a new image processing technique for working with limited data, it's just new hardware allowing that process to be run in a quicker way. Yes they're using a slightly new algorithm, but I doubt that is a massive breakthrough in itself.
Re: (Score:2)
The TFA
:(
Re: (Score:2)
Re: (Score:2)
I have to imagine that there are all kinds of people working on software and hardware upgrades all over medical science/engineering. Decreasing the risk to patients might be a nice reason to upgrade these scanners in particular, but you sorta sound like 'if it wasn't for the risk to the patients, this upgrade wouldn't be needed anytime soon.'
Engineers want to make better products, both to contribute and to make sales. Doctors want better products, both to decrease risk and to make their work easier and mo
Re: (Score:2)
Ah, interpolation, a.k.a. making up data. This doesn't seem like a brilliant idea for purposes where accuracy is important.
I do acknowledge however that if your bullet is 10 mm in diameter and your target is 5 mm in diameter, you probably don't need a precise surface map of the target as long as you know where it's at within three or four mm.
Re: (Score:2)
Ah, interpolation, a.k.a. making up data. This doesn't seem like a brilliant idea for purposes where accuracy is important.
Doing the scan quickly and then filling in the missing data computationally is becoming better than doing the scan slowly, because of movement. People cannot remain perfectly still (breathing, etc.), so if you do the scan more quickly, you get less motion and less blurring.
Re: (Score:2)
Re: (Score:2, Interesting)
The real breakthrough is the development of Compressed Sensing/Compressive Sampling algorithms; this is just an application.
Re: (Score:2)
At some point the number of processors becomes insignificant because of the overhead and costs it introduces. A Tesla C1060 costs ~$700 for these types of projects and has 240 processors designed to efficiently process this type of data; compare this to the cost and maintenance of the half-rack cluster of generic processors this would take.
Re: (Score:3, Interesting)
Re: (Score:3, Interesting)
What's going on is that instead of taking a clear picture they take a crappy picture and have the ludicrously fast GPU clean it up for them. While you could have done that by just putting 50 CPUs in parallel, the GPU makes it quite simple.
The speed is important because their imaging is iterative. With the GPU they're apparently waiting 1-2 minutes; without the GPU it takes them 2-3 hours, which is a rather long time to wait between scans.
Re: (Score:2, Informative)
The technique is called iterative backprojection. The reconstruction process assumes an array of pixels which, at the beginning, are of some uniform value. It then looks at a ray of attenuation data from the CT projection (along this ray, the tissues in the target result in this degree of attenuation of the xray beam), and asks "how must the pixels along this ray be adjusted, so that their attenuation along the ray matches the data from the CT beam?". It does this for every measured ray taken during the acq
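The adjust-the-pixels-along-each-ray loop described above can be sketched in a few lines. This is my own toy ART-style example on a 2x2 image with rays along rows and columns only — illustrative of the idea, not the actual reconstruction code:

```python
import numpy as np

# Toy iterative reconstruction: the true 2x2 "patient" we want to recover.
true_img = np.array([[1.0, 2.0],
                     [3.0, 4.0]])

# Each ray passes through two pixels: the two rows, then the two columns.
rays = [((0, 0), (0, 1)),   # row 0
        ((1, 0), (1, 1)),   # row 1
        ((0, 0), (1, 0)),   # column 0
        ((0, 1), (1, 1))]   # column 1
measured = [sum(true_img[p] for p in ray) for ray in rays]  # the "CT data"

img = np.zeros_like(true_img)        # start from a uniform guess
for _ in range(10):                  # sweep over all rays repeatedly
    for ray, m in zip(rays, measured):
        mismatch = m - sum(img[p] for p in ray)
        for p in ray:                # spread the mismatch evenly along the ray
            img[p] += mismatch / len(ray)

print(img)   # converges back to the true image [[1. 2.] [3. 4.]]
```

Real scanners run the same kind of update over millions of pixels and hundreds of projection angles, which is exactly the part the GPU parallelizes.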
Re: (Score:2)
this is expensive medical equipment. the costs are in the approval process and sales commissions. not in the cost of the hardware
Re: (Score:2)
They (NVIDIA) say that you could have a very cheap supercomputer for just $10k, made with NVIDIA GPUs only. A pretty impressive achievement, and btw they also say that their GPUs are in fact faster than normal Intel/AMD CPUs. I don't know about you, but once my piggy bank is full, I will get one of these supercomputer monsters.
more like 20-40k
Each 4-GPU node costs us about $5k; the thing is, you can do with a 4-node GPU cluster what would normally take 50-100 CPUs, or about 10-15 nodes.
Re: (Score:2)
Keep in mind that some of the operations you have to do, like data sorts and other stuff I class as overhead, don't get a speedup from the GPUs.
Re: (Score:2)
Electricity (Score:1)
Neat. Does this also reduce the running costs of the machines, or would that be a negligible benefit compared to not irradiating your patients?
Re: (Score:2)
Well, fewer scans should translate into less power usage, less doctor time, and less machine time, which should mean a lower cost per patient. How significant that is is hard to tell, though.
Re: (Score:2)
From the point of view of the hospital? It's the other way around; increasing the lifetime of the expensive X-ray tube (which this will indeed do) is the important benefit, and not irradiating your patients as much is just a side effect.
Re: (Score:1)
From the point of view of the hospital? It's the other way around; increasing the lifetime of the expensive X-ray tube (which this will indeed do) is the important benefit, and not irradiating your patients as much is just a side effect.
Certainly not from the perspective of a physician. I continually bear in mind the cancer risk for CT scans that I order... the problem is that what I'm scanning for is an immediate threat to life, so I have to take a long-term potential risk to offset a more immediate, more probable, and higher risk.
As for saving time... it is negligible. Most new scanners (64 slice and up) process the images as quickly as the machine can scan. And even if there is a delay (e.g. 16 slice machines) most scans are put into
lower rad dose (Score:4, Informative)
Re: (Score:2)
Any X-ray imaging protocol is associated with an increased risk of cancer in everyone. From memory I believe it is around 1 extra death per 1.3 million chest X-rays, for example.
Re: (Score:1)
This development will significantly lower that risk.
Eventually it might. The exact technique they are using is for planning a radiation _treatment_ (cone beam CT), not a _diagnostic_ (helical scan) CT. They are quoted at the bottom that it _might_ be applicable. There are probably 100 to 1000 diagnostic scans for every treatment protocol.
"CT dose has become a major concern of the medical community. For each year's use of today's scanning technology, the resulting cancers could cause about 14,500 deaths. Our work, when extended from cancer radiotherapy to general diagnostic imaging, may provide a unique solution to this problem by reducing the CT dose per scan by a factor of 10 or more," says Jiang.
There are currently protocols that are used to lower the radiation dose for pediatric patients... the problem is that not all hospitals use them. Except in a life-threatening emergency, the parents should ask before a routine/
Re: (Score:1)
CT scanning is associated with an increased risk of cancer in children [nih.gov]. This development will significantly lower that risk.
As a physics engineer experienced in the field of radiotherapy, familiar with the techniques mentioned in the /. article, and certified in radiation safety, I am sorry to say that although the radiation dose is reduced, it is only reduced in very specific cases, where it is actually not a real benefit.
This technique is not used for normal CT scanning, used to diagnose in your average hospital.
This technique is used for radiotherapy (and mainly for position verification of the organ to be irradiate
Re: (Score:1)
To put the received cone beam CT dose in perspective: the biological dose received from one such CT scan [springerlink.com] is about as high as from a few hours of long-haul flight (considering the effective dose received per hour as stated by BA [britishairways.com]).
Oops, strike that. Mixing up orders of magnitude. It should read a few hundred hours of long-haul flight. :blush:
context (Score:1, Insightful)
These patients are about to get RADIATION THERAPY. This CT scan will be delivered immediately before they are to receive a lethal radiation dose at the same location to kill their tumor. Reduction of dose in diagnostic CT (not cone-beam) is a much more valuable accomplishment.
Re: (Score:1)
These patients are about to get RADIATION THERAPY. This CT scan will be delivered immediately before they are to receive a lethal radiation dose at the same location to kill their tumor. Reduction of dose in diagnostic CT (not cone-beam) is a much more valuable accomplishment.
LOL...if it is a _lethal_ dose, why treat the patient?
They are going to get a _therapeutic_ dose of directed radiation to target a specific tumor bed. The reduction in the imaging scan portion will lower _total_body_ dosing.
Not all body tissues deal with radiation the same way. Thyroid and small bowel mucosa are the most radio-sensitive tissues, while areas like bone and muscle are much more tolerant...If you can avoid thyroid cancer or radiation enteritis, you'll have or be a much happier patient.
Re: (Score:3, Informative)
"Our work, when extended from cancer radiotherapy to general diagnostic imaging, may provide a unique solution to solve this problem by reducing the CT dose per scan by a factor of 10 or more," says Jiang.
It's probably applicable to diagnostic cone beam scans, which are the hot item in implant dentistry. The reason it's first applied to therapy scans is because the tissue surrounding the tumor suffers radiation from scattering of the therapeutic beam, making dosage reduction highly desirable.
Tesla? (Score:2)
This must be the first time anything associated with Tesla reduced radiation exposure....
Crysis (Score:1)
SETI (Score:1)
The real hero here is compressed sensing (Score:3, Informative)
As has been said elsewhere in this thread, the real breakthrough here is due to compressed sensing, but here is some extra information:
1- Compressed sensing basically uses the idea that it is not necessary to sample an image (or a projection in this case) everywhere, because natural data is fairly redundant. This is why you can capture a 10 Mpixel image in a digital camera and have it compressed to a 2 Mbyte JPEG file without losing much visible information. Compressed sensing basically does the compression *before* the sampling and not after. Researchers at Rice University, for instance, built a working one-pixel camera [rice.edu] using this brilliant principle.
2- Compressed (or compressive) sensing was proposed by Emmanuel Candes [stanford.edu] and Terence Tao [ucla.edu] respectively at Stanford and UCLA. Tao is a recent Fields medalist. I recommend reading his blog if you like mathematics.
3- This field is really less than 10 years old, and it has completely turned classical ideas about sampling-limited signal processing (Nyquist, Shannon, etc.) on their head. It is a brilliant combination of signal and image processing with recent advances in combinatorial and convex optimization.
4- However, this is only the beginning. Because compression happens before sampling, you need to make so-called sparsity assumptions about the signal; in other words, you need to know a great deal about what you are going to try to image. In interventional therapy, precise imaging of the patient is made beforehand in a classical way (CT or MRI), and this kind of technique is only used to make fine adjustments as therapy is ongoing. This is extremely useful and safe because of the lower radiation output and because the physicians know what to expect.
5- Here the GPU is useful because it makes the processing fast enough to actually be used. It is an essential brick in the application, but of course not in the theory.
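To make points 1 and 4 concrete, here is a minimal, entirely hypothetical sketch (my own toy, not the UCSD implementation): recovering a 3-sparse signal of length 100 from only 40 random measurements via iterative soft-thresholding (ISTA), which works only because of the sparsity assumption:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 100, 40                          # 100 unknowns, only 40 measurements
x_true = np.zeros(n)
x_true[[5, 37, 62]] = [3.0, -2.0, 1.5]  # sparse: just 3 nonzero entries

A = rng.standard_normal((m, n)) / np.sqrt(m)  # random sensing matrix
y = A @ x_true                                # the undersampled "projections"

x = np.zeros(n)
step, lam = 0.1, 0.05                   # step size and sparsity penalty
for _ in range(2000):
    z = x - step * (A.T @ (A @ x - y))  # gradient step on 0.5*||Ax - y||^2
    x = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # soft-threshold

print(np.flatnonzero(np.abs(x) > 0.5))  # should find the spikes at 5, 37, 62
```

The system is badly underdetermined (40 equations, 100 unknowns), yet the sparsity prior picks out the right solution; that is the same principle that lets a CT image be reconstructed from far fewer projections.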
Best.
Most informative post ever! (Score:2)