3-D Software for 'Virtual Surgery'
Roland Piquepaille writes "Computer scientists at Brigham Young University (BYU) have developed a new software tool for 'virtual surgery.' The tool, dubbed 'Live Surface,' lets surgeons visualize any part of a patient's anatomy in 3-D with just a few clicks of a mouse. Similar software already exists, but according to the Deseret Morning News, Live Surface is both interactive and fast. Physicians could use it for better diagnosis, and it might even eliminate the need for some exploratory surgeries. The researchers add that Live Surface might even be used for special effects in movies or games by extracting an actor's performance from a video clip."
A pioneer (Score:5, Funny)
Re:A pioneer (Score:4, Funny)
But if you want, we can still trade emails?
Re:A pioneer (Score:4, Funny)
Re:A pioneer (Score:2)
Yeah, it's us. (Score:4, Informative)
We have a laparoscopic surgery simulator for a mere $40k that will totally blow your mind. You can learn to stitch, tie knots, remove gall bladders, the works.
Re:Yeah, it's us. (Score:1)
Re:Yeah, it's us. (Score:1)
So far today... (Score:4, Funny)
I want my freakin' holodeck!
Re:So far today... (Score:2)
It seems to be inevitable.
So we'll soon have photoshop for the body. (Score:4, Funny)
We can only imagine, however, what the clone tool will be used for.
Re:So we'll soon have photoshop for the body. (Score:2)
But will it run Linux? (Score:2)
Somebody research it for me. It's academic, so it has potential???
Integrate into Doom 3 (Score:2)
If only... (Score:2, Interesting)
Re:If only... (Score:2)
Ha ha, conspiracy theory. Great stuff.
They have the technology. I saw a demo of it in class and had the segmentation algorithm explained to me. Yes, I'm a graduate student at BYU.
Nice try baiting the moderators, though.
well now, (Score:2)
sure sounds like progress [imdb.com] to me...
Machine shop for the body? (Score:4, Interesting)
It all sounds so nice and efficient, but I can see so many things where this could go horribly wrong. I for one will be sticking with the over-worked, stim-taking resident who will be standing by my body. I don't feel comfortable with the medical industry moving in the same direction as the car manufacturing industry.
Re:Machine shop for the body? (Score:2)
Re:Machine shop for the body? (Score:2)
In any case, I don't see how it can make things much worse. It's not like people have gone in to have the left leg amputated and woken up with the right one gone. See one story here [telemed.org]
Re:Machine shop for the body? (Score:1)
Re:Machine shop for the body? (Score:2)
Not too far from the truth... (Score:2)
When my wife had a brain biopsy for her GBM, they did indeed get an MRI image and fed it into a computer in the operating room. The computer generated a 3-D image from the scan and aligned it to her head. The image was accurate to 0.06mm (I believe) and could even generate views "looking through the needle" so the Dr. (actually in the room) could avoid b
BYU press release link with more media and info (Score:5, Informative)
Re:BYU press release link with more media and info (Score:5, Insightful)
Having taken a vision class from Dr. Barrett (CS 750 at BYU), I can fill in some details. I might be able to dig up the paper later. I think you can find it in the latest SIGGRAPH proceedings - dunno if Citeseer has indexed it yet.
It's a segmentation algorithm that works well and fast in 3D images. It uses a graph-cut algorithm to classify voxels as inside or outside whatever you're trying to isolate. You (the doctor) lay down "seed" voxels with a mouse, clicky-clicky, and a few seconds later, the algorithm has isolated the structure. For example, say you want to isolate bone. Hold down the mouse button and move it over the bone. Hold down the other and move it over non-bone. If the algorithm makes a mistake, make some more seed voxels.
This is nothing new so far - the CV folks have been segmenting with graph-cut for ages. The problem is that it's very, very slow - minutes for a single segmentation. Barrett and Armstrong have developed a hierarchical version of the algorithm that uses watershed regions to presegment, and merges them as it runs. Doing graph-cut on large regions is a lot faster than doing it on single voxels. Their stuff is the first interactive-speed seeded 3D segmentation algorithm that produces quality results.
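To make the basic idea concrete: here's a toy single-scale seeded graph-cut in plain Python - emphatically *not* Barrett and Armstrong's hierarchical algorithm, just the classic scheme they speed up. Seeds get infinite-capacity links to source/sink terminals, neighboring pixels get links weighted by intensity similarity, and a naive Edmonds-Karp max-flow finds the min cut. All names and parameters here are my own invention for the sketch, and it's a 2D demo (voxels just add a third neighbor axis).

```python
from collections import defaultdict, deque
import numpy as np

def graph_cut_segment(img, fg_seeds, bg_seeds, sigma=0.1):
    """Seeded binary segmentation by max-flow/min-cut on a 4-connected pixel grid.

    fg_seeds / bg_seeds: lists of (row, col) pixels the user "painted."
    Returns a boolean mask: True = same side of the cut as the foreground seeds.
    """
    h, w = img.shape
    S, T = h * w, h * w + 1            # virtual source and sink terminals
    INF, EPS = 1e9, 1e-12
    cap = {}                            # residual capacity per directed edge
    adj = defaultdict(list)

    def add_edge(u, v, c):
        if (u, v) not in cap:
            cap[(u, v)] = cap[(v, u)] = 0.0
            adj[u].append(v)
            adj[v].append(u)
        cap[(u, v)] += c

    idx = lambda r, c: r * w + c
    for r in range(h):
        for c in range(w):
            for dr, dc in ((0, 1), (1, 0)):   # n-links to right/down neighbors
                rr, cc = r + dr, c + dc
                if rr < h and cc < w:
                    # similar neighbors -> strong link (expensive to cut)
                    wgt = float(np.exp(-((img[r, c] - img[rr, cc]) ** 2)
                                       / (2 * sigma ** 2)))
                    add_edge(idx(r, c), idx(rr, cc), wgt)
                    add_edge(idx(rr, cc), idx(r, c), wgt)
    for r, c in fg_seeds:                      # t-links pin seeds to terminals
        add_edge(S, idx(r, c), INF)
    for r, c in bg_seeds:
        add_edge(idx(r, c), T, INF)

    while True:                                # Edmonds-Karp max-flow
        parent = {S: None}
        q = deque([S])
        while q and T not in parent:
            u = q.popleft()
            for v in adj[u]:
                if v not in parent and cap[(u, v)] > EPS:
                    parent[v] = u
                    q.append(v)
        if T not in parent:                    # no augmenting path left
            break
        bottleneck, v = INF, T
        while parent[v] is not None:
            bottleneck = min(bottleneck, cap[(parent[v], v)])
            v = parent[v]
        v = T
        while parent[v] is not None:           # push flow along the path
            cap[(parent[v], v)] -= bottleneck
            cap[(v, parent[v])] += bottleneck
            v = parent[v]

    # pixels still reachable from the source in the residual graph = foreground
    seen = {S}
    q = deque([S])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in seen and cap[(u, v)] > EPS:
                seen.add(v)
                q.append(v)
    mask = np.zeros((h, w), dtype=bool)
    for r in range(h):
        for c in range(w):
            mask[r, c] = idx(r, c) in seen
    return mask

# Demo: dark "structure" on the left, bright tissue on the right,
# one foreground seed and one background seed.
img = np.zeros((4, 6), dtype=np.float32)
img[:, 3:] = 1.0
seg = graph_cut_segment(img, fg_seeds=[(0, 0)], bg_seeds=[(3, 5)])
```

This per-pixel version is exactly why interactive speed is hard - the graph has one node per voxel, which is why presegmenting into watershed regions and cutting region-to-region instead pays off so dramatically.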
I saw the demo in class. It was really rather impressive, if you're familiar with the subject area.
Re:BYU press release link with more media and info (Score:2)
Does it have Clippy? (Score:4, Funny)
Would you like to:
too easy (Score:1, Funny)
Sometimes the gags just write themselves
BSOD (Score:5, Funny)
But does it run on Linux? (Score:2)
Re:BSOD (Score:2)
Been there, done that.... (Score:2, Informative)
Re:Been there, done that.... (Score:2)
Grits anybody? (Score:2)
old ideas, new clothes (Score:3, Informative)
The 3D model is an interesting way to put the MRI / CAT data on a computer screen (and far better than the .bmp's of a frog's organs) but what advantage (besides eye-candy) does this offer over looking at the raw MRI or CAT results?
One thing that could make this a great learning tool is an interesting interface that would help one practice a surgery with something more than a mouse or touch screen. Nintendo and Atlus have already created a toy that does this; a far more intricate and realistic version could be of use: http://ds.ign.com/objects/695/695152.html [ign.com]
Re:old ideas, new clothes (Score:3, Informative)
The answer is pretty simple. Doctors have to deal with information overload, and 3D models are an effective way of managing huge amounts of data. Consider: a typical MRI exam contains 60 to 90 slices. Looking at a single 3D image is much more efficient than looking through them one at a time.
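There's also a payoff beyond efficiency: once the slices are stacked into a volume, you can reslice along planes the scanner never acquired. A minimal NumPy sketch with synthetic stand-in data (real slices would come from DICOM files; all sizes here are made up):

```python
import numpy as np

# Synthetic stand-ins for 80 scanner slices of 256x256 pixels each
rng = np.random.default_rng(0)
slices = [rng.random((256, 256), dtype=np.float32) for _ in range(80)]

# Stack into one volume: shape (80, 256, 256), axes (z, y, x)
volume = np.stack(slices, axis=0)

# Any orthogonal view is now just an index --
# including planes that were never scanned directly:
axial    = volume[40]         # one original slice
coronal  = volume[:, 128, :]  # front-to-back reslice
sagittal = volume[:, :, 128]  # side-to-side reslice
```

That reslicing trick is the unglamorous core of every 3D medical viewer; the fancy surface rendering sits on top of it.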
my own 3D anatomy (Score:4, Interesting)
So I take the CD, and find it has 3D visualization software on it. I ran it and told it to load all the CAT-scan slices. After it thought about things for a minute, Pow! Full 3D rotatable torso; I could dive in/out, up/down, whatever. I could change various colors and such to help see embedded structures like the biliary tracts of the liver, or the tracts inside the kidneys.
Having been so close to a high-end medical operation like a liver transplant for several months, I saw some wicked imaging tools. The ultrasounds they use to monitor my daughter's new liver actually color all the blood flow in blue and red (i.e. venous and arterial, though it is arbitrarily selected, I understand), and you can move a trackball around to measure the instantaneous velocity of blood flow in various veins or arteries in cm/sec with the click of a button.
You can bet that in 20-30 years this stuff is going to be VERY high end and we're going to stand a lot better chance at surviving some bad stuff. "Watch now! The nanobots are just reaching the clogged vessel as we speak, and you can see the bloodflow is already up by 1%, yes look here they have begun to expel the media into the colon!"
2 things (Score:2)
2) How is your daughter?
Re:2 things (Score:2)
Probably not on the Internet for general access, one might hope. I don't doubt that they are online for Washington University doctors to access. I had ~just~ gotten a chest X-ray too and found a copy of it on the CD as well, so I think they are pure digital with instant access in that regard.
Blood lab monitoring was another nice aspect of being at Childrens Hospital. When you are having a liver transplant, your blood is checked over and over for liver function levels, blood cl
Re:2 things (Score:2)
Re:my own 3D anatomy (Score:3, Interesting)
I design the 3D diagnostic interfaces to these systems and I love my job =)
We just got a GE 3T MRI magnet put in at our flagship clinic in Greensboro, and it indeed has a magnet resolution of apparently 90nm (we were trippin on this when they fired it up for the first time...
The color ultrasounds are kind of a pain in the ass to deal with, btw, and can get out of manageable control rather quickly. We had a cardio tech generate a dataset on a cardio ultrasound station spanning a 30GB resultset
Before the BYU jokes... (Score:2)
Already done for non-invasive surgery (Score:3, Informative)
A company I used to work for, Haptica, developed this for keyhole surgery about 7 years ago
www.haptica.com
What was nice was that they used the Havok physics engine - the Havok boys were just round the corner from us in Dublin.
Those Havok boys, they knew how to party!
Is this the same BYU ? (Score:1, Offtopic)
http://www.physics.byu.edu/research/energy/htm7.html [byu.edu]
"ABSTRACT
In this paper, I call for a serious investigation of the hypothesis that WTC 7 and the Twin Towers were brought down, not just by impact damage and fires, but through the use of pre-positioned cutter-charges. I consider the official FEMA, NIST, and 9-11 Commission reports that fires plus impact damage alone caused complete collapses of all three buildings. And I pr
extremely simple interface (Score:2)
It only has three basic commands, unlike the dozens to hundreds I see on comparable medical-CAD software.
The commands are: rotate view, add a chunk here, delete a chunk here. Their sophisticated segmentation techniques mostly guess your intention right, i.e. whether you want to display this bone or that muscle, but you have to make some final adjustments.
Virtual Surgery = Operation by Milton Bradley (Score:1)
Another Silo of Innovation (Score:1)
BYU has applied for a patent on Live Surface, and Adobe will have nonexclusive licensing rights to the product, Barrett said.
This software is supposed to do great things, but now if you want to do these great things you have to either license the patent or buy proprietary software from Adobe. I wonder if the graduate student who helped write this program had government sponsorship of any kind.
This kind of innovation silo is immoral. In my opinion, far more immoral than proprietary software.