Biotech Supercomputing Science

Supercomputer Sets Protein-Folding Record

Nicros writes with this snippet from Nature News: "A specially designed supercomputer named Anton has simulated changes in a protein's three-dimensional structure over a period of a millisecond — a time-scale more than a hundred-fold greater than the previous record. ... The simulations revealed how the proteins changed as they folded, unfolded and folded again. 'The agreement with experimental data is amazing,' says Chandra Verma, a computational structural biologist at the Bioinformatics Institute of the Agency for Science, Technology and Research in Singapore. Simulating the basic pancreatic trypsin inhibitor over the course of a millisecond took Anton about 100 days — roughly as long as computers spent toiling over previous simulations that only spanned 10 microseconds."
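
A quick back-of-the-envelope check of the figures quoted above. All numbers come straight from the summary; the "roughly as long" for the earlier simulations is treated as exactly 100 days.

```python
# Throughput check using only the numbers quoted in the summary.
anton_simulated_us = 1000   # 1 millisecond, in microseconds
anton_wall_days = 100       # Anton's reported runtime for that run

prev_simulated_us = 10      # previous simulations spanned ~10 microseconds
prev_wall_days = 100        # "roughly as long", per the summary

anton_rate = anton_simulated_us / anton_wall_days  # 10 us of simulated time per day
prev_rate = prev_simulated_us / prev_wall_days     # 0.1 us per day

print(f"Anton:    {anton_rate:g} us/day")
print(f"Previous: {prev_rate:g} us/day")
print(f"Speedup:  ~{anton_rate / prev_rate:.0f}x")  # ~100x, matching the hundred-fold claim
```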
  • by blind biker ( 1066130 ) on Sunday October 17, 2010 @05:20AM (#33922764) Journal

    ..it's a rather poor article. It talks in very basic terms about proteins and their folding, talks a bit more about the scientist who founded the institute behind the computer, and says fuck-all about the construction of the computer itself.

    Bah. For a publishing house of Nature Publishing Group's (intellectual and economic) muscle, one should expect more.

  • by Anonymous Coward on Sunday October 17, 2010 @08:43AM (#33923304)

    Results like this just prove (as if scientists with half a brain couldn't reason it out themselves, but our audience here is the plebes, I think) that distributed scientific computing using home computers is a fundamentally bad idea: a waste of natural resources.

    Why? Think about it. This computer, built from specialized ASICs, is both vastly faster and more efficient (measured in calculations per watt-hour) than general-purpose CPUs running the same simulations.

    Now, most people don't buy home computers just to run F@H, but we must also consider the fractional purchases induced by the public's desire to participate in "science" (and fractional replacements because of wear, although it's probably not the CPU itself that wears out to any appreciable degree).

    Electricity must be generated to run, at full capacity, home computers that would otherwise have been turned off or idle. This energy could have "bought" a hundred-fold more simulations had it been put into specialized simulation machines like this one (rough arithmetic below). Too bad.

    Populist science at its worst.
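
A minimal sketch of this comment's energy argument, for illustration only. The per-kWh efficiency figures below are hypothetical placeholders, not from the article; the only input taken from the comment is its claim of a roughly hundred-fold efficiency gap.

```python
# Hypothetical illustration of the energy argument above. The per-kWh
# figures are placeholder assumptions, NOT measured values.
desktop_ns_per_kwh = 1.0  # assumed: simulated nanoseconds per kWh on home CPUs
asic_ns_per_kwh = 100.0   # assumed: the comment's claimed ~100x advantage

budget_kwh = 10_000       # an arbitrary fixed electricity budget

print(f"Home CPUs:    {budget_kwh * desktop_ns_per_kwh:>9,.0f} ns simulated")
print(f"ASIC machine: {budget_kwh * asic_ns_per_kwh:>9,.0f} ns simulated")
# Same electricity, ~100x more simulated time on the specialized
# hardware -- the comment's point, granting its unsourced ratio.
```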

  • by imsabbel ( 611519 ) on Sunday October 17, 2010 @09:00AM (#33923376)

    Nature and Science are not for hard science.

    If you just get articles from a citation search it's not obvious why, but if you ever see a print issue it becomes obvious:

    They cover a _huge_ range of fields. You can have articles about Egyptian mummies, rainforest status in South America, neutron scattering, and virus crystallography within 20 pages or so.

    So people have to write the articles in a way that readers from at least most of the fields involved can understand them and see why they are important. Otherwise, it would be better to publish in a journal of narrower scope.

  • Re:not really (Score:3, Interesting)

    by StripedCow ( 776465 ) on Sunday October 17, 2010 @09:16AM (#33923460)

    According to the article, it now takes 100 days to do one simulation. If we had 100 times the processing power (maybe a little more, to account for overhead), we could do it in one day. I'd say that would be possible today with sufficient financial support, or at least it could be a reality within a decade. In short, it still sounds promising to me.
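
The parent's projection, spelled out as a minimal sketch; the 10% overhead factor is an assumption standing in for "maybe a little more".

```python
# Scaling projection from the parent comment. The overhead factor is an
# assumed 10%, standing in for "maybe a little more".
current_days = 100  # one millisecond-scale simulation today
speedup = 100       # hypothetical 100x increase in processing power
overhead = 1.10     # assumed parallelization overhead

projected_days = current_days / speedup * overhead
print(f"Projected runtime: {projected_days:.1f} days")  # ~1.1 days
```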

  • Re:not really (Score:2, Interesting)

    by Anonymous Coward on Sunday October 17, 2010 @10:49AM (#33923892)

    100 days is for a 'hero run'; the bread-and-butter runs last 1-4 days apiece and account for more like 20-100 microseconds of simulated time. One of the big innovations of this machine is that runs which would take months on other machines finish in days here.
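
The throughput implied by the parent, as a sketch. The pairing of 20 µs with 1 day and 100 µs with 4 days is an illustrative assumption; the parent gives only the ranges.

```python
# Simulated-time throughput implied by the parent comment. The specific
# pairings of microseconds to days are illustrative assumptions.
runs = [
    ("bread-and-butter (short)", 20, 1),  # (label, simulated us, wall-clock days)
    ("bread-and-butter (long)", 100, 4),
    ("'hero run' (the record)", 1000, 100),
]

for label, sim_us, days in runs:
    print(f"{label}: {sim_us / days:>5.1f} us/day")
# Even routine runs sustain ~20-25 us/day, two orders of magnitude
# beyond the ~0.1 us/day of earlier general-purpose machines.
```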

  • by Anonymous Coward on Sunday October 17, 2010 @11:02AM (#33923954)

    From experimental evidence we know the folding rates of certain proteins at various temperatures, the flow rates for ion channels, and so on. A lot of these macro-properties can't be reached in the short simulations that current computers can do, but they are easily within reach of the D. E. Shaw machine.

  • by HiThere ( 15173 ) <charleshixsn@ear ... .net minus punct> on Sunday October 17, 2010 @02:55PM (#33925418)

    It is complex, but you are ignoring the relative isolation between levels that exists in the human (and rat) body.

    Protein folding may be complex, but most of it is irrelevant detail. What's usually important is the final shape that one ends up with. But when one wants to modify that process, then the details of that process become important. It's roughly equivalent to my situation: at the level that I work, I pay no attention to how the compiler is going to optimize my code. If I wanted to modify that, I'd need to pay attention to things at a much finer level of detail.

    It *is* true that people tend to oversimplify things they aren't dealing with directly. But to make that a fair statement it needs to be made fully *that* general. (This doesn't make your original assertion false by itself, but observationally it *is* false. I've never known a knowledgeable geek who oversimplified the biochemistry of life in the way that you painted. I'm sure they exist, but they aren't, as you implied, common. If they are common among your friends, well, then you have some uncommon friends.)
