Einstein@Home Set To Break Petaflops Barrier
hazeii writes "Einstein@Home, the distributed computing project searching for the gravitational waves predicted to exist by Albert Einstein, looks set to breach the 1 petaflops barrier around midnight UTC tonight. Put into context, if it were on the Top500 supercomputer list, it would come in at number 24. I'm sure there are plenty of Slashdot readers who can contribute enough CPU and GPU cycles to push them well over 1,000 teraflops — and maybe even discover a pulsar in the process."
From their forums: "At 14:45 we had 989.2 TFLOPS, with an increase of 1.3 TFLOPS/h. In principle that's enough to reach 1001.1 TFLOPS at midnight (UTC), but very often, like yesterday, a drop of about 5 TFLOPS occurs between 22:45 and 22:50. So we will most likely hit 1 PFLOPS early tomorrow morning."
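As a quick sanity check of the forum post's arithmetic, here is a minimal linear extrapolation in Python (a sketch assuming the quoted 1.3 TFLOPS/h growth rate holds all evening; figures taken from the quote above):

# Linear extrapolation of Einstein@Home's aggregate compute,
# using the figures quoted from the project forums.
start_tflops = 989.2            # measured at 14:45 UTC
rate_tflops_per_hour = 1.3      # observed growth rate

hours_to_midnight = 24.0 - (14 + 45 / 60)   # 14:45 UTC -> midnight = 9.25 h
projected = start_tflops + rate_tflops_per_hour * hours_to_midnight
print(f"Projected at midnight: {projected:.1f} TFLOPS")   # ~1001.2 TFLOPS

# When does the curve cross 1 PFLOPS (1000 TFLOPS) at this rate?
hours_to_pflops = (1000.0 - start_tflops) / rate_tflops_per_hour
print(f"1 PFLOPS after ~{hours_to_pflops:.2f} h, i.e. around 23:03 UTC")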
Re:10001.1 TFLOPS, eh? (Score:4, Informative)
I don't think the poster is a native speaker and I fixed a bunch of other obvious typos... but missed that extra zero there.
Re:10001.1 TFLOPS, eh? (Score:5, Funny)
A zero is nothing, therefore you missed nothing.
Re: (Score:3)
But he could have missed it to a higher degree of precision!
E@H success (Score:1, Informative)
Although SETI@home is probably the best-known project (or used to be), E@H is probably the most successful one from a pure science perspective. They have actually managed to discover new pulsars that nobody had seen before, and unlike some slightly shady DC projects (some of which are actually for-profit), their data is accessible. Good job, E@H team!
Re: (Score:2)
Although SETI@home is probably the best-known project (or used to be), E@H is probably the most successful one from a pure science perspective.
Unlike the pure science of Folding@home?
Top500 doesn't work that way (Score:1)
You have to run the Linpack benchmark and report that.
Re:Top500 doesn't work that way (Score:4, Insightful)
You have to run the Linpack benchmark and report that.
And I guess no distributed computing platform is ever going to score in the top 500 according to that benchmark. Communication performance between nodes is very important to most parallel algorithms, and any decent benchmark would take that into account. A real supercomputer has much faster communication between nodes than what you can achieve across the Internet, and both throughput and latency matter. There are some specific problems which can be split into parts that nodes can compute independently without communicating, but most supercomputers are used for tasks that do not fall into that class.
At some point I heard the rule of thumb that when you are building a supercomputer, you divide your funds into three equal parts. One of those was to be spent on the interconnect. I don't recall what the other two were supposed to be spent on.
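To illustrate how badly Internet-class links hurt tightly coupled workloads, here is a toy Python model of one step of a parallel solver that must exchange boundary data every iteration (all numbers hypothetical, purely illustrative):

# Toy model: time for one step of a tightly coupled parallel solver
# that must exchange halo/boundary data with neighbors every iteration.
def step_time(compute_s, bytes_exchanged, latency_s, bandwidth_bps):
    return compute_s + latency_s + bytes_exchanged / bandwidth_bps

compute, payload = 0.010, 1e6   # 10 ms of math, 1 MB of boundary data
infiniband = step_time(compute, payload, 1e-6, 5e9)   # ~1 us latency, ~5 GB/s
internet   = step_time(compute, payload, 50e-3, 1e6)  # ~50 ms latency, ~1 MB/s

print(f"Interconnect step: {infiniband * 1e3:.2f} ms")  # ~10.20 ms
print(f"Internet step:     {internet * 1e3:.0f} ms")    # ~1060 ms, ~100x slower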
Re: (Score:3, Funny)
At some point I heard the rule of thumb that when you are building a supercomputer, you divide your funds into three equal parts. One of those was to be spent on the interconnect. I don't recall what the other two were supposed to be spent on.
Hookers and blow.
Re: (Score:1)
I agree. That's what makes all the difference between such distributed systems or simple clusters and real massively parallel supercomputers, with a single kernel instance running on thousands of CPU cores accessing the same distributed shared memory. That has nothing to do with opportunistic distributed batch jobs; the two don't compare.
folding@home (Score:2, Insightful)
wouldn't it be wise for practical* reasons for people to offer more power to folding@home instead of einstein@home?
* = has a better chance of helping humanity (curing diseases, etc.)
Re: (Score:2)
*shrug* I've never been a big fan of humanity, so I don't really care about that. Gravity waves though... Flying cars, a gravity gun... How can anyone resist?
But you kind of need humanity to a) develop new gravity wave technologies and b) then actually enjoy the benefits of said new technologies. Unless of course you see these new gravity wave devices as a lagniappe for your robot and AI friends.
Re: (Score:2)
Re: (Score:1)
giving someone aspirin for his pain = giving a person a fish
finding a permanent cure for a genetic disease = teaching someone how to fish (but for a totally different problem than eating)
Re: (Score:2, Interesting)
No. But if we wanted, we could do both.
But play a little mind game. Imagine that you are a super genius who could create a magical box within an hour. This box could create anything from nothing: even another such box, a cure for everything, or food.
Would you rather spend your whole life helping Africa than invent this box? Bear in mind that with the box, you could also help Africa.
If yes, what if it took 2 hours? 4? A year?
But you are not a genius and the box is not a box. The box might be
Re: (Score:2)
Re: (Score:2)
Re: (Score:2, Interesting)
genuine question:
wouldn't it be wise for practical* reasons for people to offer more power to folding@home instead of einstein@home?
* = has a better chance of helping humanity (curing diseases, etc.)
Or, to put it another way - why waste resources studying astronomy when there are so many sick people in the world so it would be better for humanity to put our resources into curing disease?
Re: (Score:2)
Or maybe you are too short-sighted to see that this could potentially help everyone on the planet instead of a microcosm of it.
It could lead to a cure for limited resources, which, if I'm not mistaken, is part of the cause of the poverty in Africa.
Well no, not yet: humanity could feed everyone on the planet today if we could find a way past the artificial political barriers that are preventing it. And if there were no political and/or religious barriers against promoting education and contraception, we could continue to house everyone on this planet indefinitely.
It would be nice for humans to be able to escape the planet some day to ensure survival even if there's a catastrophic event on this planet, but saying that we need to leave the planet because
Re: (Score:1)
You are right, how dare DARPA and PARC create all this silly technology. There is no way that the internet or computers could improve the lives of everyone in the world.... [end sarcasm]
That is essentially your argument, because you fail to see how this new technology could be used to help people. Just because you don't see it doesn't mean it isn't useful.
I never meant to imply that this would only be used for getting us off this rock in case of catastrophe. That is one possible use. Maybe we could bring
Re: (Score:1)
Here, I'll tee it up for you.
We're all going to die. Why bother doing anything at all?
Pretty easy statement to refute; same refutation works for all the other flavors as well.
But to address the OP, they are my flops, and I can do with them as I please; just as you can do whatever you like with yours.
Re: (Score:2)
Re: (Score:2)
Or, to put it another way - why waste resources studying astronomy when there are so many sick people in the world so it would be better for humanity to put our resources into curing disease?
Protein folding simulation is such a large and basic need globally that there ought to be enough large-scale interest to make developing specialized ASICs for these problems cost-effective and exceedingly useful for all who need to run such simulations. A quick check of Google shows such chips do in fact exist, with unbelievable performance figures that kick the snot out of countless tens of thousands of CPUs/GPUs. There is no shortage of funding for medical research, so it raises the question: why waste C
Re: (Score:3)
Re: (Score:1)
Protein folding simulation is such a large and basic need globally that there ought to be enough large-scale interest to make developing specialized ASICs for these problems cost-effective and exceedingly useful for all who need to run such simulations. A quick check of Google shows such chips do in fact exist, with unbelievable performance figures that kick the snot out of countless tens of thousands of CPUs/GPUs. There is no shortage of funding for medical research, so it raises the question: why waste CPU/GPU resources on folding simulations?
I still run SETI and MilkyWay@home, because no other resources are allocated to SETI, and MilkyWay@home is interesting to me personally.
First of all, protein folding is not the only thing they do, the Folding@HOME infrastructure is used by many for a variety of bio-molecular studies [stanford.edu].
Secondly, custom ASIC-based machines like Anton [wikipedia.org] and MDGRAPE [wikipedia.org] (which are AFAIK the only such machines around these days) consist of much more than a custom chip; they use specialized interconnects, memory, software, etc., and cost a lot. The MDGRAPE-4, the upcoming version of the RIKEN-developed custom molecular simulation machine, costs $10M + $4M (development + [www.cscs.ch]
Re: (Score:1)
Sure. But wouldn't it be wise for practical reasons for you to go out and help out at a local soup kitchen instead of posting on Slashdot?
People are going to do what they want to do. Of the people that share CPU/GPU, more people are interested in Einstein.
Re:folding@home (Score:5, Interesting)
Discovery is not usually a straight line.
I donate to SETI@home, Einstein@Home, LHC@home, and a bunch of projects at World Community Grid. BOINC and GridRepublic make this easy. I believe Folding@home is a separate standalone project, so it's all or nothing. In addition, there are a LOT of protein folding projects. I'd really like to see them work together - or explain why they are different.
Re:folding@home (Score:4, Interesting)
Someone who only knows physics might not be able to help medical research, so scientific resources aren't entirely fungible. But CPU cycles are. So contributing to one particular distributed computing project does carry an opportunity cost of not supporting another.
Going off on a tangent here: while I echo your sentiment that people should be free to support whatever distributed computing project they want, I'm not sure people realize that SETI has basically already failed. They've covered their entire spectrum numerous times and have been listening for decades without finding anything. The entire project operates on the assumption that interstellar communication from another intelligent life form would occur over radio waves.
Requisite XKCD:
http://xkcd.com/638/ [xkcd.com]
If someone is contributing cycles to it, and not protein folding, then valuable medical research (that has been proven worthwhile) might be suffering literally out of ignorance. That is worth pointing out.
Re: (Score:1)
Going off on a tangent here: while I echo your sentiment that people should be free to support whatever distributed computing project they want, I'm not sure people realize that SETI has basically already failed. They've covered their entire spectrum numerous times and have been listening for decades without finding anything. The entire project operates on the assumption that interstellar communication from another intelligent life form would occur over radio waves.
well, the seti@home project may be in disarray, but it's a bit early to say that seti (search for extra-terrestrial intelligence) in general has failed, isn't it? a few decades of silence from civilizations that may be thousands, millions or even billions of light years away can hardly be construed as strong evidence.
Re: (Score:3)
The concept of searching for extra-terrestrial life hasn't failed, but their project of just scanning radio waves basically has. If another civilization used radio for interstellar broadcasts, we'd see steady, regular broadcasts. When we blanket a spectrum from a physical direction and don't see anything, it suggests no one is broadcasting radio waves.
There may be technologically advanced life forms out there broadcasting by other means, but repeatedly checking radio waves probably won't offer any real bene
Re: (Score:2)
Re:folding@home (Score:4, Informative)
I'm not sure people realize that SETI has basically already failed. They've covered their entire spectrum numerous times
The entire spectrum? We've only looked at one frequency range on 20% of the sky:
SETI@home is basically a 21-cm survey. If we haven't guessed right about the alien broadcasters' choice of hailing frequency, the project is barking up the wrong tree in a forest of thousands of trees. Secondly, there has been little real-time followup of interesting signals. Lack of immediate, dedicated followup means that many scans are needed of each sky position in order to deal with the problem of interstellar scintillation if nothing else.
With its first, single-feed receiver, SETI@home logged at least three scans of more than 67 percent of the sky observable from Arecibo, amounting to about 20 percent of the entire celestial sphere. Of this area, a large portion was swept six or more times. Werthimer says that a reasonable goal, given issues such as interstellar scintillation, is nine sweeps of most points on Arecibo's visible sky.
Quoted from http://www.skyandtelescope.com/resources/seti/3304561.html?page=5&c=y [skyandtelescope.com]
Also, when there is no work to be done, your computer can look at other things.
I donate my time to several medical studies that will likely find some results that will help all people. I also donate some time to climate research that has less of a chance of helping EVERYONE. I also donate some time to SETI which has a very, very small chance of changing the world.
It is called hedging your bets. I spend some CPU on things with low risk and low reward, and others on things with high risk and high reward.
Re: (Score:1)
The XKCD reminds me of a story that back in the 18th century, people wanting to know if the moon was inhabited used their telescopes to look for signal fires.
Re: (Score:2)
I believe Folding@home is a separate standalone project, so it's all or nothing. In addition, there are a LOT of protein folding projects. I'd really like to see them work together - or explain why they are different.
Not only are there a lot of projects like this, most of them - whatever their intrinsic scientific merit - have very little direct application to fighting disease. Sure, the people directing the projects like to claim that they're medically relevant, but this is largely because the NIH is the
Re: (Score:2)
I believe Folding@home is a separate standalone project, so it's all or nothing. In addition, there are a LOT of protein folding projects. I'd really like to see them work together - or explain why they are different.
Not only are there a lot of projects like this, most of them - whatever their intrinsic scientific merit - have very little direct application to fighting disease. Sure, the people directing the projects like to claim that they're medically relevant, but this is largely because the NIH is the major source of funding. It's also really difficult to explain the motivations for such projects to a general audience without resorting to gross oversimplifications. (This isn't a criticism of protein folding specifically, it's all biomedical basic research that has these problems.) My guess is that it will take decades for most of the insights gleaned from these studies to filter down to a clinical setting.
The project that is arguably more relevant to disease is Rosetta@Home, but that's because of the protein design aspect, not the structure prediction. (In fact, Rosetta doesn't even do "protein folding" in the sense that Folding@Home does - it is a predictive tool, not a simulation engine like Folding@Home.)
Someone please mod this up; as a researcher in the same field as F@H, I can attest this is all quite correct.
First, I should preface this by saying I've interacted with several of the F@H folks professionally and they do excellent work. And the NIH is under no illusion, when it funds this work, that cures will magically pop out tomorrow; they think of it as a seed investment for a decade or two in the future. In terms of tax dollars spent, it's a good investment, considering many biomedical labs spend mo
Re:folding@home (Score:4, Interesting)
Malaria is known to be in the US and has several medications to treat it. The CDC will tell you that Schistosoma does not even exist in the US, but I acquired it at the age of 10, and it wasn't until I purchased my own lab equipment around the age of 50 that I finally got an answer to all my bizarre health problems. Statistically I should be dead, several times over. Over 200,000 people die from it every year, and I am clearly one of the lucky ones.
There is currently only one drug (praziquantel) to "cure" (with 60% efficacy) Schistosoma, and it is quickly losing its effectiveness. There is no other substitute. None. After visiting many pharmacies in my area, it took me three days to locate the drug in the USA and tell the pharmacy where they could get it for me. Yes, it's that bad. Funny thing is, I can buy it off the shelf for my dog, with a prescription, but I couldn't buy it anywhere for human consumption. Clearly we need more options, and SNTS protein folding analysis will help with that goal.
If you have a few extra CPU cycles to spare, please sign up for one of these two worthy causes!
More info on Schistosomiasis
https://en.wikipedia.org/wiki/Schistosomiasis [wikipedia.org]
https://en.wikipedia.org/wiki/Praziquantel [wikipedia.org]
Re: (Score:3)
Drugs for dogs are just as "pure" as those for humans.
On the Internet, no one knows you're a dog.
Re: (Score:3)
Re: (Score:2)
I was thinking, why not Bitcoins? At least there's a probability of getting money for those spare cycles.
That's nothing.. (Score:1)
The Bitcoin network's combined processing power is 287.78 petaflops, roughly 300 times larger.
Re: (Score:1)
You better start now, because I got a hell of a lead.
Re: (Score:3)
Re: (Score:1)
"Isn't Bitcoin's FLOPS number just an estimate, and a grossly inaccurate one based on the wrong assumptions that there's a single formula for estimating FLOPS with just integer performance, and the formula's applicable to all platforms?"
All true.
But even if 287 Petaflops is grossly inaccurate, it's *still* gonna be more than 1 Petaflop. I'm almost sure about that...
Re: (Score:2, Informative)
You're not actually producing bitcoins, you're just competing to win them. That is: nothing new is created by your participation in the Bitcoin network; the best you can hope for is that you'll receive something which otherwise would have gone to someone else. It helps you personally, but it is zero-sum for the world.
With einstein@home and folding@home, you are helping to solve big science problems. These problems will be solved faster with your help than without it. It is a net gain for science and humanity.
Re: (Score:1)
You mean put it towards a scam
Found /. through distributed.net back in the day (Score:3)
The reason I found Slashdot back in the '90s was the team performance on the distributed.net tasks. So they do turn those cycles into something useful!
One Petaflop, uh? (Score:1)
I'm sorry, but that's 1024 Teraflops to me.
You're still off by ~20 hours.
Re: (Score:1)
Good call on the meaning of 'tera' :)
As to the rate, it's increasing at 1.3 TFLOPS per hour, so yes, it's an acceleration. Even if it has slowed a bit... now about 0.8 TFLOPS/h (i.e. the growth is decelerating).
Re: (Score:1)
E@H is GPLv2 [uwm.edu], actually. Thinking of something else?
Any suggestions for a distributed client? (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
I have a late-2009 iMac with a Core i7 processor, running Mountain Lion. It runs BOINC Manager 7.0.31 and the projects Collatz, Einstein, SETI and climateprediction.net. Of those, only Collatz uses my ATI GPU for OpenCL computations, but BOINC respects the resource share of CPU and GPU time. For NVidia you can run Einstein, Collatz and PrimeGrid.
Also, the latest BOINC clients can be configured to not use your CPU at all if you are running processor-intensive apps like Final Cut, iPhoto or iMovie. Useful when
Re: (Score:2, Informative)
I think Folding@home uses its own specialized client. I've never used it, so I can't help you there. Most of the other distributed (grid) projects out there use the BOINC [berkeley.edu] client. BOINC allows you to schedule processor time for when you want it to run, can stop distributed processes once CPU usage reaches a certain (user-definable) level, and all sorts of other things. I don't think Folding@home allows the BOINC client to connect, however.
I think what is happening (in your case) is the folding
Re: (Score:2)
Re: (Score:1)
Whilst perhaps not as efficient, I find that running resource-hungry applications like these inside a virtual machine, configured with only the number of cores you want to limit them to, is a good way to throttle the amount of resources they can consume.
Contributing spare cycles at the lowest CPU clock (Score:2, Interesting)
I would like to contribute my spare CPU clock cycles, but without causing my CPU to speed up (in this case, with Intel's SpeedStep) from the lowest setting at 800 MHz. Otherwise, my laptop gets hot and loud. How can I do that?
Re: (Score:1)
# Tell the ondemand cpufreq governor to ignore load from nice'd processes (BOINC runs at low priority), so they never raise the clock. Run as root:
for i in /sys/devices/system/cpu/cpu[0-9]*; do echo 1 > "$i"/cpufreq/ondemand/ignore_nice_load; done
Not a barrier (Score:3)
1 PFLOPS is an arbitrary threshold or milestone. It's not a barrier because nothing special happens at that point. The speed of light is a barrier. Even the speed of sound is a barrier. 10^n somethings per whatever is rarely if ever a barrier for any positive integer n.
Power-efficiency (Score:1)
If it were at position 24 in the Top500, it would likely be 3x as power-efficient as all these individual computers. These sorts of initiatives are impressively inefficient (but very effective), which is why the 'cloud' model won the battle over the 'grid' model. It only works because computing power is donated, not paid for. On the other hand, the equivalent supercomputer would likely cost 3-8x as much as the aggregate (i.e. the sum of the costs of all these computers), because it would be custom-made.
1000.2 TFLOPS reached! (Score:2)
I added two nVidia GTX 260 cards and one nVidia GT 240 card to Einstein@Home, and voilà, this morning's stats show:
Page last updated 3 Jan 2013 8:50:02 UTC
Floating point speed (from recent average credit of all users) 1000.2 TFLOPS
For a BOINC novice it can be quite daunting to figure out how to make it use all GPUs and not accept any CPU-only work units. Editing XML files in some data directory isn't exactly user-friendly.
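For reference, a minimal sketch of the relevant file: a cc_config.xml placed in the BOINC data directory (e.g. /var/lib/boinc-client on Linux) tells the client to use every GPU rather than only the most capable one. Opting out of CPU-only work units is done separately, per project, in the Einstein@Home web preferences.

<cc_config>
  <options>
    <!-- By default BOINC uses only the "best" GPU; this enables all of them. -->
    <use_all_gpus>1</use_all_gpus>
  </options>
</cc_config>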
Re: (Score:2)
2x GTX 260 are in theory about 1.5 TFLOPS, so that's welcome fuel on the fire :)
You can also configure common settings using the BOINC preferences [uwm.edu] and Einstein@Home preferences [uwm.edu] pages. It seems common to use "location" to set up different preferences for different hosts; e.g. I use the "home" setting for machines which are only good for GPU work, "work" for CPU-only systems, and the "default" setting for both CPU/GPU (plus the "school" setting for experimentation).
Also AIUI the latest client will use all your GPUs as lo