Science

Donate Spare Cycles for Climate Prediction 131

gampid writes "The BBC has a story about the Casino-21 project which is running a SETI@home type program for climate prediction. " I'm a booster of Distributed.net, but this looks pretty cool as well. I dunno global warming just gives me the creeps anyway and I'd like to know if my house will be underwater.
  • I don't know if many of you have heard of the long-range weather forecasting site, weatherplanner.com [weatherplanner.com]. This site claims high accuracy in predicting the weather approximately 365 days ahead of time. I have been looking at the freely downloadable 30-day planner for my city, and I was surprised to see that the forecast is extremely accurate, approx 75-90%. These forecasts are apparently based on models, not just guessing. Just thought that might add something to the debate about the use of CPU cycles to figure out long-range trends. If a private-sector company with limited resources can do it, imagine how much better a supercomputer (via distribution) could do it!
  • You can take the model, feed in data from 1950, run it to 1999, and see how well it did.

    If it did pretty well, then it's safe to assume that it might do an okay job from 1999-2049. Then again it might not, but it's worth a try :)
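    A minimal sketch of that kind of hindcast check, assuming you already have yearly "observed" and "modeled" series as arrays (all names and numbers below are made up for illustration, not the real Casino-21 interface):

    ```python
    # Hindcast-skill check: start the model in 1950, run it to 1999, and score
    # its output against the observed record. The synthetic data stands in for
    # real observations and real model output.
    import numpy as np

    def hindcast_skill(model_series, observed_series):
        """Return RMSE and correlation between modeled and observed values."""
        model = np.asarray(model_series, dtype=float)
        obs = np.asarray(observed_series, dtype=float)
        rmse = np.sqrt(np.mean((model - obs) ** 2))
        corr = np.corrcoef(model, obs)[0, 1]
        return rmse, corr

    years = np.arange(1950, 2000)
    rng = np.random.default_rng(0)
    observed = 14.0 + 0.01 * (years - 1950) + rng.normal(0, 0.1, years.size)
    modeled = 14.0 + 0.012 * (years - 1950) + rng.normal(0, 0.1, years.size)
    rmse, corr = hindcast_skill(modeled, observed)
    print(f"RMSE = {rmse:.3f} K, correlation = {corr:.2f}")
    ```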
  • I'm going to have to run my computer for a while to get you those estimates

  • by Anonymous Coward
    Pennies a day? Wouldn't your electricity bill run at least that high from leaving the PC on all the time? We don't need to get paid for cycles, it's something you do for benevolent reasons, not selfish ones.
  • Yes, but then you add the new lungfish data for a higher degree of accuracy, then add the next component, and so on... EVENTUALLY you will increase your accuracy. It is still a probability of what the weather will be, but with a higher degree of confidence in the outcome.
    That's one reason we only try to predict the weather for a few days right now.
  • Yep. I'm just barely old enough to remember how we were headed into the 'next Ice Age' during the 70s. Guess they had to keep our minds off the oil crisis somehow. I'll join this project tho, if it means a chance to more accurately predict the weather either way.

    'Twould be supreme arrogance and presumptiveness to assume humankind could possibly effect such a huge change in only 150 years of Industrialism. Not that we've ever really been lacking in either one...

    Carlin has it right - the planet is fine... we're the ones who're fucked.
  • Why do we need to donate our computing cycles when the NOAA is getting a 1000-node Beowulf cluster? (slashdot story [slashdot.org]) Don't waste your time. Let your computer idle. Don't you know that when it sits there in any modern OS, such as Linux (or NT), the OS sends the HLT instruction, which causes the processor to idle, which means 1) less power consumption and 2) less heat, which means longer life? Even Win9x users can use a program like CpuIdle to do this.
  • SETI has no government ties. Thanks to that SOB senator from Nevada.
  • Of course, that's the idea -- but the idea of being "slightly more accurate" over the course of multiple equations, each depending on the previous one, means that no matter how "almost accurate" your model is, by the time you extrapolate that .0000001% error into 2^99999999999 equations, you're dealing with essentially useless data in the end.

    As you point out, we can only predict weather for a very short period of time now, based on the very few atmospheric factors we truly understand. So I guess what I'm asking is: if these guys know a more accurate way to predict the weather, why don't they share it with the rest of the world? If they don't know one, how the heck do they expect to predict even the most general pattern of average temperature 100 years into the future?

    It's like suggesting every mayor should run their office by playing SimCity -- SimCity is a fun game, and certainly takes into account varied things like weather, accessibility, traffic patterns, and economic development. But no one would seriously consider that it is capable of accurately predicting how a REAL city would develop, as it's obviously crippled in the factors it can consider.

    I DO believe that some day we will have the capability to do this sort of calculation -- but probably not until computers are able to LEARN in the sense of being able to provide their own analysis of factors involved, and constantly update their own processes according to new data as it is discovered and verified (and weighed)...
  • by MagusOceanus ( 61084 ) on Friday October 15, 1999 @04:22PM (#1610247)
    I get calls to donate "spare" money to this and that charity during dinner hours on my private phone.

    My "spare" organs are going to be donated in the case of an untimely death.

    People keep trying to bum "spare" cigarettes and gum and change from me.

    My "spare" clothes are being worn by freinds and family.

    My dog and cat hover over me at every meal as if to say, "Buddy, can you spare some of your food?"

    A crazy chick wants my "spare time" so she can whine to me about her Jerry Springer Guest of a boyfriend.

    Now Seti and/or atmospheric scientists want any spare cycles... I have to draw a line somewhere don't I?

    I bet if we do translate a message from ET it will say the following "Hey neighbor can you spare a planet!!!!!"
  • I have to agree -- I think all this "Monte Carlo simulation" will prove is that the factors taken into consideration will show themselves quite clearly when calculated a billion times against themselves.

    Start with the assumption that deforestation will remove X amount of oxygen from the atmosphere? Guess what -- our results showed that, in 75% of the simulations done where deforestation continued at current rates, it resulted in X amount of oxygen being removed from the air, so we better stop deforestation quickly!
  • Macintosh users, start your Velocity Engines. (assuming this client will be accelerated for AltiVec)

    "The number of suckers born each minute doubles every 18 months."
  • We're only one step away from that global weather machine... Dr. Evil keep your eyes on this.
  • Well, I guess I get to add this along with running SETI and Distributed. I wonder how many other organizations are considering doing something like this. I think it's a neat idea, and I like the idea of distributed computing. It makes me feel like I'm doing something, even though it's just a really small part.
  • First Distributed.net, then SETI, now Casino-21. These are all wonderful solutions to large-scale computing problems. I've been running SETI@home and Distributed for a while now, and (like most of you) I'm happy to give back to the community whenever I can.

    Spare computing horsepower (being the commodity that it is) is a powerful resource. Combining this with faster/cheaper connections (DSL/cable) can put some computing problems within reach!

    So why hasn't someone tried to build a commercial model of this computing solution? I'd gladly leave my home 'puter logged on for a few pennies a day. (eBay?! Are you listening?)

  • by NMerriam ( 15122 ) <NMerriam@artboy.org> on Friday October 15, 1999 @02:40PM (#1610255) Homepage
    I dunno -- as inherently INTERESTING as the subject is, I suspect it'll be nothing more than a huge waste of processor cycles.

    I mean, essentially we're starting out with random parameters (guided by what they offer as "realistic") and project forward with hypothetical rules on behavior to an unknown point in the future where we'll calculate the probability of a particular outcome?

    Seriously, there's so much speculation and guesswork in even building the system to time-progress the ecological model that it seems unrealistic. Add on top of that the fact that no valid data at all will be used in the calculation, and of course the obvious limitation on the possible number of factors we'll be calculating (as opposed to the huge number of factors that really exist), and you're pretty much pissing in the wind.

    Think about it this way -- assume everything in the simulation is perfect, including all the data it starts out with (two major assumptions), EXCEPT that a year from now we find out the reproductive cycle of the lungfish plays a greater-than-anticipated role in the production of kelp, which in turn is responsible for generating a large portion of the atmospheric oxygen. Every calculation done will be a complete waste of cycles and you'll have to start all over with the new ecological model.

    Or am I missing something?
  • The problem with SETI is that we can't even talk to dolphins, who are 98% similar to us in DNA. What the heck makes these people think we'll be able to talk to Zorg the Martian, who communicates by shooting streams of cyanide out of his eyeballs?
  • They want to use computers so they can be wrong years in advance instead of just with the five day forecasts?
  • The model is almost certainly built upon the data from the previous decades, so it will by definition produce a good fit to those numbers. And that will of course mean nothing.

    As usual, everything is easy to predict, except the future...

  • This project is truly awesome. Many of you are complaining that weather predictions are inherently inaccurate because the initial conditions for the simulations are so hard to figure out. This is absolutely true. A Cray can predict our weather, but if the initial conditions are wrong, then it is a pointless exercise. Garbage In = Garbage Out. The goal of this project is to overcome this fatal flaw by coming up with accurate initial conditions. When the user downloads the software, it will randomly generate initial conditions (within the domain of probable values) and run its complex weather simulations starting in the past. The program will monitor the simulation and determine whether it is accurately predicting our past weather. If the simulation is accurate for past weather, it is logical to assume the simulation will work in the future.
    In conclusion, the real goal of this project is to refine the initial conditions so that any computer running weather simulations will be able to generate a more accurate model of our planet's weather system. This is where they need our help since there are so many different combinations of initial conditions to test.
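    A toy sketch of that perturbed-initial-conditions idea (the "model" below is a trivial stand-in, not a GCM, and every number is invented): draw random starting states, hindcast the past, and keep only the ones that match.

    ```python
    # Draw random initial conditions within plausible ranges, run each through
    # a stand-in model over the past record, and keep only the runs whose
    # hindcast error is small. A real run would use a full climate model.
    import numpy as np

    rng = np.random.default_rng(1)
    years = np.arange(1950, 2000)
    observed = 14.0 + 0.01 * (years - 1950)        # stand-in observed record

    def toy_model(t0, trend):
        """Trivial stand-in 'climate model': linear warming from a start state."""
        return t0 + trend * (years - 1950)

    accepted = []
    for _ in range(10_000):
        t0 = rng.uniform(13.0, 15.0)               # random plausible start temp
        trend = rng.uniform(0.0, 0.03)             # random plausible trend
        run = toy_model(t0, trend)
        rmse = np.sqrt(np.mean((run - observed) ** 2))
        if rmse < 0.05:                            # keep only good hindcasts
            accepted.append((t0, trend))

    print(f"{len(accepted)} of 10000 candidate initial conditions survive")
    ```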
  • Wonder if every scientist could do something like this? I'm a theoretical chemist, and know of a number of problems that are trivially parallelizable (sp?), that need a bunch of cpu time. At some point we are going to get saturated with these sorts of problems. Weather prediction is cool, SETI is cool. Drug discovery is also cool. Wonder if a distributed drug design project is in order?
    -- Moondog
  • Um... I'm Canadian, and I'm a little scared about the possibility of global warming. The issue of the presence of global warming aside, if it ever did happen, it could be disastrous. Know anyone in a coastal city? Global warming would cause ice from the Arctic and Antarctic to melt, raising the level of the ocean. In North America, where we value ocean-front property so much, this would cause vast property damage. In poorer countries, many people would become homeless.

    Plus the possibility of ice age due to lowered albedo, etc, etc, etc. So yes, global warming WOULD be that bad.

  • You don't know what the f*** you're talking about and I claim my free cigar, troll!

    FYI, ALL current climate/NWP (that's numerical weather prediction to you) models use radiative transfer models - yup, using numerical quadrature to solve scattering and absorption models (Rahman et al.) You'd be mental to do ray-tracing (i.e. Monte Carlo) on an entire atmosphere. Anyway, the radiation bit is only one part of a GCM - there's atmos circulation, ocean circulation and SVAT (soil-veg-atmos transfer). If you wanna run your own GCM on your home machine (providing it's a UNIX/Linux box) check out CCM3 [ucar.edu].

    Nick

  • No, even slight changes in climate can cause catastrophe for agriculture. This is not FUD.
  • by nstrug ( 1741 ) on Friday October 15, 1999 @05:22PM (#1610267) Homepage
    What's real science? Real science to me means peer-reviewed journal articles, not rantings on a newsgroup. As an active researcher in climate modelling (we produce land surface albedo and BRDF datasets, currently from AVHRR [usgs.gov], in future from TERRA [nasa.gov]) I disagree.

    In my (informed) opinion the overwhelming consensus amongst climatologists and biogeographers is that climate change is real - and this is backed up by both modelled and experimental data - see Myneni et al., Nature 386 (1997) for some convincing evidence from our group.

    Of course, you are perfectly welcome to download a GCM such as CCM3 and go through it line by line to see whether it is "real science" or not. [ucar.edu]

    If you want to discuss what is "real science" or not email me or if you're in the Boston area, come round - my work address is on my web page.

    Nick

  • I do that when I go to the bathroom. And I also hit CTRL-ALT-F5 so that if anyone turns on my monitor they'll be utterly confused and leave it alone. Works well.
    • Try Seti@Home, possibly discover a signal that proves the existence of alien life.
    • Try distributed.net, find the key that wins a RSA challenge and earn a couple thousand dollars to boot.
    • Try Casino-21, and prove that chaos theory is right, and that you can't make long-term predictions about weather.
  • I live in a dorm, and the big thing now is the AllAdvantage.com-type things where you open their ad window and earn $0.50 an hour while you surf the internet. I wrote a quick http-refresh page and found them all a mouse trembler. They are basically selling their cycles to these advertising companies, although no one has seen a check yet.

    But no they don't have a linux client, so I'm sticking with distributed.net.
  • What if your home PC can't stay up for an entire year?

    Let's hope there are versions for OSes more stable than Windows 95...
  • Okay, I can see a reason why more and more of this is happening... Let's think about it. Personal computers are getting faster and faster -- we'll soon be seeing consumer gigahertz CPUs and so on. Further, 24/7 connections to the internet are becoming more and more common, with satellite, xDSL, and cable flooding the market. So in a few years we'll have hundreds of thousands of ridiculously fast computers connected to the internet all the time, used by parents who type letters on them once a day or something. So what we do is get all these people to run distributed computing apps like the RC5 crack, SETI@home, and as many others as we can get. So we'll have lots of high-powered PCs clogging the internet trying to break x problem. THEN, with a little luck, we should be able to crash the internet and the infrastructure will finally get upgraded!!!

    And you said these projects were a waste of time...
  • The problem is that there is an infinite number of initial conditions, however many decimal places you'd like, and sensitivity to initial conditions means that you have to have the right number of decimal places; any rounding will cause completely different behavior in your model than in real life. So you have a one-out-of-infinity chance of having the right numbers. Not too good.
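    To make the rounding point concrete, here's a small sketch using the logistic map as a stand-in for "any chaotic model" (it is not a climate model): the same iteration in 32-bit and 64-bit precision drifts apart purely from rounding.

    ```python
    # Iterate the same chaotic map from the same starting value in float64 and
    # float32 and watch the two runs diverge due to rounding alone.
    import numpy as np

    def logistic(x0, steps, dtype):
        x = dtype(x0)
        r = dtype(3.9)
        out = []
        for _ in range(steps):
            x = r * x * (dtype(1.0) - x)
            out.append(float(x))
        return out

    hi = logistic(0.2, 60, np.float64)
    lo = logistic(0.2, 60, np.float32)
    for step in (10, 30, 50):
        print(f"step {step}: float64={hi[step]:.6f}  float32={lo[step]:.6f}  "
              f"difference={abs(hi[step] - lo[step]):.6f}")
    ```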
  • by Anonymous Coward
    I could have sworn at one point that I saw a distributed project for running drug (chemical) analyses to help fight cystic fibrosis? I just saw it in passing, and further web searches have yielded nada. Anyone know of such a creature or where I might have gotten the idea? Thanks.
  • That's the usual error about chaos. Chaotic behavior is predictable in the large, though not in the small. The family of possible trajectories is knowable, the exact trajectory is only predictable for a limited time. The exact trajectory in meteorology is "weather", and the family of possible trajectories with their statistics is called "climate". Putting it another way, while I can't tell you whether it will snow next Christmas (weather prediction) I can tell you it will be colder (at least hereabouts) than the following 4th of July. This story, by the way, is further confirmation for me that the Brits and the Germans are doing the serious work in climate research, though. What a nice idea! Kudos to the UK team for this idea. I'll be sending some cycles your way!
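    The weather/climate distinction can be seen in miniature with any chaotic toy system; here's a sketch using the Lorenz-63 equations (crude Euler stepping, purely illustrative): two nearly identical starting states diverge quickly, yet their long-run statistics stay close.

    ```python
    # Individual trajectories (the "weather") become unpredictable, but the
    # long-run statistics of the attractor (the "climate") barely change.
    import numpy as np

    def lorenz_z(x0, steps=20000, dt=0.01):
        x, y, z = x0, 1.0, 1.05
        zs = []
        for _ in range(steps):
            dx = 10.0 * (y - x)
            dy = x * (28.0 - z) - y
            dz = x * y - (8.0 / 3.0) * z
            x, y, z = x + dx * dt, y + dy * dt, z + dz * dt
            zs.append(z)
        return np.array(zs)

    a = lorenz_z(1.000000)
    b = lorenz_z(1.000001)              # perturbed by one part in a million
    print("z after 2000 steps:", round(a[2000], 2), "vs", round(b[2000], 2))
    print("long-run mean of z:", round(a.mean(), 2), "vs", round(b.mean(), 2))
    ```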
  • While I don't agree with you, I understand how you feel. Global warming is scary stuff, and denial is a natural reaction to unpleasant news. You might want to check out some actual data at The National Climatic Data Center [noaa.gov] or the National Centers for Environmental Prediction [noaa.gov].
  • by lw54 ( 73409 )
    There are problems when you open source the client. People always modify the client to send in wrong data.
  • This is actually kind of funny reading this... seeing as the eye of hurricane Irene is just starting to hit my house.

    Water in the yard is, well... there's no yard. In fact, even though I'm on "high ground", I have inches of water in the back room - coming from what used to be the yard.

    I've only heard a couple trees come down so far.

    Oh well, I guess it could be worse... wait a minute, it _will_ get worse, it's only half over! (Once the ground is saturated, it gets deep FAST!)

    It is nice to have hefty UPSes. :^) (Yes, the one in the back room is now elevated.)

    OK, I'll stop babbling now.

    Good luck to those in N.C.! This is a mild hurricane by wind speed, but it's dumping an incredible amount of water!

    Oh yeah - I would put up a webcam, but there isn't enough light (already tried).
  • Why not keep track of how many times we could destroy the earth with our nuclear stockpiles? Or Bill Gates' net worth? (We'd need similar software as that used to calculate the federal debt...)
  • There is a project that provides this generic framework that you're trying to describe, it's called Cosm [mithral.com].

  • Chaos is just another name for "we can't keep track of all the variables needed to compute this accurately." But with extra cycles, we have the actual computing power to do this.
    Of course, this is assuming they KNOW all the variables in the first place.
  • It's vanity, and arrogance. Mankind wants to be able to say "Mother Nature takes millennia to make changes like this, but we're doing it in decades," and then we want to be able to say "Not only that! We can make it go back the other way too!" And as an extra bonus we can say "All those stupid people are doing it! And also Very Big Corp. of America!"

    As for your anecdotal evidence: in the winter and in the summer hardly a day goes by without the weather-guy prattling on about whatever record we broke today or the one we came within a degree or half an inch of. This happens every year, whether the media has decided to dub the year an "unusually warm" year, an "unusually dry" year, or an "extremely cold" year. The weather changes from year to year, decade to decade.
    And the news media exaggerates the problem even more. They love to start off the show with footage of downed telephone lines in the winter and sweating kids playing in hydrants in the summer. And even more they like to end the show with an upbeat story about how local teens are doing their part to stop global warming. I firmly believe that if global warming is happening we haven't even noticed its effects yet.

    But then, I'm not a meteorologist, so I could be completely off, but this is how the whole situation appears to me. It just doesn't seem possible that people could accurately simulate what's going on in our atmosphere, when the local news can't seem to simulate more than a day and a half into the future.
  • This is not so much a comment about the truth to global warming - but rather a comment on the slashdot readers and the public in general.

    On the one hand, people are criticizing the attempt to make a valid model on the grounds that any model is highly non-linear and chaotic, and hence a small perturbation of the initial parameters will result in a larger non-linear effect as time increases.

    Then we all turn around and claim that global warming ( a potential increase in the average temperature of the earth's atmosphere ) will result in the average temperature of place X increasing ( like Canada ).

    The more likely outcome is that the additional energy will result in the climate changing in unexpected ways. The winters may get colder in some places. Miami might get snow. There might be an increase in hurricanes. The tornado belt might move north. The ice caps might just melt. Perhaps the summers will be warmer. Perhaps this will have the bizarre effect (as predicted by some models) of throwing the planet into a permanent ice age. The grape-growing season in France might be shorter - or longer. Global weather patterns are very complex. And all we have to go on is a pile of collected data.




  • We are Borg. You will be assimilated. Resistance is Futile.


    Really off topic, but I wonder if this is how the borg got started? :-)
  • So we are all leaving our computers on to calculate this stuff... Let's say we have 5000 participants over 1 year (even though they say they want people for 50 years)... computer + monitor == about 200 watts... that's 200 joules per second... that's like 3x10^13 joules per year... energy sourced primarily from the burning of carbon-based fuels... What does this relate to in terms of temperature rise?

    But hey, I leave my computers on all the time... I'm a slave to uptime.
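    A rough back-of-envelope for the question above (the 0.6 kg of CO2 per kWh grid figure is an assumption, not a measurement):

    ```python
    # 5000 machines at ~200 W for a year: total energy and a crude CO2 estimate.
    participants = 5000
    watts_per_machine = 200
    seconds_per_year = 365 * 24 * 3600

    joules = participants * watts_per_machine * seconds_per_year
    kwh = joules / 3.6e6                 # 1 kWh = 3.6e6 J
    co2_tonnes = kwh * 0.6 / 1000        # assumed 0.6 kg CO2 per kWh

    print(f"{joules:.2e} J per year, about {kwh:.2e} kWh, about {co2_tonnes:.0f} t CO2")
    ```

    That works out to roughly 3x10^13 joules, which matches the figure above; the temperature-rise question is exactly the kind of thing the model itself would have to answer.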
  • by Anonymous Coward
    Check out Junk Science [junkscience.com].
  • Are the models from 1950 accurate enough to use? Weather technology wasn't as good back then, I'm sure. I mean, those guys had to cook SpaghettiOs on the stove.
  • I hope that this thing takes the better part of the year to finish and when it's all done, it shows global warming for the farce that it is.

    I wonder if that would cause them to brush the results under the table and forget them.

    -Augie
  • As I recall, almost all of the long-term atmospheric models predicting significant global warming assume that all clouds are opaque, and therefore will enhance the greenhouse effect. Since clouds vary in the amount of long-wave radiation that they let escape back into the atmosphere, the models will be quite pessimistic compared to reality.
    As for the evidence of global warming, we have been keeping weather records for an incredibly small amount of time compared to the life of the earth. Most weather stations are at airports, so these records will have to be shorter than a century. The thermometer itself is a relatively recent invention compared to the life of the earth. So, the current trend is way too short to use for long term prediction.
    We also have very few observing stations over the open ocean. We have stations on a lot of islands, but the center of the North Pacific is uncovered except for ships, and most of these do not take upper atmospheric observations. Satellite technology has improved this tremendously, but it doesn't substitute for a surface and/or upper atmosphere observing station. Who knows, perhaps tracking open-ocean weather would have resulted in the average trend being toward global cooling?
    I personally don't believe that scientific attention is proof of an effect. Centuries ago, a lot of attention was paid to how the sun and other planets circled the Earth.
  • by mcc ( 14761 ) <amcclure@purdue.edu> on Friday October 15, 1999 @07:22PM (#1610299) Homepage
    With so many of these projects (RC5, SETI, this, distributed.net) it seems sad that every time someone does a new distributed computing system they have to reinvent the wheel and make a totally new client from scratch.

    Would be nice if someone would just write a protocol and open-source client for the darn thing. Like, some way to send out abstract work blocks of various kinds over a network, processed with spare cycles, and have those work blocks returned.
    It would have to be something very abstract, or at least extensible, so you could easily swap projects promiscuously without downloading and configuring a new client. And it would have to be processor-agnostic -- maybe just put in a bunch of mathematical instructions (though a VM of any kind would just be stupid) and a standard way of parsing them, although you'd want to put in hints (vector instruction here, floating-point instruction here) so that things like MMX and AltiVec and 3D cards could be used to their full potential. And there would have to be _very_ clearly defined limits on the way they can access the hard drive, and ways to make sure it doesn't interfere with other applications. And it might need to try to make sure it only consumes network bandwidth if it isn't taking away bandwidth being used by something else. I dunno how you'd deal with the question of whether the work blocks are getting returned correctly; the only thing I can think of is extreme redundancy: send out all work blocks two or three times, and if there's _any_ difference in the returned blocks, redo it and maybe put the computer that returned the bad block on a list of computers not to trust (see the sketch at the end of this comment). So it would be kinda complex to make a generic protocol instead of a specific implementation...
    But wouldn't that be COOL?

    Especially if it wasn't just an internet thing, but a generic network thing; we have a _lot_ of computers at the school just sitting there all day waiting for someone to ctrl-alt-del and put a username in the login box. Would be nice if they could be put to some meaningful use in their downtime... like just saying on one computer "rip this mp3 for me", or queuing up an entire batch of mp3s and 3D renderings or whatever, and having all the computers on the network not in use do the work while I continue using the computer I'm on. 'Course the network admin might not be too happy about his entire network being turned into an mp3 encoder, but hey, he doesn't need to know about it. It's his own damn fault for using NT, esp. without reading the damn manual..

    distributed.net seems to be using the more use-a-specific-client-for-each-specific-task tactic, but maybe they could be raided for useful source code..

    (p.s. mp3s are a hypothetical example, of course.. i wouldn't actually do that, that would be illegal! Riiight..)

    -mcc-baka
    INTELLECTUAL PROPERTY IS THEFT
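    Here's a rough sketch of the redundancy scheme described in this comment -- send each work block to several clients, accept it only when the returned results agree, and distrust clients that disagree. Every name and structure here is invented for illustration; it is not an existing protocol or any real client's code.

    ```python
    from collections import defaultdict

    REDUNDANCY = 3                            # how many clients get each block

    class Coordinator:
        def __init__(self):
            self.results = defaultdict(dict)  # block_id -> {client_id: result}
            self.untrusted = set()

        def submit(self, block_id, client_id, result):
            self.results[block_id][client_id] = result

        def resolve(self, block_id):
            """Accept a block only if all redundant copies agree; else reissue."""
            answers = self.results[block_id]
            if len(answers) < REDUNDANCY:
                return None                   # still waiting for copies
            values = set(answers.values())
            if len(values) == 1:
                return values.pop()           # unanimous: accept the result
            # Disagreement: distrust the minority and reissue the block.
            majority = max(values, key=lambda v: list(answers.values()).count(v))
            for client, value in answers.items():
                if value != majority:
                    self.untrusted.add(client)
            del self.results[block_id]
            return None

    coord = Coordinator()
    for client, value in [("a", 42), ("b", 42), ("c", 41)]:
        coord.submit("block-1", client, value)
    print(coord.resolve("block-1"), coord.untrusted)   # None (reissued), {'c'}
    ```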
  • by Anonymous Coward
    what the heck makes these people think we'll be able to talk to zorg the martian who communicates by shooting streams of cyanide out of his eyeballs

    Yeah, but all you need to do is hook Seti@Home up to an iSmell, the networked smell-o-vision. Nothing says "We come in peace" like a well modulated puff of bitter almonds.
  • For your interest, the technical term in atmospheric studies is ensemble forecasting; Monte Carlo is the generic class of techniques for sampling a large state space, which is not strictly the same as ray-casting for scene generation. If you've got the inclination, check out

    Prospects and Limitations of Seasonal Atmospheric GCM Predictions, Kumar and Hoerling, Bulletin of the American Meteorological Society, Vol 76(3), March 1995.

    which discusses the predictability of time-averaged GCM runs. It all comes down to the spatial-temporal correlations, as most of the time people are more interested in the anomalies (i.e. extreme events) rather than the natural variability. The science is still out on this area.

    You don't know what the f*** you're talking about and I claim my free cigar, troll!
    You're quite welcome to smoke your own cigar.

    LL
  • A future project:

    Sometime in the not so near future (say, ten years from now when distributed.net cracks rsa64) we might have a desire to simulate large portions of the human brain. With network response times on the order of msecs it seems possible to locally simulate a tiny bit of a brain and compile the results to meet or exceed the processing power of a human's 10^11 neurons.

  • SETI@Home - we come in pieces.
  • I've run some calculations on all this myself & am now making the formula available to all. It is platform-independent & is easily programmed into even a spreadsheet. Here it is:

    Casino-21 Project + distributed.net + GIMPS + The Federal Deficit Counter + Slashdot + World Resources Destructo-meter + The Salvation Army + d.net's OGR project + NASA + SETI@home + 1 = fortytwo
  • Hindsight is an exact science.
  • Austin, TX. Hot, dry, 75-90% of the time. Locals can forecast that with 75-90% accuracy. Houston, TX. Hot, humid, 90% of the time. Duh.
  • While this may be a bit off-topic, with the mention of distributed.net [distributed.net], I thought some people would be interested in helping the organization. By simply signing up to iGive (http://www.iGive.com/html/ssi.cfm?cid=1098&mid=91085), distributed.net gets $10. Then, a certain percentage of everything you buy goes to distributed.net, and even clicks earn money for them.

    While I know that I can't donate money to them, it is nice to be able to pay them back somehow. If others are interested in helping the organization, join iGive now [igive.com]!
    --
    ZZWeb.net Web Hosting - http://www.zzweb.net
  • It is in fact true that around 1970 *some* climate scientists were voicing *some* concern about an ice age being about due. This was based mostly on statistical arguments. There was also an issue as to whether human particulate emissions might be hastening the trend. I won't burden this list with why these statistical arguments were quite weak. The press ran with this story, and most public libraries over 30 years old have a scare volume called "The Cooling", by L. Ponte, which tried to get the public in a panic about this.



    Around 1980, physical arguments replaced handwaving. People started to do the numbers, and found that human greenhouse emission warming was likely to overwhelm human particulate emission cooling as well as natural cycles.



    It is currently the overwhelming consensus among the relevant sciences that a doubling of background CO2 (expected in the next few decades) will amount to a disequilibrium of 4 watts per square meter heating at the surface, worldwide, until climate changes to restore the equilibrium.
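    For reference, the widely used logarithmic fit for CO2 forcing (the 5.35 coefficient is the commonly cited value, not a number from this post) gives roughly the figure quoted above:

    ```latex
    \Delta F \approx 5.35 \,\ln\!\left(\frac{C}{C_0}\right)\ \mathrm{W\,m^{-2}},
    \qquad
    \Delta F_{2\times \mathrm{CO_2}} \approx 5.35 \ln 2 \approx 3.7\ \mathrm{W\,m^{-2}}.
    ```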



    The exact ways in which climate will change are the subject of the modeling efforts. Whether climate will change substantially is no longer an open question among actual scientists, though there are lots of paid flaks in lab coats, some showing up here at /., happy to try to convince you otherwise.



    All current evidence shows that the intuitive conclusion that an extra 4 watts at the surface will lead to warming at the surface is correct, though there is about a factor of 4 argument about the amplitude of the change. The serious discussion, were society capable of such a thing, ought to be about how much we care.



    Finally, this is not a small perturbation. One simulation has the climate of southern Wisconsin in 2070 about equal to that of Oklahoma today. The simulations are crude guesses in some ways, but they are the best we can do. The rates of change envisioned are about an order of magnitude larger than those seen in nature.

  • Climate prediction? Breaking encryption keys is one thing, but climate prediction? I'd rather have my computer idle than even waste the electricity predicting climates!!!
  • Your definition of chaos merely means that you do not understand the topic. Dr. Lorenz, of MIT, proved that the problem has nothing to do with the 'number of variables' but with the idea of non-linear reactions. The "straw that broke the camel's back" sort of thing, or the butterfly flapping in Japan that causes a tornado in Nebraska. Nova did an excellent piece on the subject for laymen and you can probably find it at your local library. It is worth the hour.

    People who claim to be able to model the weather for predictions months or years into the future fall into two categories: those who don't understand chaos (or modeling) and those who have a political agenda. The former are merely wasting money and time. The latter sponsor agit-prop events like the 2000 'environmental scientists' who signed the "Global Warming" letter a few years ago. It turns out, when their credentials were checked, that only 15 were actually weather scientists. Most were just the usual assortment of leftist cause-joiners. A couple of months later several thousand (near 10 if I remember correctly) actual weather scientists, meteorologists, etc., signed a letter stating that Global Warming was a myth. It got no press coverage by the major media.
    It is interesting to note that WATER vapor has SEVEN TIMES the greenhouse power of CO2.
    It is also interesting to note that the same folks who are pushing Global Warming scenarios were the ones pushing Nuclear Winter twenty years ago. Then we were going to freeze to death if the H-bomb or Population Bomb went off. Their solution to both "problems" is identical - nanny-government socialism and removal of individual liberties, for the children, of course. Same folks, same tune with different words but the same agenda.
    For folks who are really interested, there are several books on chemical cycles and the global environment (published before the politically correct virus infected academia) which detail the players: CO2, water, bicarbonates in the ocean, photosynthesis, etc., and give reaction rates (k values), with estimates for cycle times. Reputable books also state their assumptions and define their constants, citing reproduced experimentation.
    Don't let them yank your chains, folks.
    JLK

  • There is a glaciation cycle, and the earth is getting warmer. Greenhouse gasses in the atmosphere do heat the earth. The effect of people burning carbon is not yet known, but it is obvious that it could be a disaster. Here is a link for snobbish American scientific types: http://web.mit.edu/globalchange/www/
    Our hero Bruce Sterling is a bigtime FUD'er on this issue, and he didn't answer my question last week!
  • Just how many different major projects can the market currently support? There's a finite number of computers and an infinite number of tasks to which they can be applied. Since there seem to be more and more of these massive volunteer distributed projects coming up, is it possible that the user market will become saturated any time soon?

    I personally think it is possible, but that it will not happen within the next couple of years. After that, however, the amount of processing power that these distributed projects can amass will probably remain relatively constant (increased processing power offset by fewer users). On the other hand, the computer-user explosion might continue for many years to come. So making a forecast on this is like basing a country's budget on a five- or ten-year financial forecast.

    Sigh.

  • There's a difference between SETI@Home, and this. Climate models are quite dependent on the initial conditions you give them (read chaotic). You could dedicate all the computer power in the world to something like this; you'll extend the time frame we can give reliable results somewhat, but after a while, they'll be just as meaningless as if you used a single computer.

    I think I'll stick to some other things like this.

  • by Joe Rumsey ( 2194 ) on Friday October 15, 1999 @02:44PM (#1610317)
    So, you get to use your "spare" cycles to help determine just how much the energy you're burning is going to damage the environment. Hmm. I'd like to see some estimates of which will do more good for the environment - participating in this program, or just turning your computer off.
  • I disagree, both SETI@home and climate prediction are worthy causes. Neither is more important than the other.

    That's my $(2^4*3+1/7%3*2/100)
  • From the page: "In brief, the experiment calls for you to install a unique, state-of-the-art climate model on your home PC and keep it running, possibly for a year or more."

    Wow, that's quite a commitment. I haven't made that kind of commitment to some programming languages. ;)

  • by Hrunting ( 2191 ) on Friday October 15, 1999 @02:46PM (#1610320) Homepage
    • The Federal Deficit Counter
      We're above 4 trillion dollars and the computers designed to consistently add US$0.01 to the total were built back in 198x, so they must be having problems. A few more CPU cycles and our deficit counter can go up faster than ever!
    • Slashdot
      Ever notice how Slashdot slows down sometimes? Ever think to yourself, "Man, if my spare CPU could go to speeding this bad puppy up"? Well, now you can speed up the experience with the only distributed client that actually sends CPU processing to Slashdot (warning: still in beta, no ETA).
    • World Resources Destructo-meter
      Ever wonder how much consumable energy we have left? Well, with the World Resources Destructo-meter, you can help keep track of how many of our precious resources are left! As an added bonus, the more and more computers use this program, the more and more energy is used and thus, the program counts down faster and faster. It's fun for the whole family!
    • The Salvation Army
      Ever given something you don't want to the Salvation Army? Ever wondered how anyone could use whatever it was you gave them? Now you can do the same thing with your CPU cycles! (NOTE: the Salvation Army is also selling used CPU cycles in their thrift stores for older computers).
    • NASA
      NASA may not seem like it needs more CPU cycles, but, as current events recently showed, the CPU-intensive conversions from standard to metric and back again sometimes hit a brick wall. Donate CPU to NASA and make sure that we don't lose any more multi-million-dollar probes!


    Brought to you by The Computer(tm), now with new games that you play when you're not using your computer.
  • by LL ( 20038 ) on Friday October 15, 1999 @02:50PM (#1610321)
    While it may be hot and sexy to participate in a feel-good global climate project, I'd like to see more of the scientific methodology first. Anyone can run Monte Carlo simulations (in fact too many do so for market forecasting) but the underlying science still has to be validated. While it may save them buying a Cray or ten, will it achieve any useful results? I recall a project which simulated the effect of climate change across a forested German countryside, and what eventually hit the papers bigtime was the most extreme scenario, where all the trees were killed off. Ensemble forecasting can usually pick the extreme events but the normal ones are trickier to analyse. Extrapolate this across the globe and you will always find a scenario where you are personally affected (wheat belts moving into Canada, sea levels rising, higher winter fuel costs, etc). There is a very good reason why science has to go through peer review first, and to be especially sceptical of simulated results which make a lot of simplifying assumptions (which would be necessary to fit into a PC's memory).

    So what will the climate be like next decade? All anyone can really say is that it might change. Any attempts at scare-mongering or trying to protect vested interests will only be a distraction from putting in the fundamental research needed to gain a better understanding of the world's climate dynamics.

    LL
  • by yoshi ( 38533 ) on Friday October 15, 1999 @02:54PM (#1610322)
    Let's look at the various distributed computing projects:
    • SETI@home [berkeley.edu] : a neat, geek-friendly, worthy cause, but hardly practical, and they have plenty of CPU - they don't need mine. This project is currently running, and while they had growing pains a while ago, those have been solved. One problem remains (as far as I know): the client still slows to 1/3 speed if you have the visuals turned on.
    • distributed.net encryption cracks [distributed.net] : for a long time, the most practical distributed computing projects around, and certainly the most advanced, but I think that they (and others) have clearly proven the point re: encryption (i.e. that we need access to stronger encryption), and while we don't have perfect regulations, one more crack ain't gonna do it. These are also running, and have been for years now. They had some problems with stats a while ago, but they haven't had a problem of not sending out new work in a long time (if ever).
    • d.net's OGR project [distributed.net] : good scientific research, certainly, but really, just giving some grad student thesis fodder. You're not gonna solve world hunger or anything. Also, it isn't running yet.
    • Casino-21 Project [rl.ac.uk] : wow, some practical application. Also very different from other projects, because it is devoted not to "solving a riddle", but to predicting a complex system. Of course, it's not running yet, either.
    I'm sure there are others, but that should cover the big ones, and all of the major categories. For now, it's d.net rc5-64 for me, but as soon as Casino-21 starts, I'm going to switch. I hate to sound like a whiny earth-tone, but I have to say I think the environment is important, and certainly is much more proximate to me than aliens (I think) (I hope, too). This project may not solve global warming, but as GI Joe says, "Knowing is half the battle."

    -Yoshi

  • by bughunter ( 10093 ) <(bughunter) (at) (earthlink.net)> on Friday October 15, 1999 @02:55PM (#1610323) Journal
    I dunno. I've always been skeptical of the lack of certainty atmospheric scientists seem to exhibit with respect to their claims of global warming... first it's getting warmer, then it's getting colder. The only thing we know for certain is that we don't seem to know much for certain. (Besides, would global warming be all that bad? Ask a Canadian.)

    Take a gander at John Daly's page, Waiting for Global Warming [vision.net.au], from which you can follow links to NOAA and NASA evidence suggesting that the global warming phenomenon is really a systematic error in the climate record.

    But by all means, if you don't want to look for ET reruns of "I Love Lucy" analogues, then pitch in on a decent atmospheric model. If we're actually headed for an Ice Age, it'd be nice to know in advance.

  • Why not kill two avians with a single rochier?

    We could help map the human genome with our spare cycles, under GPL. Then we don't have to worry about Monsanto and other companies patenting segments of our genetic code ...

    [caveat - I own 500 shares of Monsanto - wanna buy some?]

  • Most modern meteorological stations are within cities, resulting in higher measured temperatures, due to residual city heat, than at stations in more isolated areas.

    Atmospheric measurements from satellites in 1998 showed very COLD temperatures---the coldest winter since these measurements began. (Found this in the NOAA climate database.)

    I'm not saying there's nothing to global warming. However we are still emerging from a mini-ice age 10000 years ago. Deserts are expanding. This is what is expected. Perhaps we're helping it along a bit.

    I wrote a paper for a class on this. If I could only find it electronically right now I'd post it.

    "global warming is not a figment of the imagination, otherwise why would there be such a stir about it, especially among the scientists who have been studying it the most? "

    Two words: MONEY and GRANTS. If your field is considered important to society, you get much more funding.

    I saw a reference (but can't recall where, Scientific American?) about atmospheric models. Until recently the effect of the CO2 storage capability of the ocean has not been considered. In addition, as someone else mentioned, clouds could not be modelled (and still cannot be).

  • I'm glad somebody mentioned chaotic behaviour. While this sounds like a noble cause, the mathematics behind the idea is not based on sound principles.

    There is no evidence that weather and climate patterns are not chaotic, and plenty of evidence that they are. No matter how fast your computer is, no matter how accurately you measure your initial parameters, no matter how comprehensive your model is, you simply cannot predict the future state of our atmosphere beyond a day or so (and usually we can't even do that). One of the properties of chaotic behaviour is that arbitrarily small variations in input parameters eventually cause perturbations that exceed the strength of the signal you're trying to predict. When those variations are introduced by measuring a physical property (which cannot be done exactly), it's just a matter of time before your prediction breaks down.

    They say they will use the data for the last 50 years to "calibrate" the model. Or use it as input for a genetic algorithm or something. But as the typical disclaimer goes, past history is not representative of future performance. Suppose that a given model predicts the past 50 years of climate reasonably accurately. I would submit that this would happen only by chance, and that the same model would be meaningless for the next 50 years.

    Why don't they try creating an algorithm that will predict the climate changes just between 1950 and 1975 (25 years)? Then see how that same algorithm does on the next 25 years (1975-present). Perhaps there will be a number of algorithms that match the first 25 years reasonably accurately, and maybe one or two that end up matching the next 25 years too. How would you have chosen the best one 25 years ago? What makes them think they can choose the best algorithm for the next 50 years, today?
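    A toy illustration of that train-on-1950-1975, test-on-1975-2000 worry (the synthetic "climate record" and the polynomial "models" are made up for the demo): a model flexible enough to fit the first 25 years very well can still do badly on the next 25.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    years = np.arange(1950, 2000)
    x = years - 1950.0
    record = 14.0 + 0.3 * np.sin(2 * np.pi * x / 60) + rng.normal(0, 0.05, x.size)

    train = years < 1975                 # calibrate on 1950-1974
    test = ~train                        # evaluate on 1975-1999

    def rmse(pred, truth):
        return np.sqrt(np.mean((pred - truth) ** 2))

    for degree in (1, 8):                # a simple model vs. an over-flexible one
        coeffs = np.polyfit(x[train], record[train], degree)
        fit = np.polyval(coeffs, x)
        print(f"degree {degree}: train RMSE {rmse(fit[train], record[train]):.3f}, "
              f"test RMSE {rmse(fit[test], record[test]):.3f}")
    ```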

  • The farce that it is? Right, perpetrated by the political left in order to curb the profits of the oh-so environmentally friendly corporations of the world? *lmao*

    Why not sign up for the project? Then if they do shelve the results at the end of the project, announce it back here. If they don't, announce the results anyways.

    Or is it a case of "Oh, they just fixed the data to show that the globe is warming up, buncha pinkos..."

    -Chris
  • Because of the "sky is falling" reactions I've seen in this forum, I've gathered a few interesting links. Have a look. (/. appears to have mangled the text in my message, but the links all work, although they look funny).

    http://science.msfc.nasa.gov/newhome/essd/essd_strat_temp.htm [nasa.gov] - globally averaged atmospheric temperatures - troposphere and stratosphere.

    http://science.msfc.nasa.gov/newhome/headlines/notebook/essd13aug98_1.htm [nasa.gov] - Unexpected results from satellite measurements.

    http://www.cgcp.rsc.ca/english/html_documents/whatis.html [cgcp.rsc.ca] - climate change in general, including long term and "global warming". General background.

    http://www.ncdc.noaa.gov/onlineprod/prod.html [noaa.gov] - the blank areas on the map on the left are where there are no measurements. Most of the ocean... Globally averaged atmospheric temperatures in the map on the right. General upward trend, but a LOT of fluctuation.

  • How the heck did the quite witty post above get marked down to 0, and listed as "Troll" no less?


    Here's the joke explanation for our not-so-bright moderators: Seti@home sends fragments of the data they've collected to the clients for processing. The clients are looking for signals in these PIECES of data. This combines with the play on "We come in PEACE".


    Idiot moderators are even more annoying than the J. Random Trolls. SOMEBODY fix the score on the above post.
  • I don't jump on conventional wisdom bandwagons.

    I've seen more than enough reports portraying global warming as a farce that I refuse to participate in any of the hype and meaningless "feel good" projects surrounding it. I will continue to drive my internal-combustion-engine-powered car, despite Al Gore's claim that it's the single most dangerous weapon in the world. I will continue to be awed when a volcano erupts, despite global warming wackos' claims that Mount Pinatubo's eruption caused a 5% shrinkage in the ozone layer "hole." (If that were true, we'd have zero ozone, given the age of the earth and the number of volcanic eruptions.) Etc. etc.

    Don't believe the hype.

    -Augie

    P.S. And thanks to modern firefighting techniques, we also have more trees in North America now than ever. Forest fires don't burn out of control anymore and don't eat up states' worth of trees. But that's a whole 'nother topic.
  • I think many people are suspicious about the claims for global warming because of the politics and agendas of its proponents.

    Too many politicians and activists have latched on to it as a means of pushing their pet causes. Does anyone think that Al Gore really is interested in scientific truth?

    Another problem is the past history of "environmental crises" that got a lot of press and rhetoric, only to fade away when the disaster failed to materialize.

  • Especially if it wasn't just an internet thing, but a generic network thing; we have a _lot_ of computers at the school just sitting there all day waiting for someone to ctrl-alt-del and put a username in the login box. Would be nice if they could be put to some meaningful use in their downtime... like just saying on one computer "rip this mp3 for me", or queuing up an entire batch of mp3s and 3D renderings or whatever, and having all the computers on the network not in use do the work while I continue using the computer I'm on. 'Course the network admin might not be too happy about his entire network being turned into an mp3 encoder, but hey, he doesn't need to know about it. It's his own damn fault for using NT, esp. without reading the damn manual..

    Someone needs to read up on Beowulf. It's already there, for crap like MP3 ripping, 3d work, etc...

    (p.s. mp3s are a hypothetical example, of course.. i wouldn't actually do that, that would be illegal! Riiight..)

    Umm, MP3s aren't illegal, distributing them is. If you own the CD, you can use MP3s of the songs.

  • Or am I missing something?

    Nope, makes perfect sense to me. The later addition of parameters to gradually increase accuracy sounds good in theory, but in a chaotic system (such as the atmosphere) a seemingly small effect can make a dramatic difference (though the butterfly causing a hurricane is a ridiculous exaggeration).
    My problem with these climatic models is that they can always be altered to fit the last 50 years perfectly, yet be completely wrong in every other aspect.
    For instance (random, imaginary example), if the average global temperature roughly follows a sine wave with a period of 10,000 years, at any one point you could look at the last 50 years and see a perfect fit for a straight line suggesting a rise or decline in average temperature, while if you had the data of the last 20,000 years instead of 50, you would see no change at all.

    -----
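    A quick numeric version of that sine-wave example (all numbers invented): a 50-year slice of a 10,000-year cycle is essentially indistinguishable from a straight-line trend.

    ```python
    import numpy as np

    years = np.arange(0, 10000)
    temperature = 14.0 + 1.0 * np.sin(2 * np.pi * years / 10000)   # slow cycle

    window = slice(0, 50)                       # look at just 50 years of it
    slope, intercept = np.polyfit(years[window], temperature[window], 1)
    fit = slope * years[window] + intercept
    worst = np.max(np.abs(fit - temperature[window]))

    print(f"apparent trend: {slope * 100:.3f} degrees per century")
    print(f"largest deviation from the straight-line fit: {worst:.5f} degrees")
    ```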
  • to a fellow member of the Hurricane of the Week Club.

  • A very carefully selected list of sites indeed.

    The graphs at that Canadian site are particularly inexcusable - were they drawn freehand? For those of you who'd like to see a peer-reviewed graph of the recent global mean surface temperature (published in _Nature_, and showing confidence bounds), see this UMass press release [umass.edu] and especially this figure [umass.edu].

  • A consensus among American climatologists may be disputed by European physicists, and vice versa.

    The change in climate is not disputed. What is disputed is the casual certainty with which it seems to be blamed on man-made CO2 emissions.

    I am not an anti-environmentalist. I feel it has been proven beyond any doubt that the hole in the ozone layer is caused by human pollution. Acid rain is very real too, and almost completely caused by the burning of fossil fuel.

    However, casually blaming the rise in global temperature on the rise in CO2 levels raises questions. In the frequency band in which CO2 absorbs infrared (heat) radiation, absorption in the atmosphere is already pretty much 100%. Furthermore, the IR absorption of CO2 must surely be dwarfed by the absorption by water, both on the earth's surface and in the atmosphere.

    A few years ago, as I'm sure you know, a Danish team showed that the change in global temperature had a very strong correlation with changes in solar activity.

    The strong correlation between CO2 levels and global temperature found in geological records makes no statement about cause and effect. Rising temperatures could cause CO2 levels to rise (seems logical), or both could share a common cause.

    If you wish to flame, convert or enlighten me, rve can be mailed @ null.net
    -----
  • IANA climatologist, but I have been wondering about something for a long time. Every mass-media scare story I see about global warming eventually focuses on one thing: coastal cities sinking into the ocean, killing millions of people. Pictures of people surfing amongst the skyscrapers of Miami, Boston, New York, etc. It seems to be accepted that when global warming occurs, the oceans will rise. Now, my question is: why?

    The polar ice caps are made of ice. Huge chunks of ice, with a few rocks and other miscellaneous debris thrown in. The vast majority of these ice blocks are underwater, which means that when they melt, they will probably lose at least as much ice to the ocean from beneath the sea as from above. Since ice has a lower specific density than water, the net effect will be that there will be less liquid volume in the ocean after the ice melts. If this is correct, then the oceans will recede. (Confirm this yourself by dropping a bunch of ice cubes in a cup of warm water and measuring the water level before and after they melt.)
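    A quick numeric check of the floating-ice part of this, assuming fresh water (ice resting on land, as in Greenland or Antarctica, is a separate contribution):

    ```python
    # A floating block displaces its own weight of water, so when it melts the
    # meltwater exactly fills the volume it had displaced.
    rho_ice = 917.0      # kg/m^3
    rho_water = 1000.0   # kg/m^3

    ice_volume = 1.0                              # m^3 of floating ice
    ice_mass = rho_ice * ice_volume               # 917 kg
    displaced_volume = ice_mass / rho_water       # water pushed aside while frozen
    melt_volume = ice_mass / rho_water            # same mass, now as liquid

    print(f"volume displaced while frozen: {displaced_volume:.3f} m^3")
    print(f"volume of the meltwater:       {melt_volume:.3f} m^3")
    # Both come out to 0.917 m^3, so melting floating ice leaves the level unchanged.
    ```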

    Can someone in this forum please explain to me why I am wrong? I know that the obviously intelligent and well-funded environmentalists would never lie to me about the effects or causes of global warming, so...

    thanks in advance.

    Scudder
  • We don't care if they shoot cyanide, just as long as they emit radio signals.
  • Not necessarily. Unless they tune the algorithm badly to make it reproduce today's climate well, you can reasonably assume that it (the algorithmic model) is generic enough to apply to the future as well. Of course that would be modulo any catastrophic events between now and then... (massive volcano eruptions, forest fires, etc.)
  • MSNBC also has a similar story about the project http://www.msnbc.com/news/322736.asp [msnbc.com]

  • by crayz ( 1056 )
    I got a client to figure out that estimate of which is better, it'll be out RSN
  • This shows only that projects like GIMPS (which IMHO beats distributed.net), distributed.net and SETI@home have done part of what their original intention was: lead the way for other distributed efforts.

    Without having looked at the article, though, it looks like this will be much less accessible for the masses. GIMPSers (check http://www.mersenne.org [mersenne.org] if you're interested) have months or even years to complete a single assignment -- this one sounds like it will need the data in time. In other words, only 24/7 (or almost-24/7) online computers will be able to participate effectively. I wonder how it will cope with having a variable amount of information available as well. OK, off to read the article :-)

    /* Steinar */
  • Hey, if we all donate our machines to helping each other compute, pretty soon we'll each, on average, be able to tap into all of the computing power that.... hmmm... sits in front of us on our desks.

    well, I do use a text console a good bit, so you can have the spare DSP cycles in my video card.

  • Even if they do use the past 50 years as an example, there are still so many variables, and just because it's been one way for the past 50 years doesn't mean it'll be the same for the next 50; if that were the case, then there wouldn't be any question of what the climate will be in 50 years. What about the whole "butterfly effect" thing? I mean, yeah, it's a far-fetched idea, but the point is every tiny little variable in the long run can have a huge effect on what the weather is going to be... it's just a guess. Basically, at the end of this project they're going to be like the weatherman they describe who has different weather reports that contradict themselves, and shrug their shoulders and guess.
  • How about a cash incentive or a tax reduction? If I'm gonna support a government-funded project, I should get some government funding myself.
  • The Experimental Rocket Propulsion Society (ERPS) [erps.org] of Silicon Valley has been planning some similar distributed environments for flight simulations of its amateur rocket designs ever since it was announced that SETI@Home didn't have enough work for all the cycles offered to them. We expect a much smaller audience of people who will volunteer to help us but it should still be better than entirely running them on our own machines.
    Ian Kluft

    ERPS Flight Control System team leader
  • by Anonymous Coward
    Don't worry. My computer is powered by my body's own electric field. Unfortunately, I'm trapped in a computer-generated virtual world.
  • I think the Great Internet Mersenne Prime Search [mersenne.org] (GIMPS) should be included in this list. We (I'm a part of it) have a couple thousand users going, and we're one of the few projects who actually have results to show (the three largest primes known to mankind, in just a few years). It has some advantages over some of the others (like very low bandwidth requirements), but it has some disadvantages as well (no fancy GUI, just a working one) -- but this is not intended as a comparison between those five.

    /* Steinar */
  • If you're really concerned about global warming, it seems to me that shutting down your computer at night (or the monitor, at least) would be much more helpful.

  • SETI doesn't rely on us being able to talk to them; it just relies on us being able to detect radio emissions and demonstrate whether or not they contain patterns that indicate they weren't generated by natural processes.

    Figuring out how to communicate with them is a totally separate process that might or might not follow detecting them.

    Really, communicating with them isn't the point; any such communications would take many years, and by the time anything useful could be said we'd have probably figured it out on our own.

    SETI is about proving zorg the martian exists, not asking him how the wife and larvae are doing.
  • You're not taking into account that the project is just getting off the ground. Considering it was around two years for SETI@home to go from the first public announcement to the release of version 1.0 of the application, the system requirements listed could very well represent an old box in the corner gathering dust by the time this project is ready to get running.

    What I found surprising was that the requirements survey didn't ask your connection speed. They seem to assume that by the time they launch we WILL all be on a high-speed network. They mention several times that they will likely have to send out the application on CD-ROM unless you have a very fast connection, but if it does take two years to launch, hopefully most people will have ADSL or cable modem access, if they aren't connecting via a work or school account.
  • Just like the carefully selected statistics often shown to support global warming.
  • by Anonymous Coward
    More info about things like global warming can be found at junkscience.com [junkscience.com]
  • by Alan Shutko ( 5101 ) on Friday October 15, 1999 @03:46PM (#1610367) Homepage
    This isn't a case of donating spare cycles to a project. This is a case of donating spare COMPUTERS.

    Take a quick look at the registration form. The minimal machine they're interested in has a 400MHz CPU, 32MB RAM, and 256MB of disk space they can devote to the task. They'd prefer that you have a 600MHz machine with over 2GB to devote. They say themselves that the system requirements are about equivalent to a state-of-the-art game... a game you'll be running for about a year.

    Unlike distributed.net, I'll bet this is going to have serious repercussions on your usage of your own computer. (After all, it'll take quite a bit to swap out when you come back to your machine.) And what do you get for this? Well, in 50 years we'll know if your random simulation was better than the other random simulations. At which point...

    If you'd like to participate in this, you might as well FedEx your machine to them so that they can put it on a high-speed network and Beowulf it. With the kind of data they need, the network latency is going to be bad... (distributing things by CD?!)
