Supercomputing and Climate Research 117
Mr. Obvious writes: "It must have already been submitted, since the article is over a day old (gasp!) but there's a good
round-up
on the state of the art in supercomputing, as it applies to modeling the weather --- that is to say, modeling the planet --- over at the NYTimes. They go into lots of interesting things concerning how hard it is, what progress has been made lately, why the US researchers feel themselves to be hamstrung in comparison to those in Europe or Japan, and even into some things you probably didn't know (I didn't, at least) about the weather."
Re:Weather and Chaos Theory (Score:1)
Also, I doubt that an adaptive grid would be necessary. Most of the medium scale effects (smaller than the model grid, but larger than say, convective storms) are related to terrain, proximity to water, etc. They're static, so you could handle these with just a fixed variable density grid rather than an adaptive one. Don't ask me why the models don't use a variable density grid already. I wonder about that myself.
Re:Supercomputer Envy (Score:1)
That's not true. Discretizing and solving PDEs using finite differencing on a regular grid boils down to little more than solving a linear algebra problem iteratively. And that's exactly the sort of thing LINPACK does. Besides, vector computer architectures are specifically designed for doing fast matrix-vector operations, which happen to be at the root of both LINPACK and weather and climate prediction. So if a distributed, message-passing architecture like those at the top of the list can run LINPACK faster than a vector architecture, then it can run weather & climate models faster too, assuming they are coded to take advantage.
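To make the connection concrete, here is a minimal sketch (my own toy example, not any production weather code): a 1-D equation u''(x) = 1 discretized with finite differences and solved with Jacobi iteration. Each sweep is exactly the neighbor-averaging, matrix-vector style work that LINPACK-class benchmarks and vector machines are built around.

```python
# Toy illustration (not real weather code): discretize u''(x) = 1 on a
# 1-D grid with finite differences, then solve the resulting linear
# system iteratively with Jacobi sweeps.
n = 50
h = 1.0 / (n + 1)       # grid spacing
u = [0.0] * n           # initial guess; boundary values u(0) = u(1) = 0

for _ in range(20000):
    # Pad with the fixed boundary values, then do one Jacobi sweep:
    # each point becomes the average of its neighbors minus the source term.
    padded = [0.0] + u + [0.0]
    u = [0.5 * (padded[i] + padded[i + 2] - h * h) for i in range(n)]

# The exact solution of u'' = 1 with u(0) = u(1) = 0 is u(x) = x(x - 1)/2,
# and second-order differencing is exact for quadratics, so the iterate
# should converge to it at the grid points.
err = max(abs(u[i] - 0.5 * ((i + 1) * h) * ((i + 1) * h - 1)) for i in range(n))
print(err)
```

Real models are 3-D and far more elaborate, but the inner loops have this same stencil/matrix-vector shape, which is why LINPACK numbers are a reasonable (if rough) proxy.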
Yes, those are important issues, but the LINPACK results prove that they can be overcome. IMO, the problem is entirely a matter of programming. I have personally endured the pain of working with code written by the guys in Boulder, Princeton, and DC. And, unfortunately, it has been the sorriest code I've ever seen from professional programmers. These people can't even be bothered to learn basic programming practices, let alone modern parallel architectures.
So, I certainly agree that it is much easier to program for a shared memory vector computer. But that's a poor excuse. The people in the weapons simulation business and the aerospace business have learned how to take advantage of other architectures. It's about time the climatologists learned how to do it too.
weather@home? (Score:1)
Information wants to be registration free (Score:1)
Climate Research: The Devil Is in the Details
By ANDREW C. REVKIN
Multimedia: http://www.nytimes.com/images/2001/07/03/science/
In 1922, Dr. Lewis Fry Richardson, a British physicist with a penchant for grand ideas, described how to forecast the behavior of the atmosphere.
He had details wrong but the basic concept right: a suite of equations that, when applied to measurements of heat, cloudiness, humidity and the like, could project how those factors would change over time.
There was one grand problem. To predict weather 24 hours in advance, he said, 64,000 people with adding machines would have to work nonstop for 24 hours.
Dr. Richardson pined for a day "in the dim future" when it might be possible to calculate conditions faster than they evolved.
That dim future is now. But while much has changed, much remains the same.
Supercomputers have answered Dr. Richardson's plea. Weeklong weather forecasts are generally reliable. But long-term climate predictions are still limited by the range of processes that affect the earth's atmosphere, from the chemistry of the microscopic particles that form cloud droplets to the decades-long stirrings of the seas.
With its oceans, shifting clouds, volcanoes and human emissions of heat-trapping gases and sun-blocking haze, earth remains a puzzle, said Dr. Michael E. Schlesinger, who directs climate research at the University of Illinois at Urbana-Champaign.
"If you were going to pick a planet to model, this is the last planet you would choose," he said.
So even as the evidence grows that earth's climate is warming and that people are responsible for at least part of the change, the toughness of the modeling problem is often cited by those who oppose international action to cut the emissions of heat-trapping gases.
And while American research centers once dominated this effort, they have recently fallen behind others overseas.
By many accounts, the dominant research effort is now at the Hadley Center for Climate Prediction and Research, 30 miles west of London. More than 100 scientists there are using extremely powerful computers just to explore long-term questions. Several recent studies by the National Academy of Sciences found that other countries had provided superior supercomputers for advanced climate research.
The academy found that efforts in the United States were hurt in the 1990's by a Commerce Department tariff of 450 percent on Japanese supercomputers. The tariff was lifted this spring.
The results are vexing for American scientists, said Dr. Maurice Blackmon, director of climate studies at the National Center for Atmospheric Research in Boulder, Colo.
Last week, Dr. Blackmon said in an interview, he met a climatologist from a Swiss university who was preparing to run a copy of the Boulder laboratory's most sophisticated model on a supercomputer in Bern "six to eight times faster than we can here."
"That's the definition of frustration," Dr. Blackmon said.
Even with the best computers, though, important parts of the climate puzzle still elude both the machines and the theoreticians, although progress is being made.
Dozens of mathematical models of the atmosphere and things that affect it are being applied to the problem. The most ambitious of these, about 20 or so around the world, simulate not only the air but also the oceans and, increasingly, other dynamic features of the planet: its shifting sea ice and glaciers, its cloak of vegetation, its soils.
These imagined earths are generated by supercomputers that tear through decades in a day, creating a compressed view of how the climate might behave if one influencing force or another changed.
The biggest models have improved substantially in the last few years, with many no longer requiring "flux adjustments," essentially fudge factors that were once needed to prevent the machine-generated, theoretical climates from drifting out of the realm of the possible.
The signal achievement in recent years has been the accumulation of evidence, much of it from advanced models, that rising levels of greenhouse gases in the air have discernibly warmed the planet.
But moving beyond that general conclusion presents enormous problems.
"We will of course improve our models," said Dr. Mojib Latif, the deputy director of the Max Planck Institute for Meteorology in Hamburg, Germany, "but I don't really see the biggest or most important results changing in the next 10 years."
"In terms of policy," Dr. Latif said, "the models have done their job."
But the models have not clearly answered a pivotal question: how sensitive is the climate to the intensifying greenhouse effect? In other words, how big is any coming climatic disruption likely to be?
The models still predict essentially the same wide range that was calculated nearly 30 years ago: roughly an average rise of 3 to 8 degrees Fahrenheit if greenhouse gases double from the concentrations measured before coal and oil burning and forest cutting significantly altered the atmosphere.
And that is a global prediction. When asked to predict local effects of global warming, say on the Southwest or Europe, the margins of error grow, and competing models stray far and wide.
For example, the change in climate in particular places in the models still varies markedly depending on how programmers start the simulation: what values they pick for the initial conditions on earth.
The first set of numbers plugged into the matrix of equations is always an educated guess, said Dr. Curtis C. Covey, a physicist at the Lawrence Livermore National Laboratory who compares the performance of various models.
"Can you tell me what the initial conditions were in 1850? Can anybody?" Dr. Covey asked.
In fact, some top modelers say even the most powerful simulations can be pushed only so far before they reach limits of usefulness.
Dr. Syukuro Manabe, who in 1969 helped create the first model coupling the atmosphere and oceans, said in an interview that the most advanced versions had already gone too far.
"People are mixing up qualitative realism with quantitative realism," said Dr. Manabe, who did most of his work at the Commerce Department's Geophysical Fluid Dynamics Laboratory in Princeton, N.J. He is now helping Japan create a $500 million supercomputing center in Yokohama that is expected to dwarf all the other climate research efforts.
He explained that models incorporating everything from dust to vegetation looked more and more like the real world but that the error range associated with the addition of each new variable could result in nearly total uncertainty. Speaking of some climate models, he said, "They are more caught up in trying to show what a great gadget they have than in showing how profound their study is in understanding nature."
Of course, Dr. Manabe said, the models still play a vital role in earth science, providing practically the only means of looking into the future, albeit through a cloudy lens.
And there are still many ways to sharpen the picture, he and other climate experts said.
First, there is improving resolution and speed. Though climate modelers use the same machines that help nuclear weapons designers and astrophysicists, they still face a big trade-off between detail and time.
The most advanced models consist of several hundred thousand lines of computer code that divide the air, land and oceans into a grid of hundreds of interacting boxes. As conditions change in one box, the changes ripple through neighboring boxes.
Until now, modelers had been forced to dice the atmosphere into a grid where each box was about 185 miles on a side. The best ocean models right now are composed of cubes about 85 miles across. The Hadley Center is creating a new model that will take the ocean resolution to cubes about 20 miles on a side, which is detailed enough to capture the important eddies that shunt heat and carbon dioxide from the atmosphere into the depths.
But Dr. Geoff Jenkins, the director of climate prediction at the center, noted that the three-dimensional nature of the problem meant that each doubling of resolution required a 16-fold increase in computing. In tests, Dr. Jenkins said, the new model "completely clogged up" one of the center's supercomputers.
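The 16-fold figure follows from simple counting (the reasoning is standard, though the article doesn't spell it out): halving the box size doubles the number of boxes along each of three spatial dimensions (2 x 2 x 2 = 8), and keeping the time-stepping numerically stable then forces the time step to halve as well, for another factor of 2.

```python
def cost_factor(refinement):
    """Relative compute cost when the grid spacing shrinks by `refinement`:
    refinement**3 more grid boxes (three spatial dimensions), times
    refinement more time steps, since stability ties the time step
    to the box size."""
    return refinement ** 4

print(cost_factor(2))  # doubling the resolution costs 16x the computing
```

The same scaling explains why going from roughly 185-mile boxes to the fine grids modelers want demands orders of magnitude more machine, not just a somewhat bigger one.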
Many features of the earth that are critical to climate change remain much smaller than the model boxes, so they must still be approximated.
Dr. Blackmon, at the National Center for Atmospheric Research, said features as important as California's Central Valley and the mountain ranges around it remained invisible.
"We can't tell you anything about what's going to happen there," he said. To do so would require a grid of boxes 19 miles on a side, he said. To achieve that detail would require computers 1,000 times as powerful as those at the research center.
Dr. Manabe said the goal of the Japanese project, the Frontier Research System for Global Change, was to use vastly greater computer power to accelerate model runs, doing more work in less time and providing a much finer-scale view of what lies ahead.
The center will have 5,120 linked high-speed processors, able to perform 40 trillion calculations per second. The most powerful computers currently used for climate modeling have about 1,000 slower processors and crunch numbers at about a hundredth of that speed.
But more brute computing power is only part of the solution.
Ronald J. Stouffer, a senior meteorologist at the fluid dynamics laboratory in Princeton, said that the key to progress was to move ahead in three realms at once: in the models, in the basic research into the processes that are mathematically represented in models and in the measurements of environmental change that will allow the testing of models.
"It's a triangle," Mr. Stouffer said. "Observations, modeling and theory. Any one can lead the other two for a while but can't lead much before you get stuck." That leads the climate scientists inevitably back from their simulated worlds to the real one.
The modelers have been lobbying for more money, not just for their work but also for ongoing measurements of change in the oceans, atmosphere, polar ice and forests. The value of this work was illustrated this spring, many say, when 50 years of ocean temperature measurements showed warming that matched the models' projections.
Other large mysteries still confront the researchers when they look earthward. Within clouds, for example, the chemistry and physics of the particles like soot and sea salt that form droplets are only slowly being revealed, scientists say.
A small change in the way droplets form could have a large impact on the climate, said Dr. Jenkins, in Britain. He said that Dr. Anthony Slingo, another scientist there, found a decade ago that in theory, a decrease or an increase in the size of water droplets of just 10 or 20 percent "could either halve or double the amount of climate change you'd get."
Eventually, laboratory work and observations should narrow that range, many climate experts say, but uncertainty will always remain.
"The best we can do," said Dr. Manabe, in Yokohama, "is to see how global climate and the environment are changing, keep comparing that with predictions, adjust the models and gradually increase our confidence. Only that will distinguish our predictions from those of fortunetellers."
Re:Inputs one of the problems (Score:1)
cross the sky...
Oh, sh*t!! My prediction!! XDDD
You can't simultaneously determine the position and momentum of an electron with infinite precision. And you can't determine with infinite precision the temperature 7.5 meters above the ground at 8.00000W, 40.00000N, tomorrow. Nature is so beautiful!
Un pepe sin firma (an unsigned nobody)
P.S.: None of these words is typed lightly; we have plenty of equations to throw at you, and more reasons!!
Re:Dirty Research (Score:1)
As a working scientist, I am almost flattered by the implication. Alas, most scientists are very ordinary in their abilities and knowledge in fields outside their expertise.
Just because one was a whiz at differential equations, doesn't make one a top-notch economist, philosopher, or a judge of human nature.
Think idiot-savant, not genius. It's at least closer to the truth.
Re:Global Warming (Score:1)
--
Hang up and drive!
Inside Info on Weather Research funding (Score:1)
Case 1: NCAR [National Center for Atmospheric Research] in Boulder, CO is the nation's foremost weather and climate research organization. They were one of the first customers for the CRAY-1, and if you have ever been to the Smithsonian's Air and Space Museum in Washington, D.C., you can see one of NCAR's old Crays there. So, a couple of years ago NCAR was looking to buy a new supercomputer. They got bids from Cray, Toshiba and NEC. NSF, being the "money," was then given responsibility to decide which machine they could buy.

Carefully examining each proposal, the folks at NSF and NCAR eventually decided that the most bang for the buck would come from NEC [Nippon Electric Company]. Now, you'd think we'd have had enough of this in the 1980s, given that the Japanese economy is STILL in recession, but sure enough the folks at Cray cried foul and made unfounded accusations that NEC was "dumping" their supercomputer below cost. Needless to say, they got Congress involved and convinced one congressman to introduce a bill that would have eliminated pay for any NSF employee who okayed the purchase. HELLO! Try to save the American taxpayer a little money and they take away your salary?

If you didn't know it before, it's corporations that control your money, not Congress. Corporations that want you to prop up a clearly failing business just because it is an American company and "we need to protect American jobs." Look, I don't want my tax dollars wasted on corporate welfare, and neither should you! As an epilogue: since SGI bought Cray, and SGI wanted to do business with NEC, they convinced Cray to drop the dumping suit. Obviously the no-salary bill also did not pass in Congress, thank goodness.
Case 2: North Magnetic Pole research. As any good Canadian will tell you, the North Magnetic Pole is in Canada. So, if you wanted to do research into how the magnetic pole affects weather patterns and such, where would be the best place to build your station? According to Congress: Alaska. Given that any point in Alaska is so far from the pole that any research facility there would be completely useless, Congress would apparently rather waste money than actually spend it on useful research! If you want to study the North Magnetic Pole, you go to Canada, because that's where it is. It's a no-brainer. Yet Congress even fouled that up.
And you wonder why the U.S. is so far behind Europe in the weather biz. If we ran it through the military like most European nations do, we could probably get better results given the military budget compared to NSF's budget, but then we'd probably not get ANY weather reports, because all the research would be classified, like it is in Europe.
Devo andare ("gotta go"),
Jeffrey.
Re:Where have all the supercomputers gone? (Score:1)
Moreover, the Top 500 is based on a very unrealistic benchmark which basically estimates peak performance (which is not relevant in practice) rather than sustained performance.
I've heard of a forthcoming revision of the top 500 criteria.
--
Re:Which brand? SGI? NEC? Cray? (Score:1)
<http://www.nec.co.jp/english/today/newsrel/0005/3001.html> [nec.co.jp]
--
Re:Where have all the supercomputers gone? (Score:1)
http://www.top500.org/ORSC/2000/sx-5.html
Real-life figures are more like 50-60% (NEC), 10-15% (IBM) and 5% (Compaq)
--
Re:Annoying Slant (Score:1)
The last ice age went from a climate significantly warmer than ours today to glaciers everywhere in about a hundred years, a geological instant.
Did you ever wonder why Newfoundland was called Vinland by the Vikings? Anybody who proposed growing wine grapes there today would be mercilessly laughed at, but it was a natural thing a thousand years ago. The times we live in are abnormally cold. It only makes sense that things would be getting warmer.
Bitter moaning about the tax levy (Score:1)
First of all, it moaned about the 450% levy on Japanese supercomputers, but neglected to mention that a fair number of meteorological and climate supercomputers are Cray T3Es (including at the Hadley Centre mentioned in the article)!
Next it questioned whether reducing the grid size really improves accuracy, and then spoke of reducing the grid size as a good thing!
This article just looks like an attempt to lobby the US government. I can't imagine that 'W' is that keen on climate research!
Re:Annoying Slant (Score:1)
Re:If only you knew the complexity (Score:1)
IBM Supercomputer. (Score:1)
Wired magazine also has an article in this month's issue (so it won't be online until next month) about supercomputing used in gene research. IBM is building a 100 million dollar machine that will achieve 1 petaflop in order to attempt to simulate protein folding faster than ever before.
It's a pretty cool article.
I don't trust the weatherman (Score:1)
Do you agree with this? I hardly trust the weather forecast for tomorrow, much less next weekend. The two sites I usually check are The Weather Channel [weather.com] and Weather Underground [wunderground.org]. They often directly contradict each other, differing by up to 15 degrees F and 50% chance of rain.
Are there any studies tracking the accuracy of various forecasters? If the weatherman predicts rain for three days from now, should I cancel my picnic or just figure that an average July day is 65-85 degrees with a 30% chance of rain? Maybe the weather here in Michigan is more variable than other regions. Do you guys actually trust the predictions in your town?
AlpineR
Re:Global Warming (Score:1)
Global Warming (Score:1)
Re:Annoying Slant (Score:1)
Re:The research is only (Score:1)
My dear AC. You are talking gibberish.
Re:Annoying Slant (Score:1)
Re:Scary graph (Vostok ice cores) (Score:1)
For instance, according to the graph, we have the highest CO2 levels ever in the past 500k years. Why isn't the temperature also the highest?
The CO2 level has been going up and down, but it went from its lowest point to a high point in the pre-industrial age!
There is no "Normal" temperature of the Earth. Sometimes it's hot, sometimes it's cold.
So, now that we can affect our climate, what temperature do we want? Growing grapes in Greenland? Have our ice delivered to us in Chicago by iceberg? Changing society to remove the CO2 out of the air is just as reckless as pouring CO2 into it if we don't have a goal in mind.
Re:Weather and Chaos Theory (Score:1)
This is what Gleick was referring to with his famous and often misunderstood reference to the butterfly in Tokyo causing a hurricane in New York City two months from now. He was saying, rightfully, that even the flap of a butterfly's wings is enough to throw off initial conditions enough to make the weather totally unpredictable at some future point.
The bottom line is that when you don't know the exact initial conditions (unknowable ever, it seems) and you don't know the exact nonlinear equations which make up the model (which do seem to be improving) you are really just making really high-powered guesses about what is coming next. And my grandma with her bum knee can still do a better job of predicting rain than the weather man on tv.
Crash
Funny, MPI works for these guys! (Score:1)
http://www.foresightwx.com [foresightwx.com]
They list the type of IBM cluster they're using on that main page... So far they've been doing more with less, and are more accurate than other weather forecasting agencies to boot.
If you're interested in the latest techniques in weather forecasting for the short and long term, check these guys out. They're based out of Boulder, CO, USA.
Interested in weather forecasting?
Amdahl's Law (Score:1)
On a more theoretical note, Amdahl's law states that if a code has any serial portion, then the speedup limit is a function of the code and not of the number of processors. For instance, if 0.1% (measured in single-processor execution time) of the code does not parallelize, then regardless of how many CPUs you throw at the problem, the maximum speedup compared to a single CPU is 1000. Therefore a fast VPP is the way to get more performance.
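The law itself is one line of arithmetic, and it's easy to watch the diminishing returns numerically (a sketch of the standard formula, nothing more):

```python
def amdahl_speedup(serial_fraction, n_processors):
    """Amdahl's law: speedup over one processor when a fixed fraction
    of the single-processor execution time cannot be parallelized."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_processors)

# With 0.1% serial code, extra CPUs give sharply diminishing returns;
# as n_processors grows without bound, the speedup approaches
# 1 / 0.001 = 1000 and can never exceed it.
for n in (100, 1000, 100000):
    print(n, amdahl_speedup(0.001, n))
```

Note the flip side: squeezing the serial fraction down (better algorithms, better parallel code) raises the ceiling far more effectively than adding processors does.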
Then of course, there's the human resource issue. It's a real challenge to find skilled software engineers and then to convince them to accept low-paying government salaries.
Best Supercomputers are Too Busy for Weather (Score:1)
So what is all that time allocated to the DoD spent on? And why are the nation's largest supercomputers not used for general scientific research, such as weather, genetic research, high-energy physics? Because the most important thing for America to be spending its supercomputing cycles on is the modeling of nuclear weapons. Since we're not allowed to blow up small tropical islands anymore because Japanese fishermen don't like being randomly irradiated (you'd think they'd be used to it by now, after all didn't Hiroshima and Nagasaki make the point that they'd forever be our test subjects?), we spend as much of our cycles as possible modeling those explosions in massive chunks of supercomputing time.
Maybe someday we'll be able to save the Known Universe from invasion by evil aliens because of our advanced understanding of the explosion of nuclear weapons. We'll be happy we didn't waste our valuable supercomputing cycles on something so silly as modeling climatological systems, processing astronomical data, working on cosmological models of the Universe, or dissecting the world of quantum mechanics. After all, having big guns is what makes America great!
Where have all the supercomputers gone? (Score:1)
Just because there is a website devoted to something doesn't make it canonical.
This is actually really interesting... (Score:1)
--
Re:Dirty Research (Score:1)
Bollocks. CO2 is the greenhouse gas. Its effect isn't as strong as some others, but it makes up for it by being present (and being produced) in vastly bigger quantities.
Re:But do you know of any (Score:1)
Re:But do you know of any (Score:1)
Re:Distributed modelling (Score:1)
Re:Distributed modelling (Score:1)
Re:Inputs one of the problems (Score:1)
Very roughly, that means that you lose one bit of precision per iteration. Therefore, to produce accurate long-term weather forecasts you would need observations made with hundreds of accurate digits after the decimal point, which is quite absurd.
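The one-bit-per-step intuition is easy to demonstrate with a toy chaotic system (the logistic map here is a stand-in of my choosing, not an actual weather model): two runs whose starting points differ by one part in 2^50 track each other for a while, then disagree completely.

```python
def logistic(x):
    """One step of the chaotic logistic map at r = 4."""
    return 4.0 * x * (1.0 - x)

# Two "forecasts" whose initial conditions differ by one part in 2**50.
# The gap roughly doubles each step (one bit of agreement lost per
# iteration), so the runs decorrelate within a few dozen steps.
a, b = 0.3, 0.3 + 2.0 ** -50
for step in range(1, 101):
    a, b = logistic(a), logistic(b)
    if abs(a - b) > 0.1:
        print(f"runs disagree after {step} steps")
        break
```

Measuring the initial condition a thousand times more precisely only buys about ten more doubling steps, which is the absurdity the post is pointing at.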
Re:Weather research 101 for George W (Score:1)
Did you know CO2 has been building up in the atmosphere for the past 150,000 years? And we're still far below the level of atmospheric CO2 that was present in past significant warm events. From ice core studies, the evidence points to warming occurring BEFORE CO2 rises, not AFTER. The planet is warming; big deal. It was warmer in 1100 AD than it is today. That corresponds, by the way, to an explosion of agriculture worldwide, including in the Southwest US. Global warming is a natural process. Get over it. We'd have to cut CO2 emissions to 18th-century levels to even DENT the computer models. And for what? Questionable conclusions that humans are warming the planet? No thanks.
Derek
Re:Annoying Slant (Score:1)
Here's just one of many who are raising real questions.
http://www.osu.edu/researchnews/archive/nowarm.
Derek
Better link! (Score:1)
http://archives.nytimes.com/2001/07/03/science/03
Re:Global Warming (Score:1)
Hell, I'm in England and we've had nearly a fortnight (two weeks) of sunshine. I can only conclude that, yes, it is the beginning of the end.
Re:Supercomputer Envy (Score:1)
Oxford is in the U.K. Our last president went there to study. Stanford is in California, which may or may not be a foreign country.
Check your nationalism at the door when you talk about science. How many of the "American professors" other countries are "begging" for are of non-native extraction? American postwar science was built on the knowledge of refugee scientists. Wernher von Braun? Enrico Fermi? Edward Teller? von Neumann?
Re:Weather and Chaos Theory (Score:1)
Re:Weather research 101 for George W (Score:1)
I suddenly find myself wondering what kind of mileage the Pope-mobile(tm) gets...
Too-Big Matrices (Score:1)
I wrote a research paper in college about global warming. One of the major problems is that all the models were using grid cells roughly 200 miles on a side. So it is either raining or not raining across that entire cell.
No matter what else you do, you have an inaccurate model at that point. Rain simply does not fall in 200-mile squares. It happens in small, localized patches much of the time. Thus, it is really not known how rain would be affected by, or would affect, global warming.
Re:Supercomputer Envy (Score:1)
The IBM SPs seem like the weird hybrid, with multiple CPUs per "node".
What made Cray money was their compilers. The auto-vectorizing capability made serial code run fast with no work. There is no "auto-Beowulfing" compiler yet. For work in that area, look for Co-Array Fortran or UPC ("Unified Parallel C").
But compilers are black boxes, and I inherently don't trust what I don't understand, so if you want to get work done, the code should be handwritten using MPI and OpenMP. Of course, your mileage may vary.
And the question for the day: which way is the net transport of CO2 at the atmosphere/ocean interface? Is it into the ocean, or out of it? Last I knew, we didn't even know that!
Re:Greenland (Score:1)
No it wasn't. This was an early example of false advertising, aimed at persuading more Vikings to settle there.
Weather research 101 for George W (Score:1)
For those who don't know (Score:1)
Login: cyberphunk
Password: cyberphunk
Re:If only you knew the complexity (Score:1)
I think computational science suffered a great deal in the 1990s because all the smart computer scientists and programmers hitched onto the dot-com bandwagon, opting for all that showy stuff (which was bound to flop) instead of digging deeper into good solid research, such as reducing the complexity of parallel programming.
charmer
Re:On the lack of processing power... (Score:1)
The problem with weather/climate modeling is that, for each time step, a huge amount of data is needed to compute everything, and local changes end up not being as local as you might think. An action in one part of the atmosphere can affect the atmosphere elsewhere. In other words, the atmosphere is an internally coupled system. You can't pull it apart spatially, at least not in any reliable fashion. So, if you wanted to distribute the computing, you would have to give each client a buttload of previous data for its calculations, on the order of tens to hundreds of megabytes. Then the computing power required to divide up this data, retrieve it, and place it in its proper position would be comparable to that of the original task.
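A back-of-the-envelope sketch of where the tens of megabytes come from (the grid dimensions and variable count below are my own illustrative assumptions, not numbers from the post):

```python
# Rough, assumed numbers: a global grid at roughly 2-degree horizontal
# resolution, a few dozen vertical levels, and a handful of prognostic
# variables per cell (winds, temperature, humidity, pressure, ...),
# each stored as an 8-byte float.
lat_cells, lon_cells = 96, 192   # assumption: ~2-degree global grid
levels = 30                      # assumption: vertical levels
variables = 6                    # assumption: prognostic fields per cell
bytes_per_value = 8

state_bytes = lat_cells * lon_cells * levels * variables * bytes_per_value
print(f"{state_bytes / 1e6:.0f} MB per full model state")
```

And because every cell's next value depends on its neighbors, a client can't take just its own patch; it needs halo data every step, so the communication cost scales with the state size rather than shrinking with the work split.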
Essentially, the problem is too interconnected and complex for distributed computing, at this point, to be useful. It's not easily pulled apart into simple, discrete computations. These models are some of the most complex algorithms out there currently.
-Jellisky
Re:Distributed modelling: Ahem (Score:1)
Re:Funny, MPI works for these guys! (Score:1)
Re:Dirty Research (Score:1)
Bollocks.
Cheers!
Chaos (Score:1)
Same response... (Score:1)
Why not make this a distributed computing task, similar to Seti@home? Just a thought...
Re:Supercomputer Envy (Score:1)
Don't be so sure that that's for environmental work. Think undersea warfare..
Re:The research is only (Score:1)
Good models? (Score:2)
Do we? Those models aren't based on the greenhouse effect of CO2 (which is easy to predict, and which apparently can't raise world temperatures by more than a fraction of a degree). They're based on an assumed positive feedback from the greenhouse effect of water vapor: the idea being that that fraction of a degree of warming from CO2 will increase evaporation from the oceans, which will put more water vapor in the air to cause a *real* greenhouse effect.
Needless to say, this is a hard theory to quantify; hence the need for all the supercomputers.
but not the other way round,
And this is just wrong. Ever opened up a hot can of soda?
There are currently [colorado.edu] 720 billion tons of carbon in the air. Sound like a lot? It's nothing compared to the 39 trillion tons of carbon in the oceans. And when you heat up the oceans, what happens? Same thing as in your soda: the solubility of carbon dioxide changes, and CO2 is released into the air. If the temperature goes up, the ocean releases CO2. If the temperature goes down, the ocean absorbs it.
So which is happening? Do temperature changes cause CO2 level changes, or vice versa? Hell if I know. I used to believe the latter; now I'm leaning toward the former. The most convincing piece of evidence I've seen is a paper (published in Science, and presumably peer-reviewed, although it's been quoted by quite a few more biased sources since then). Atmospheric CO2 Concentrations over the last Glacial Termination [unibe.ch] has another graph showing those scary CO2/temperature correlations over about 12,000 years... but with some less scary conclusions. It seems in their ice cores, CO2 changes lagged temperature changes by 800+/-600 years.
Re:Weather research 101 for George W (Score:2)
Re:Weather and Chaos Theory (Score:2)
I suspect they know a hell of a lot more about nonlinear dynamics (including "chaos theory") than you do. For instance, turbulent flows are generally chaotic, making it impossible to predict the path of a given particle for more than a brief amount of time. On the other hand, the resultant mixing can be easy to study statistically on a larger scale, and it is entirely possible that the longer the model runs, the more accurate the aggregate results become. Stated differently, it is entirely possible (even likely) that a system that is divergent when modeled on a small scale is convergent when modeled on a large scale.
Climatological modelers were among the first to realize the implications of Lorenz's results. They, and most other scientists in fields involving nonlinear dynamics, have spent the last three decades digesting and incorporating both the results spawned by his discovery, and earlier "forgotten" results like those of Poincaré. That's not to say all climatologists have sufficient understanding of chaos, but the community as a whole has long been aware of it.
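A concrete toy version of that point, using the Lorenz system (classic parameters sigma=10, rho=28, beta=8/3; crude Euler stepping, so this is a sketch rather than a careful integration): two runs started a hair apart diverge pointwise, yet their long-run statistics nearly coincide.

```python
import numpy as np

def lorenz_x(x0, steps=50000, dt=0.005, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Integrate the Lorenz system with forward Euler; return the x series."""
    xs = np.empty(steps)
    x, y, z = x0
    for i in range(steps):
        dx = sigma * (y - x)
        dy = x * (rho - z) - y
        dz = x * y - beta * z
        x, y, z = x + dx * dt, y + dy * dt, z + dz * dt
        xs[i] = x
    return xs

a = lorenz_x((1.0, 1.0, 1.0))
b = lorenz_x((1.0 + 1e-8, 1.0, 1.0))   # perturb by one part in 10^8
# Pointwise ("weather") prediction fails: the runs end up far apart...
print(np.max(np.abs(a[-1000:] - b[-1000:])) > 5.0)
# ...but aggregate ("climate") statistics barely move.
print(abs(np.mean(np.abs(a)) - np.mean(np.abs(b))) < 1.0)
```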
As to the politics of the situation: just follow the money -- as usual.
Re:Weather and Chaos Theory (Score:2)
So yes, the effort is multidisciplinary and - at least in Switzerland - climatologists are well aware of the implications (many of them are physicists).
See <http://www.climate.unibe.ch> [unibe.ch] (I hope that NCCR/Climate gets a real website up and running soon) and <http://www.cscs.ch> [www.cscs.ch]
--
"Greenhouse Theory" lacks evidence (Score:2)
The above statements are false. Speaking as someone who works for a weather company and hears a lot of theories: it is still very much in question whether manmade greenhouse gases are causing global warming.
Our current evidence suggests that increased surface temperatures are more likely caused by increased development (ie, asphalt) near ground measurement stations. Also, cyclical sunspot activity results in a curve of zipper-pattern fluctuations in the radiated energy that reaches the earth over an ~11-year period. The above two observations *do* have substantial physical evidence to back them up, and are a better explanation for recent global warming than the greenhouse theory.
Huh? (Score:2)
That won't stop a
--
not all forecasts are equal (Score:2)
What happens is you find the vast bulk of the model output is essentially the same. The variability is in the exact location of fronts, exactly the type of stuff that has always been difficult to predict.
Given a long enough time frame, everyone will fall under this uncertainty. So you still can't make long-term forecasts, but you *can* give decent 7-10 day forecasts if you have the flexibility to occasionally say that it's impossible to forecast the weather on some of those days. In the vast majority of cases that's good enough - it allows people to avoid scheduling activities when the weather is likely to be nasty.
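That is ensemble forecasting in a nutshell. A toy sketch of the idea (the "model" here is a made-up logistic map, and every number below is illustrative): perturb the initial state, run many members, and only issue a forecast for days where the members still agree.

```python
import numpy as np

rng = np.random.default_rng(42)

def toy_model(x0, days=10):
    """A made-up chaotic daily 'temperature' model (illustration only)."""
    temps = []
    x = x0
    for _ in range(days):
        x = 3.9 * x * (1 - x)        # logistic map standing in for the dynamics
        temps.append(15 + 10 * x)    # map the state onto a temperature in C
    return np.array(temps)

# 50 ensemble members, each from a slightly perturbed initial state.
members = np.array([toy_model(0.3 + rng.normal(0, 0.01)) for _ in range(50)])
spread = members.std(axis=0)
for day, (mean, sd) in enumerate(zip(members.mean(axis=0), spread), 1):
    verdict = f"{mean:.1f} C" if sd < 2.0 else "too uncertain to call"
    print(f"day {day}: {verdict}")
```

The spread grows with lead time, and that growth is exactly the signal a forecaster uses to decide which days are callable and which to decline.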
Re:Annoying Slant (Score:2)
You must be a very well-informed and well-regarded climatologist to be so absolutely sure of your facts like this. I look forward to reading your articles and resume.
Thanks.
I can top that. (Score:2)
Re:Weather research 101 for George W (Score:2)
Idiot.
Re:Annoying Slant (Score:2)
Care to present any of this evidence? I haven't heard of any natural processes which would significantly raise the amounts of greenhouse gases in the atmosphere over the last 50 years.
BTW, I thought the debate was mainly about how the climate will change and how screwed we will be when it does. What caused it may make interesting research, but we aren't going to be able to stop it, whether it's anthropogenic or not.
Re:Annoying Slant (Score:2)
And the National Academy of Sciences (US) report on global warming, drafted by the senior US scientists in the field, is wrong. As is every other major collaborative meta-analysis of existing evidence.
Re:Annoying Slant (Score:2)
The UN didn't FUND the scientific research, only the report.
There actually is very little debate even on this topic anymore, either in the scientific community or among the VAST majority of world leaders (Dubya being an exception). The northern hemisphere is warming. Human generated gases are a principal cause. Even with climatic fluctuations, the changes in the last 50 years ring clear as a bell. The situation is similar to that of about 10 years ago, when everyone except the tobacco producers was claiming that nicotine was addictive. Well, guess what - it is the most addictive substance on the planet.
No one stands to make a tenth the money from global warming as the energy companies stand to make from selling energy. The bottom line is that there is really no way to get out of it without using clean energy or less energy. Clean energy costs more, and less energy stunts growth. Either way there are economic consequences.
But you have to think about the consequences of failing to act quickly enough. New Orleans is below sea level. So is Amsterdam. Virtually all the US east coast cities south of DC will be under water - places like Virginia Beach, Charleston, Savannah, Miami. And that pales in comparison to what will occur in low-lying areas in Asia. (Note: so far Antarctic ice is unaffected. If it continues to be unaffected there will be no oceanic rise of note)
Besides that - entire ecosystems in the northern hemisphere are shifting climate so rapidly that native species do not have time to migrate further north. The amount of warming in the last 50 years can be translated to a shift in latitude. And in some areas, notably Alaska and Siberia, these shifts are killing native species.
I think if you do your research a little and read the works of the scientists who study climatic change, you would have very little doubt indeed about whether there is northern hemispheric warming or whether human generated gases are at least substantially to blame.
Re:"Greenhouse Theory" lacks evidence (Score:2)
There are many different ways to come to conclusions regarding things like global warming. That is why Bush asked the National Academy of Sciences to create a report. In this, the NAS gives academic independence to its members who work in the field of climate change. The committee was made up of 11 of the nation's top climate scientists, including seven members of the National Academy of Sciences, one of whom is a Nobel Prize winner. You can note that they do not support the stance of Bush that the evidence needs to be further evaluated before taking substantial action, which indicates that the source of funding for the report is not biasing the results.
http://www4.nationalacademies.org/onpi/webextra
They note that greenhouse gases are increasing. CO2 is mostly to blame. It is mostly human generated. One of the most compelling pieces of evidence is the cooling of the stratosphere. Urban warming lacks adequate explanatory power. But don't believe me - read the report.
Note that this is one way of forming an argument - relying on the consensus opinions of experts in the field. You could similarly rely on the opinion of a rogue in the field that others do not agree with. Sometimes the loner is right, most often the consensus is right.
Any grassy knoll believers ?
Re:This is scary (Score:2)
Does the Heisenberg uncertainty principle apply at a macroscopic level?
If we observe the weather, can we do so without modifying it ?
fwiw, i've always wanted to somehow work that into my defense whilst protesting a speeding ticket..
"boy, i clocked you doing 77mph back there"
"do you know which way i was going ?"
"yes"
"then you couldn't possibly have known how fast I was going"
Re:Inputs one of the problems (Score:2)
Re:semantic but important difference (Score:2)
Re:Annoying Slant (Score:2)
The current climate models are not even predictive - they can not even recreate past climate events, let alone predict future ones.
As to the report put in front of G. Bush, it's an even bigger fraud. The scientists who signed the report DID NOT WRITE IT. It was written by functionaries. Several of the signees were global warming sceptics, who do not support anthropogenic climate change hypotheses.
Derek
Leave it to NEC and Cray... (Score:2)
Which brand? SGI? NEC? Cray? (Score:2)
http://archives.nytimes.com/2001/07/03/science/
I can see that they're standing beside many racks of SGI Origin 3000 gear (whether it's a single machine or many smaller machines depends on the cabled configuration -- O3K uses a mesh of cables rather than a backplane for its third-generation ccNUMA architecture).
At any rate, I'm curious as to which company they'll be buying their new system from. The only clue I can gather is their mention of 5120 CPUs to churn out 40 TFLOPS... that's 7.8 GFLOPS per CPU, which pretty much rules out SGI or even the latest Pentium/Xeon, Itanium, UltraSPARC III, or Alpha (unless one uses a very Apple-esque SIMD benchmark, but I would imagine they want and need something more flexible for true performance with SISD, SIMD, and MIMD combined, rather than just bragging rights for a high single benchmark and top spot on Top500). My guess is they're going with either the not-yet-released Cray SV2 (which will combine the parallel strengths of the T3E with the vector strengths of the SV1ex) or NEC's successor to the SX-5 (SX-5 is marketed by Cray in the USA, but under its creator, NEC, in other countries).
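For what it's worth, the quoted per-CPU figure checks out arithmetically:

```python
# Back-of-envelope check of the per-CPU number quoted above.
total_flops = 40e12   # the 40 TFLOPS target
cpus = 5120
per_cpu = total_flops / cpus
print(f"{per_cpu / 1e9:.2f} GFLOPS per CPU")  # 7.81 GFLOPS per CPU
```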
Anyone have additional insight?
Re:What a crock. (Score:2)
Three words, buddy: satellite remote sensing. You're right about one thing - we didn't have that technology back in the '50s. But starting with the TIROS satellites, that all changed. Currently, there exists a plethora of satellite sensing systems which not only measure temperature, but moisture content, cloud cover, and a whole slew of other relevant parameters, all with reasonable accuracy.
One other thing that bugs the crap out of me about "global warming". NOBODY EVER TALKS ABOUT CONCRETE, STEEL, AND ASPHALT!!! Hasn't ANYBODY ever noticed how hot a street or roof gets in the sun?!? I expect a few temperature measurements in growing cities would be more than enough to throw off their temperature measurements.
Try this on for size:
http://directory.google.com/Top/Science/Earth_Sci
Just because you're not listening doesn't mean we're not talking about it.
Greenland was green when it got its name, folks.
Actually, Leif Eriksson named the island "Greenland" because he thought people would want to move there if the island had an attractive name. Read "Greenlander's Saga" for more information.
Considering that plankton, not rain-forests as the greenies would like you to think, fix something like 70-80% of the CO2 in the atmosphere, it would appear that the earth is more than capable of absorbing whatever increase in CO2 we're providing.
Care to post a reference? IANABiologist, but I would assume that water-borne plankton would absorb CO2 from the ocean - so the ocean (not the plankton) would have to provide the additional uptake of CO2 for the plankton to make a difference. But I could be wrong - I'm not really good with fish.
There are two things to consider when talking about the climate: one, that the climate has definitely warmed in recent years. Two, we have yet to figure out why. Certainly climate models show the impact of CO2, but they're designed to do so. Other climate models use different, more natural phenomena to produce the same warming. Essentially, if you want to say that "X causes global warming" it's possible to create a model that will indeed show that X causes global warming. Climate modelling is just too complicated to come up with a definitive answer. But, given the possible outcome of anthropogenic climate change, do we simply ignore the problem as a Chicken Little scenario, as you tritely suggest?
Buy my book! I just lost a fortune in tech stocks and I need money!
Hehehe...did you invest all your money in DrKoop.com?
IBM's Deep Thunder (Score:2)
Re:Andrew Sullivan (Score:2)
I think the moderators have demonstrated quite effectively the reality that modern weather/climate science is almost completely driven by the environmental extremists whose chief research tactic is to shout -- no scream -- down anybody who discovers something that doesn't fit into their neat little global warming agenda.
I'll start to believe the possibility of significant human contributions to global warming and the so-called greenhouse effect when I see intelligent discussions of viewpoints that disagree with the current wacko political agenda.
The first question I'd like to see addressed is how exactly the earth emerged from the last ice age without the assistance of the internal combustion engine. I was going to post anonymously, but then I got a grip and realized that all I'm going to suffer is the loss of a couple karma points on /. Like I could care.
The best diplomat I know is a fully activated phaser bank.
Re:You realize of course (Score:2)
The radar pulse interacts with the molecules of your car, imparting a small degree of energy coming in from a certain vector (like a gentle breeze) which does change the direction of the car but to such a small degree that it is totally undetectable, and only of mathematical curiosity.
-Nano.
Re:Annoying Slant (Score:2)
IMHO, there is already a scientific consensus on both of the things you mention. An overwhelming majority of the climate scientists is convinced that we are responsible for the climate change. I believe that those few who disagree are industry lapdogs.
What we don't have is a political consensus on reducing the emissions. Scientists are not listened to, especially when the right thing to do would be expensive and unpleasant. (I think GW Bush has not nominated a science advisor yet. If he has, please inform me.)
Re:Global Warming - Mod me up please (Score:2)
This has been due to high pressure systems from up North coming down South.
The other interesting thing is that insurance companies are now very wary of insuring many areas susceptible to tropical storms. Florida actually had to force insurance companies to insure people who insist on building in dangerous areas. In Australia, there are now areas where insurance rates have gone up hundreds of percent, and some areas just cannot get insurance.
In fact, Hurricane Andrew drained several billion dollars from Australian re-insurers and sent them broke. Not again!
But do you know of any (Score:2)
Re:But do you know of any (Score:2)
Re:Annoying Slant (Score:2)
It's warming right here in the USA. I don't need a scientist to tell me this; when I step outside the front door it feels hotter over the last 10 years.
10 years is not a geological timescale. On a true geological timescale, a century looks like an instant.
It's likely that 100 million years from now, the effects of the human race plotted against time are going to look indistinguishable from the effects of the comet that wiped out the dinosaurs. Given human nature, I doubt that there's even anything that can be done to avoid that.
At the very least, we all get to witness one of the biggest events in the earth's history, like being a passenger in a huge slow-motion train wreck.
Re:If only you knew the complexity (Score:3)
While zavyman points out the basic problems inherent in parallelizing any discretized numerical model, the problem in obtaining good performance on hybrid architectures like the IBM SP-2s and SGI Origins which currently top the Top 500 list goes much deeper.
First, these machines are built around a hybrid architecture. Each node has a few processors (typically between 4 and 16, depending on the model), which utilize shared memory. These nodes connect to one another via an internode interconnect, with relatively modest bandwidth.
While this hybrid architecture allows supercomputer manufacturers like IBM and SGI to scale into the thousands of processors, it also introduces substantial complexity into the building of high-performance codes. Ideally, one would like to run threads-based parallelization on each node, and MPI between nodes, though the reality is that most codes in use rely on MPI alone.
One can get decent scalability (into the hundreds of processors) when one runs physical models with limited communication -- ie, which simulate hyperbolic PDEs like those of hydrodynamics (as zavyman describes above). However, things become more interesting when one considers more varied physics, such as that involved in solving elliptic PDEs (such as Poisson's equation for self-gravity or electrostatics). Because elliptic equations connect everything with everything else on the spatial domain, the communication costs ARE MUCH HIGHER. It is extremely challenging to build a multiphysics code with such varied parallelization demands. Indeed, it is a fair statement that no one has yet achieved excellent performance on anything close to the thousands of processors available on these hybrid machines. For instance, another poster describes a climate model available from another research group. However, if you dig deeper, you find that they state,
"ForesightWX uses an IBM 12-node system with 52 processors working 24 hours a day. The cluster fits snuggly in a small room. A decade ago the same power would have filled the building."
52 processors is a far cry from the thousands of processors available to the users of these machines. Since each processor is slower than a vector processor like the Cray (by about a factor of 3 - 5), and assuming ideal speedup, such modest levels of parallelization lead to speedups of about 10-15 relative to a single Cray T90 processor. It is quite evident that there is little net gain over running the same simulation on 8-16 T90 nodes.
Moreover, due to the hardware constraints described above, IT MAY VERY WELL BE THE CASE WE NEVER SEE EXCELLENT MULTIPHYSICS PERFORMANCE ON THEM.
(One can get better parallel performance by increasing the problem size, but as the article points out, doubling the resolution of a simulation increases the cost by a factor of 16; hence, simply increasing the problem size may lead to unacceptably long computation times.)
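The factor of 16 the article cites follows directly from the geometry plus the CFL timestep restriction:

```python
# Doubling resolution doubles the point count in each of three spatial
# dimensions (2**3 = 8x), and the CFL condition halves the stable timestep,
# so twice as many steps are needed to cover the same simulated time (2x).
spatial_factor = 2 ** 3
time_factor = 2
print(spatial_factor * time_factor)  # 16
```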
I think massively parallel architectures will ultimately be the wave of the future, but there is little getting around the fact that the current generation of IBM SP-2s are dogs in the performance category.
Bob
Re:Distributed modelling (Score:3)
Inputs one of the problems (Score:3)
So, there are HUGE holes in the data. Makes it hard to make a model
Scary graph (Vostok ice cores) (Score:3)
This graph [economist.com] is one of the scariest things I have seen in a long time. It's a plot of the temperature variations and CO2 levels over the last 500,000 years measured from ice cores drilled at Vostok Station in the Antarctic. The two series track each other incredibly closely.
As we now have good models for why CO2 should cause temperature change, but not the other way round, it is something to take very seriously.
The figure was taken from The Economist [economist.com] magazine, a paper not usually associated with extreme anti-business views. Two recent articles gave good summaries of our present state of knowledge about global warming, and how both the data and the models have improved over the last ten years:
Article on the report for the Bush administration from the National Academy of Sciences.
Background piece before the last 'Kyoto' meeting in the Hague
One worrying new possibility is that there may be an abrupt change (bifurcation) in the ecosystem response as the temperature rises. At the moment about 50% of the manmade CO2 emissions are being absorbed by the Amazon rain forest. But the latest Hadley Centre models predict that if the temperature continues to rise, this greatly increases the frequency of much drier weather in this region, causing the forest to dry out, ultimately leading to uncontrollable forest fires. This would release vast additional amounts of CO2 into the atmosphere if the whole lot went up -- perhaps ten times as much as human activities produce.
(And that is not the ultimate nightmare positive-feedback scenario, which is the enormous amounts of methane hydrate locked up at the bottom of the ocean and in the arctic permafrost. The only thing that keeps it stable is the high pressure and low temperature. There is thought to have been a runaway destabilisation 55 million years ago, which raised the temperature 15 degrees C in less than 20 years).
I suppose somebody might come up with a techno-fix solution. But the complacency of gambling on that is like playing Russian roulette with five of the six chambers loaded.
Re:Annoying Slant (Score:3)
I doubt that the situation has changed much since then. One thing that I'm sure hasn't changed is that there is no shortage of really solid data to support both sides: that the temperature really has risen, and that it really hasn't. There are thousands of temperature time series, some direct and some inferred; some are climbing, some are falling, and most aren't changing significantly after controlling for all the relevant sources of variance.
Globally it is likely that the 1990s was the warmest decade and 1998 the warmest year recorded (since 1861). Certainly this seems to be the case in the northern hemisphere not simply since 1861 but for the last ten centuries.
Yep, I hope so. We are still coming out of a little ice age, returning to the higher temperatures which were the norm when the Vikings grew grapes in Newfoundland. The scary thought is that we might find out, in 100 years, that the temperatures are really going down.
You point out that the EPA and UN-funded scientists have found evidence of global warming. Notice where their funding comes from. If Exxon was paying the bill, these same guys would no doubt have found the opposite. Government and industry researchers don't get tenure.
There are literally thousands [sepp.org]of responsible scientists who work in these fields who believe that any sort of costly action to "avert global warming" is a bad, irresponsible idea. Some of them are Exxon employees, but certainly not all. Here [junkscience.com] and here (loosely related) [mit.edu] are a couple of random links which might help make the point that it isn't a settled issue in the minds of people who understand it and aren't funded by the Government or Greenpeace (HINT: both these groups find it easier to get money from the public if they can claim that the sky is falling.)
In short, ad hominem arguments are less productive than usual here, since we see the usual suspects on each side of the issue. The energy companies are pushing their issue, Greenpeace is pushing theirs, and so on.
We need to consider the consequences of being wrong. Seeing the global temperature rise by 1 to 2 degrees C is probably going to make the world a better place to live in the long run. That's the maximum likelihood prediction from most of the models that folks on either side take seriously. The doomsday 5+ degree C scenarios have very low probabilities under most models.
Consider the cost of "taking action": Millions of people around the world, most of them already desperately poor, will die earlier and more miserably if we do anything to limit energy use. The only thing I can think of to reduce greenhouse gasses without causing disaster is replacing coal with nuclear power. That isn't going to happen anytime soon, unfortunately, because of the same agenda that is driving the "its getting hotter" side of the issue.
Supercomputer Envy (Score:3)
I hate it when the press makes it sound like America is the jack-ass backwards donkey of the supercomputing world. This writer implies the Japanese and Europeans have vastly superior computing power. This is clearly the notion of a chucklehead. Take a look at The Top 500 [top500.org]. By its (Linpack) metric, 8 of the top 10 machines are in America. Three of them are DEDICATED to weather or environmental work (Naval Oceanographic Office, National Centers for Environmental Prediction). A fourth one at NERSC is relatively open, compared to defense machines, and I'd be willing to bet weather code is running on it regularly. These are all teraflops machines. Japan has the other two in the top 10. Anybody know the job mix on those two? Europe's fastest machine is the Hitachi in Muenchen. The fastest dedicated European weather machines are the T3Es at the Deutscher Wetterdienst and at the UK Meteorological Office.
I don't buy these whiny weathermen's complaints. The difference is that the American machines are all massively parallel machines (mostly IBM SP). The Japanese manufacturers all make vector machines, some of which the Europeans use. The Cray T3E is kind of a weird in-between architecture. It takes a good programmer to use a MPP to its full capability. The vector users, on the other hand, have 30 years of old code and practice which keeps them in the game. If the Americans would suck it up and learn to use their amazingly fast IBMs we would hear whining from the other side of both ponds. If you try to run your old code for the Cray C90 on an IBM SP, you are going to get terrible performance. If you rewrite the code, you may get great performance. But these guys aren't rewriting the code. Take for example the machines at NCEP [noaa.gov]. These create the daily production weather models used all over the US. They are IBMs which replaced a Cray that self-immolated about 1.5 years ago. When they brought the new machines up, I wonder if they rewrote the code beyond making it run? If you know, enlighten us!
What a crock. (Score:3)
One other thing that bugs the crap out of me about "global warming". NOBODY EVER TALKS ABOUT CONCRETE, STEEL, AND ASPHALT!!! Hasn't ANYBODY ever noticed how hot a street or roof gets in the sun?!? I expect a few temperature measurements in growing cities would be more than enough to throw off their temperature measurements.
Then, there's the well-ignored fact that we're coming out of a mini ice-age, which peaked circa 1850. Greenland was green when it got its name, folks. The earth got colder since then and is warming back up, completely without our assistance.
And another thing... I saw just the other day that one of NASA's earth-monitoring satellites has recorded a 30% increase in the levels of plankton in the oceans over the last 10 years. That's not a prediction, folks, that's a direct measurement. Considering that plankton, not rain-forests as the greenies would like you to think, fix something like 70-80% of the CO2 in the atmosphere, it would appear that the earth is more than capable of absorbing whatever increase in CO2 we're providing.
Really, these global warming people sound about as ridiculous as the Y2K people. The sky is falling! The sky is falling! Buy my book! I just lost a fortune in tech stocks and I need money!
Distributed modelling (Score:3)
Annoying Slant (Score:4)
The article notes the objection of global warming skeptics as if there is scientific consensus that a) the build-up of so-called "greenhouse gases" causes the Greenhouse Effect (probably true) and b) that an increase in the concentration of greenhouse gases is anthropogenic (probably false):
Yes, the Earth is warming in some areas, e.g. Siberia. But, this is totally expected if you look on a geological timescale, vis-a-vis the Ice Age cycle. The debate is centered on whether man or natural processes (cycles of flora and fauna, volcanoes) are driving the current trend. I have not seen any convincing evidence to support the existence of anthropogenic phenomena, and plenty to support the existence of natural phenomena.
*** Proven iconoclast, aspiring epicurean ***
Re:Inputs one of the problems (Score:4)
Within the next, say, 100 years (prolly a lot sooner than that), we'll have the ability to release millions, even billions of nano-probes into the atmosphere and oceans (c.f. Stephenson's The Diamond Age). The air-borne probes can measure temperature, windspeed, and humidity. The water-borne probes can measure water temperature, currents, evaporation.
Now imagine all these probes sending their observations back (in real time, perhaps using each other as repeaters to carry the signal) to a centralized data storage and analysis facility.
Now imagine a massively parallel computer running simulations based on these observations... As another poster observed, there are bound to be limitations on any system that doesn't have perfect observations at infinitely fine granularity. Whatever those limitations are, I suspect we are not too far from finding out what they are.
</wag-about-the-future>
(for those who are wondering, "wag" is a technical term used in estimating -- it stands for Wild-@$$ Guess)
--
This is scary (Score:4)
Re:Supercomputer Envy (Score:5)
If only you knew the complexity (Score:5)
If the Americans would suck it up and learn to use their amazingly fast IBMs we would hear whining from the other side of both ponds.
Great, what the hell do you think we are doing over at Argonne National Labs [anl.gov]? I mean, have you tried to parallelize an atmospheric climate modeler? I don't think so. Coding for a vector-based machine is pretty straightforward. You concentrate on one machine as if it had one processor and one bank of memory, and code away, occasionally noting if your loops are not easily vectorized. The compiler magically does the rest, and your program runs really fast. That's why the Japanese machines are nice.
On the other hand, imagine designing an atmospheric climate modeler on a large cluster. The current paradigm being used and developed is MPI [anl.gov]. Let's see what you have to worry about. One, since the processors are not sharing memory, messages have to be passed between them to share data. No biggie. But now consider that the whole atmosphere has to be broken up into pieces of a grid. On the boundaries, grid points must be shared by two or more processors. At each timestep, those points must be synchronized. Code must exist to know which processors border one another and which points to share.
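A minimal sketch of that boundary-sharing bookkeeping, with plain Python lists of arrays standing in for MPI ranks and messages (illustration only; a real code would use MPI sends/receives and a 2-D or 3-D decomposition):

```python
import numpy as np

nx, nranks = 16, 4
chunk = nx // nranks
u = np.sin(np.linspace(0, 2 * np.pi, nx, endpoint=False))  # periodic field

# Each "rank" stores its interior chunk plus one ghost cell per neighbor.
local = [np.empty(chunk + 2) for _ in range(nranks)]
for r in range(nranks):
    local[r][1:-1] = u[r * chunk:(r + 1) * chunk]

def exchange_ghosts(local):
    """Stand-in for MPI_Sendrecv: copy edge values into neighbors' ghosts."""
    n = len(local)
    for r in range(n):
        local[r][0] = local[(r - 1) % n][-2]   # left ghost <- left neighbor's edge
        local[r][-1] = local[(r + 1) % n][1]   # right ghost <- right neighbor's edge

def step(local, c=0.4):
    exchange_ghosts(local)          # must complete before every update
    for r in range(len(local)):
        interior = local[r][1:-1]
        local[r][1:-1] = interior - c * (interior - local[r][:-2])  # upwind advection

for _ in range(10):
    step(local)
result = np.concatenate([l[1:-1] for l in local])
print(result.shape)  # (16,)
```

The key discipline is that the ghost exchange must finish before every update; that per-step synchronization is exactly where the communication cost comes from.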
Now what happens if you want to combine different models, as in the atmosphere, ocean, land, and sea-ice models? This is known as a climate coupler. Well, now you have differing grids for each of the models because they were developed independently. Now your program must handle interpolation of the grid points and must again know which processors border one another so that data is efficiently transferred. Finally, there must be decent load balancing so that each processor is doing its fair share of the work.
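A minimal sketch of the coupler's regridding step (the grids and field below are made up, and real couplers use conservative 2-D remapping rather than simple 1-D interpolation):

```python
import numpy as np

# The atmosphere and ocean models were built on different grids, so any
# field exchanged between them must be interpolated onto the receiver's grid.
atm_lat = np.linspace(-90, 90, 181)             # 1-degree atmosphere grid
ocn_lat = np.linspace(-90, 90, 61)              # 3-degree ocean grid
sst_forcing = 30 * np.cos(np.radians(atm_lat))  # toy surface field on atm grid

on_ocean_grid = np.interp(ocn_lat, atm_lat, sst_forcing)
print(on_ocean_grid.shape)  # (61,)
```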
I'm now working on the climate coupler project here at Argonne. Vector machines are quite easy to program, but we do know that they will lose out to massively parallelized clusters. It's just that the programming is much more difficult, since the messaging is always the bottleneck. Communications development follows a rate similar to Moore's Law, but with a longer doubling time. For maximum efficiency, the programmer must handle the messaging model directly.
The modeling group [anl.gov] here at Argonne understands the issue, and we are working on a general climate system to run quickly on parallelized machines. Now you know why you can't just do a simple code rewrite. You need to redesign the whole system.
Accelerated Climate Prediction Initiative. [anl.gov]
Re:Annoying Slant (Score:5)
There have been about a dozen articles published in Science in the last year in which model after model of climate has been tested. Time after time the models have converged on one and only one solution: increases in greenhouse gases are responsible, and the increases parallel those produced by man. The jury is in. The decision is done. The only issue left is whether mankind can do anything about it, and whether we can live with it.
Seriously, see the EPAs opinion http://www.epa.gov/globalwarming/climate/index.ht
Also see the scientists commissioned by the UN to look into the problem - they also concur you are wrong
http://www.ipcc.ch/
To paraphrase, as at http://www.uic.com.au/nip24.htm
* Over the 20th century the global average surface temperature has increased by about 0.6 degrees C, more than was estimated up to 1994. This appears to be the largest increase in any of the last ten centuries.
* Globally it is likely that the 1990s was the warmest decade and 1998 the warmest year recorded (since 1861). Certainly this seems to be the case in the northern hemisphere not simply since 1861 but for the last ten centuries.
* On average, between 1950 and 1993, night time daily minimum air temperatures over land increased by about 0.2 degrees C per decade, lengthening the freeze-free period in many mid to high latitudes.
* Since the 1950s the lower part of the atmosphere has warmed at about 0.1 degrees C per decade, as snow and ice cover have decreased in extent by about 10%, and Arctic sea ice thickness by more than this.
* However, some important aspects of climate appear not to have changed, including storm frequency and intensity and the extent of Antarctic sea ice.
semantic but important difference (Score:5)
as it applies to modeling the weather --- that is to say, modeling the planet
The article's a little misleading. It starts with a discussion of the weather, then moves on to discuss modelling of the climate. It's basically impossible to predict the weather -- meaning the exact temperature, rainfall, cloud cover, etc. -- more than a week in advance, because you have to specify the model with essentially infinite precision or chaotic effects take over. In fact weather prediction was one of the earliest manifestations of chaos theory. The climate -- meaning long term averages -- can exhibit stable behaviour that is possible to model in the long term. Don't look for this technology to dramatically improve weather prediction.
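One way to see why weather prediction hits a wall around a week: with exponential error growth, halving the initial observation error buys only a fixed, small extension of the forecast horizon. The growth rate below is an assumed illustrative value, not a measured atmospheric one:

```python
import math

# Toy predictability arithmetic: if forecast error grows like
# e(t) = e0 * exp(lam * t), the useful horizon is where e(t) hits a tolerance.
lam = 0.4                      # assumed growth rate per day (~2.5-day e-folding)

def horizon_days(e0, tolerance=1.0):
    """Days until an initial error e0 grows to the tolerance."""
    return math.log(tolerance / e0) / lam

print(round(horizon_days(0.01), 1))    # ~11.5 days at 1% initial error
print(round(horizon_days(0.005), 1))   # ~13.2: halving the error buys ~1.7 days
```

The logarithm is the whole story: each halving of the initial error adds only ln(2)/lam days, which is why better observations and bigger machines extend the weather horizon slowly, while long-term climate averages remain modelable.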