Supercomputing and Climate Research

Mr. Obvious writes: "It must have already been submitted, since the article is over a day old (gasp!), but there's a good round-up on the state of the art in supercomputing as it applies to modeling the weather --- that is to say, modeling the planet --- over at the NYTimes. They go into lots of interesting things concerning how hard it is, what progress has been made lately, why the US researchers feel themselves to be hamstrung in comparison to those in Europe or Japan, and even into some things you probably didn't know (I didn't, at least) about the weather."

  • by Anonymous Coward
    Many of the models already use an adaptive time step.

    Also, I doubt that an adaptive grid would be necessary. Most of the medium scale effects (smaller than the model grid, but larger than say, convective storms) are related to terrain, proximity to water, etc. They're static, so you could handle these with just a fixed variable density grid rather than an adaptive one. Don't ask me why the models don't use a variable density grid already. I wonder about that myself.
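    For what it's worth, a fixed variable-density grid is not hard to set up. Here's a rough NumPy sketch (my own toy 1-D example, not code from any real model) that permanently clusters grid points around a static feature such as a coastline:

    # Toy fixed variable-density grid: spacing shrinks smoothly toward a static
    # feature (say a coastline at x = 0.3), so the fine region never has to move
    # the way it would with an adaptive mesh.
    import numpy as np

    def variable_density_grid(n, center=0.3, floor=0.1):
        """n+1 points on [0, 1]; spacing ~ (floor + distance from center)."""
        mid = (np.arange(n) + 0.5) / n           # midpoints of n uniform intervals
        weight = floor + np.abs(mid - center)    # small near the feature, larger far away
        dx = weight / weight.sum()               # normalized spacings summing to 1
        return np.concatenate(([0.0], np.cumsum(dx)))

    grid = variable_density_grid(40)
    print("min spacing:", np.diff(grid).min(), "  max spacing:", np.diff(grid).max())

    Because the refinement is static, there is none of the regridding bookkeeping an adaptive mesh needs, which is exactly the appeal.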
  • by Anonymous Coward
    Computers like IBM's SP are good at running Linpack benchmarks, but they are less capable at running climate simulations.

    That's not true. Discretizing and solving PDEs using finite differencing on a regular grid boils down to little more than solving a linear algebra problem iteratively, and that's exactly the sort of thing LINPACK does (a bare-bones sketch of that kind of kernel is at the end of this comment). Besides, vector computer architectures are specifically designed for doing fast matrix-vector operations, which happen to be at the root of both LINPACK and weather and climate prediction. So if a distributed, message-passing architecture like those at the top of the list can run LINPACK faster than a vector architecture, then it can run weather and climate models faster too, assuming the models are coded to take advantage of it.

    It is partly a matter of programming, but it is also a limitation of the SP's high interconnect latency and low interconnect bandwidth.

    Yes, those are important issues, but the LINPACK results prove that they can be overcome. IMO, the problem is entirely a matter of programming. I have personally endured the pain of working with code written by the guys in Boulder, Princeton, and DC. And, unfortunately, it has been the sorriest code I've ever seen from professional programmers. These people can't even be bothered to learn basic programming practices, let alone modern parallel architectures.

    So, I certainly agree that it is much easier to program for a shared memory vector computer. But that's a poor excuse. The people in the weapons simulation business and the aerospace business have learned how to take advantage of other architectures. It's about time the climatologists learned how to do it too.
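    Here is the bare-bones sketch promised above (mine, purely illustrative, nothing like production model code): a Jacobi iteration for a 2-D Laplace problem on a regular grid. The inner kernel is exactly the kind of regular, dense arithmetic that both LINPACK-style benchmarks and vector units chew through.

    # Jacobi iteration for a 2-D Laplace problem on a regular grid. Real
    # atmosphere/ocean dynamical cores are vastly more elaborate, but the inner
    # loop has the same flavor: a stencil swept over a structured grid.
    import numpy as np

    n = 64
    u = np.zeros((n, n))
    u[0, :] = 1.0                      # a fixed boundary condition along one edge

    for it in range(20000):
        u_new = u.copy()
        # Each interior point becomes the average of its four neighbors.
        u_new[1:-1, 1:-1] = 0.25 * (u[:-2, 1:-1] + u[2:, 1:-1] +
                                    u[1:-1, :-2] + u[1:-1, 2:])
        if np.max(np.abs(u_new - u)) < 1e-6:   # stop once a sweep barely changes anything
            break
        u = u_new

    print("stopped after", it + 1, "sweeps")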

  • by Anonymous Coward
    Why not have a weather@home client to run on your computer? If we are going to search for intelligent life out there with seti@home, maybe we could tell them what the weather will be like when they arrive, with weather@home.
  • by Anonymous Coward
    [None of the cypherpunk/cipherpunk/writecode shared logins work, so for those of you who don't want to add another 90-year-old billionaire granny in Antarctica to the registration rolls, or allow even one of those NYT cookies into your recipe file (even for a single session), here's the article:]

    Climate Research: The Devil Is in the Details

    By ANDREW C. REVKIN

    Multimedia: http://www.nytimes.com/images/2001/07/03/science/sci_CLIMATE_010703_000.html

    In 1922, Dr. Lewis Fry Richardson, a British physicist with a penchant for grand ideas, described how to forecast the behavior of the atmosphere.

    He had details wrong but the basic concept right: a suite of equations that, when applied to measurements of heat, cloudiness, humidity and the like, could project how those factors would change over time.

    There was one grand problem. To predict weather 24 hours in advance, he said, 64,000 people with adding machines would have to work nonstop for 24 hours.

    Dr. Richardson pined for a day "in the dim future" when it might be possible to calculate conditions faster than they evolved.

    That dim future is now. But while much has changed, much remains the same.

    Supercomputers have answered Dr. Richardson's plea. Weeklong weather forecasts are generally reliable. But long-term climate predictions are still limited by the range of processes that affect the earth's atmosphere, from the chemistry of the microscopic particles that form cloud droplets to the decades-long stirrings of the seas.

    With its oceans, shifting clouds, volcanoes and human emissions of heat-trapping gases and sun-blocking haze, earth remains a puzzle, said Dr. Michael E. Schlesinger, who directs climate research at the University of Illinois at Urbana-Champaign.

    "If you were going to pick a planet to model, this is the last planet you would choose," he said.

    So even as the evidence grows that earth's climate is warming and that people are responsible for at least part of the change, the toughness of the modeling problem is often cited by those who oppose international action to cut the emissions of heat-trapping gases.

    And while American research centers once dominated this effort, they have recently fallen behind others overseas.

    By many accounts, the dominant research effort is now at the Hadley Center for Climate Prediction and Research, 30 miles west of London. More than 100 scientists there are using extremely powerful computers just to explore long-term questions. Several recent studies by the National Academy of Sciences found that other countries had provided superior supercomputers for advanced climate research.

    The academy found that efforts in the United States were hurt in the 1990's by a Commerce Department tariff of 450 percent on Japanese supercomputers. The tariff was lifted this spring.

    The results are vexing for American scientists, said Dr. Maurice Blackmon, director of climate studies at the National Center for Atmospheric Research in Boulder, Colo.

    Last week, Dr. Blackmon said in an interview, he met a climatologist from a Swiss university who was preparing to run a copy of the Boulder laboratory's most sophisticated model on a supercomputer in Bern "six to eight times faster than we can here."

    "That's the definition of frustration," Dr. Blackmon said.

    Even with the best computers, though, important parts of the climate puzzle still elude both the machines and the theoreticians, although progress is being made.

    Dozens of mathematical models of the atmosphere and things that affect it are being applied to the problem. The most ambitious of these, about 20 or so around the world, simulate not only the air but also the oceans and, increasingly, other dynamic features of the planet: its shifting sea ice and glaciers, its cloak of vegetation, its soils.

    These imagined earths are generated by supercomputers that tear through decades in a day, creating a compressed view of how the climate might behave if one influencing force or another changed.

    The biggest models have improved substantially in the last few years, with many no longer requiring "flux adjustments," essentially fudge factors that were once needed to prevent the machine-generated, theoretical climates from drifting out of the realm of the possible.

    The signal achievement in recent years has been the accumulation of evidence, much of it from advanced models, that rising levels of greenhouse gases in the air have discernibly warmed the planet.

    But moving beyond that general conclusion presents enormous problems.

    "We will of course improve our models," said Dr. Mojib Latif, the deputy director of the Max Planck Institute for Meteorology in Hamburg, Germany, "but I don't really see the biggest or most important results changing in the next 10 years."

    "In terms of policy," Dr. Latif said, "the models have done their job."

    But the models have not clearly answered a pivotal question: how sensitive is the climate to the intensifying greenhouse effect? In other words, how big is any coming climatic disruption likely to be?

    The models still predict essentially the same wide range that was calculated nearly 30 years ago: roughly an average rise of 3 to 8 degrees Fahrenheit if greenhouse gases double from the concentrations measured before coal and oil burning and forest cutting significantly altered the atmosphere.

    And that is a global prediction. When asked to predict local effects of global warming, say, on the Southwest or Europe, the margins of error grow, and competing models stray far and wide.

    For example, the change in climate in particular places in the models still varies markedly depending on how programmers start the simulation, that is, what values they pick for the initial conditions on earth.

    The first set of numbers plugged into the matrix of equations is always an educated guess, said Dr. Curtis C. Covey, a physicist at the Lawrence Livermore National Laboratory who compares the performance of various models.

    "Can you tell me what the initial conditions were in 1850? Can anybody?" Dr. Covey asked.

    In fact, some top modelers say even the most powerful simulations can be pushed only so far before they reach limits of usefulness.

    Dr. Syukuro Manabe, who in 1969 helped create the first model coupling the atmosphere and oceans, said in an interview that the most advanced versions had already gone too far.

    "People are mixing up qualitative realism with quantitative realism," said Dr. Manabe, who did most of his work at the Commerce Department's Geophysical Fluid Dynamics Laboratory in Princeton, N.J. He is now helping Japan create a $500 million supercomputing center in Yokohama that is expected to dwarf all the other climate research efforts.

    He explained that models incorporating everything from dust to vegetation looked more and more like the real world but that the error range associated with the addition of each new variable could result in nearly total uncertainty. Speaking of some climate models, he said, "They are more caught up in trying to show what a great gadget they have than in showing how profound their study is in understanding nature."

    Of course, Dr. Manabe said, the models still play a vital role in earth science, providing practically the only means of looking into the future, albeit through a cloudy lens.

    And there are still many ways to sharpen the picture, he and other climate experts said.

    First, there is improving resolution and speed. Though climate modelers use the same machines that help nuclear weapons designers and astrophysicists, they still face a big trade-off between detail and time.

    The most advanced models consist of several hundred thousand lines of computer code that divide the air, land and oceans into a grid of hundreds of interacting boxes. As conditions change in one box, the changes ripple through neighboring boxes.

    Until now, modelers had been forced to dice the atmosphere into a grid where each box was about 185 miles on a side. The best ocean models right now are composed of cubes about 85 miles across. The Hadley Center is creating a new model that will take the ocean resolution to cubes about 20 miles on a side, which is detailed enough to capture the important eddies that shunt heat and carbon dioxide from the atmosphere into the depths.

    But Dr. Geoff Jenkins, the director of climate prediction at the center, noted that the three-dimensional nature of the problem meant that each doubling of resolution required a 16-fold increase in computing. In tests, Dr. Jenkins said, the new model "completely clogged up" one of the center's supercomputers.
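    [A note from the poster, not part of the article: the 16-fold figure presumably comes from halving the grid spacing in each of the three spatial dimensions and also halving the time step to keep the numerics stable, i.e. 2 x 2 x 2 x 2 = 16.]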

    Many features of the earth that are critical to climate change remain much smaller than the model boxes, and so must still be approximated.

    Dr. Blackmon, at the National Center for Atmospheric Research, said features as important as California's Central Valley and the mountain ranges around it remained invisible.

    "We can't tell you anything about what's going to happen there," he said. To do so would require a grid of boxes 19 miles on a side, he said. To achieve that detail would require computers 1,000 times as powerful as those at the research center.

    Dr. Manabe said the goal of the Japanese project, the Frontier Research System for Global Change, was to use vastly greater computer power to accelerate model runs, doing more work in less time and providing a much finer-scale view of what lies ahead.

    The center will have 5,120 linked high-speed processors, able to perform 40 trillion calculations per second. The most powerful computers currently used for climate modeling have about 1,000 slower processors and crunch numbers at about a hundredth of that speed.

    But more brute computing power is only part of the solution.

    Ronald J. Stouffer, a senior meteorologist at the fluid dynamics laboratory in Princeton, said that the key to progress was to move ahead in three realms at once: in the models, in the basic research into the processes that are mathematically represented in models and in the measurements of environmental change that will allow the testing of models.

    "It's a triangle," Mr. Stouffer said. "Observations, modeling and theory. Any one can lead the other two for a while but can't lead much before you get stuck." That leads the climate scientists inevitably back from their simulated worlds to the real one.

    The modelers have been lobbying for more money, not just for their work but also for ongoing measurements of change in the oceans, atmosphere, polar ice and forests. The value of this work was illustrated this spring, many say, when 50 years of ocean temperature measurements showed warming that matched the models' projections.

    Other large mysteries still confront the researchers when they look earthward. Within clouds, for example, the chemistry and physics of the particles like soot and sea salt that form droplets are only slowly being revealed, scientists say.

    A small change in the way droplets form could have a large impact on the climate, said Dr. Jenkins, in Britain. He said that Dr. Anthony Slingo, another scientist there, found a decade ago that in theory, a decrease or an increase in the size of water droplets of just 10 or 20 percent "could either halve or double the amount of climate change you'd get."

    Eventually, laboratory work and observations should narrow that range, many climate experts say, but uncertainty will always remain.

    "The best we can do," said Dr. Manabe, in Yokohama, "is to see how global climate and the environment are changing, keep comparing that with predictions, adjust the models and gradually increase our confidence. Only that will distinguish our predictions from those of fortunetellers."
  • by Anonymous Coward
    ...and a butterfly
    cross the sky...

    Oh!, sh*t!! my prediction !! XDDD

    You can't determine simultaneously the position and momentum of an electron (with infinite precision). And you can't determine with infinite precision the temperature 7.5 meters above the ground at 8.00000W, 40.00000N, tomorrow.
    Nature is so beautiful!
    Un pepe sin firma (a Pepe without a signature)
    P.S.: None of these words is typed as it should be; we have a lot of equations to wave at you, and more reasons!!
  • by Anonymous Coward
    You wish.

    As a working scientist, I am almost flattered by the implication. Alas, most scientists are very ordinary in their abilities and knowledge in fields outside their expertise.

    Just because one was a whiz at differential equations, doesn't make one a top-notch economist, philosopher, or a judge of human nature.

    Think idiot-savant, not genius. It's at least closer to the truth.

  • Nawww, it's nothing to be worried about, just some Kansas T-storms that have wandered too far from home. Come back rain, we need you!!

    --
    Hang up and drive!
  • Well, I know the guy in charge of much of the government atmospheric research dollars being spent, and let me tell you something about our old porky, provincial Congress: it's a case study in why we need to let scientists, not politicians, set the direction of research. I should preface this by saying that the people who decide funding at NSF, the place where weather research gets its funding, are scientists themselves and are very technical. So don't think about proposing a new weather-control device or the like; they can see right past that kind of kookiness.

    Case 1: NCAR [the National Center for Atmospheric Research] in Boulder, CO is the nation's foremost weather and climate research organisation. They were one of the first customers for the CRAY-1, and if you have ever been to the Smithsonian Air and Space Museum in Washington, D.C., you can see one of NCAR's old Crays there. So, a couple of years ago NCAR was looking to buy a new supercomputer. They got bids from Cray, Toshiba and NEC. NSF, being the money, was then given responsibility to decide which machine they could buy. Carefully examining each proposal, the folks at NSF and NCAR eventually decided that the most bang for the buck would come from NEC [Nippon (Japanese) Electronics Corporation]. Now, you'd think we'd have had enough of this in the 1980s now that the Japanese economy is STILL in recession, but sure enough the folks at Cray cried foul and made unfounded accusations that NEC was "dumping" their supercomputer below cost. Needless to say, they got Congress involved and convinced one congressman to introduce a bill that would have eliminated pay for any NSF employee who okayed the purchase. HELLO! Try to save the American taxpayer a little money and they take away your salary? If you didn't know it before, it's corporations that control your money, not Congress. Corporations that want you to prop up a clearly failing business just because it is an American company and "we need to protect American jobs." Look, I don't want my tax dollars wasted on corporate welfare, and neither should you! As an epilogue, since SGI bought Cray and SGI wanted to do business with NEC, they convinced Cray to drop the dumping suit. Obviously the no-salary bill also did not pass in Congress, thank goodness.

    Case 2: North Magnetic Pole research. As any good Canadian will tell you, the North Magnetic Pole is in Canada. So, if you wanted to do research into how the magnetic pole affects weather patterns and such, where would be the best place to build your station? According to Congress: Alaska. Given that any point in Alaska is so far from the pole that a research facility there would be largely useless, Congress would basically rather waste money than actually spend it on useful research! If you want to study the North Magnetic Pole, you go to Canada, because that's where the North Magnetic Pole is. It's a no-brainer. Yet Congress even fouled that up.

    And you wonder why the U.S. is so far behind Europe in the weather business. If we ran it through the military like most European nations do, we could probably get better results, given the military budget compared to NSF's, but then we'd probably not get ANY weather reports, because all the research would be classified, like it is in Europe.

    Devo Andare,

    Jeffrey.
  • The Top 500 list is currently extremely biased in favour of massively parallel SPP architectures; shared-memory architectures are very underrepresented.

    Moreover, the Top 500 is based on a very unrealistic benchmark which basically estimates peak performance - which is not relevant in practice - rather than sustained performance.

    I've heard of a forthcoming revision of the top 500 criteria.
    --
  • They are referring to the Japanese earth simulator:

    <http://www.nec.co.jp/english/today/newsrel/0005/3001.html> [nec.co.jp]
    --
  • Suffice it to say that even the biased Top 500 admits that the NEC SX-5, which is the perfect example of a shared-memory architecture, can reach an effective performance equal to 96% of the theoretical peak performance, while for a typical SPP cluster it varies from 53% (Compaq) to 69% (IBM):

    http://www.top500.org/ORSC/2000/sx-5.html

    Real-life figures are more like 50-60% (NEC), 10-15% (IBM) and 5% (Compaq).
    --
  • Your perception of added heat over the last ten years is induced either by drugs or your imagination. The warming over the entire century is measured in the low single digits. You do not feel it.

    The last ice age went from a climate that was significantly warmer than ours today to glaciers everywhere in about a hundred years, one of your geological instants.

    Did you ever wonder why Newfoundland was called Vinland by the Vikings? Anybody who thinks of growing wine grapes there today would be mercilessly laughed at, but it was a natural thing a thousand years ago. The times we live in are abnormally cold. It only makes sense that things would be getting warmer.
  • Interesting article, but it did ramble a bit.

    First of all it moaned about the 450% levy on Japanese supercomputers, but neglected to mention that a fair number of meteorological and climate supercomputers are Cray T3Es (including the Hadley Centre's, mentioned in the article)!

    Next it questioned whether reducing the grid size really improves accuracy, and then spoke about reducing the grid size as a good thing!

    This article just looks like an attempt to lobby the US government. I can't imagine that 'W' is that keen on climate research!
  • Whether you believe in global warming caused by man or not, surely you can see that taking steps to reduce greenhouse gas emissions is just progress in the efficiency of manufacturing. Progress always requires up-front investment. The sooner you invest, the sooner you get the cost-saving returns, and therefore you get more returns, because you get them from now onwards instead of from ten years' time onwards. Business is becoming too short-sighted these days.
  • The UK Met Office (of which the Hadley Centre is a part) have used US supercomputers for a long time now (and those were chosen via a completely free competitive procurement process). As the original NYTimes article states, they are now recognised as being one of the world leaders in climate research. The modelling code, the Unified Model, a very large and complex model (used for weather forecasting as well as climate prediction) was ported from vector shared-memory C90 to RISC distributed-memory T3E. That required effort, but it shows that it can be done.

  • Wired magazine also has an article in this month's issue (so it won't be online until next month) about supercomputing used in gene research. IBM is building a 100 million dollar machine that will achieve 1 petaflop in order to attempt to simulate protein folding faster than ever before.

    It's a pretty cool article.
  • Quoth the article:
    Weeklong weather forecasts are generally reliable.

    Do you agree with this? I hardly trust the weather forecast for tomorrow, much less next weekend. The two sites I usually check are The Weather Channel [weather.com] and Weather Underground [wunderground.org]. They often directly contradict each other, differing by up to 15 degrees F and 50% in the chance of rain.

    Are there any studies tracking the accuracy of various forecasters? If the weatherman predicts rain for three days from now, should I cancel my picnic or just figure that an average July day is 65-85 degrees with a 30% chance of rain? Maybe the weather here in Michigan is more variable than other regions. Do you guys actually trust the predictions in your town?

    AlpineR

  • Heh, as far as I am concerned, if you want em, you can have em.
  • Speaking of the weather... Here in the Boston MA area we've had some pretty incredible thunderstorms in the past few weeks. Now I realize thunderstorms are a natural occurrence, etc. However, these went on for about four hours or so. And the crazy part is, there was non-stop lightning for that same amount of time. By non-stop I mean a lightning bolt every 2 or 3 seconds. It was truly impressive to watch, but also made me wonder. In my 22 years of existence I've never once seen a storm quite like these - are they preliminary signs of global warming, or just freak storms?
  • We're talking about global climate here, fundamentally - not local climate. How much impact do cities have on the global climate? That's the more relevant question here.

  • each of the sensors is not quite calibrated to work on a holistic sense, that is to say, where the sensor might project a temperature increase or decrease, and a resultant (resultent?) crop change and withering, the problem remains that the prediction is not the final answer.

    My dear AC. You are talking gibberish.

  • Why ? Simply because a lot of money and big businesses are at stake here over this problem. And where there's money... there are a lot of people ready to lie to get it (hello Mr Bush !!!).
  • I don't understand why that graph scares you. It should reassure you that things are not as clear-cut as anyone would like them to be.

    For instance, according to the graph, we have the highest CO2 levels ever in the past 500k years. Why isn't the temperature also the highest?

    The CO2 level has been going up and down, but it went from its lowest point to a high point in the pre-industrial age!

    There is no "Normal" temperature of the Earth. Sometimes it's hot, sometimes it's cold.

    So, now that we can affect our climate, what temperature do we want? Growing grapes in Greenland? Have our ice delivered to us in Chicago by iceberg? Changing society to remove the CO2 out of the air is just as reckless as pouring CO2 into it if we don't have a goal in mind.
  • The issue isn't whether the weather converges to a stable attractor or not, but whether its exact point on the attractor is knowable or not. Even with the millions of nanobots soaring through the air, we will still not know enough about initial conditions to determine the future very far in advance.

    This is what Gleick was referring to with his famous and often misunderstood reference to the butterfly in Tokyo causing a hurricane in New York City two months from now. He was saying, rightfully, that even the flap of a butterfly's wings is enough to throw off initial conditions enough to make the weather totally unpredictable at some future point.

    The bottom line is that when you don't know the exact initial conditions (unknowable ever, it seems) and you don't know the exact nonlinear equations which make up the model (which do seem to be improving) you are really just making really high-powered guesses about what is coming next. And my grandma with her bum knee can still do a better job of predicting rain than the weather man on tv.

    Crash
  • This small weather forecasting company has been using an MPI setup on a small IBM cluster fairly successfully...

    http://www.foresightwx.com [foresightwx.com]

    They list the type of IBM cluster they're using on that main page... So far they've been doing more with less, and are more accurate than other weather forecasting agencies to boot.

    If you're interested in the latest techniques in weather forecasting for the short and long term, check these guys out. They're based out of Boulder, CO, USA.

  • As others have mentioned, Linpack benchmarks like those used in the Top 500 are largely irrelevant for measuring performance of climate models. Details of this point [nap.edu] are found in the recent report [nap.edu] from the National Academies of Science on Improving the Effectiveness of U.S. Climate Modeling. Massively parallel machines typically achieve 10% of their theoretical peak speed when running climate models, while vector parallel machines (NEC, Fujitsu) typically achieve 33% of their peak speed.

    On a more theoretical note, Amdahl's law states that if a code has any serial portion, then the speedup limit is a function of the code, not of the number of processors. For instance, if 0.1% (measured in single-processor execution time) of the code does not parallelize, then regardless of how many CPUs you throw at the problem, the maximum speedup compared to a single CPU is 1000 (a short sketch of the arithmetic is at the end of this comment). Therefore fast VPP is the way to get more performance.

    Then of course, there's the human resource issue. It's a real challenge to find skilled software engineers and then to convince them to accept low-paying government salaries.
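    Here is the sketch of that arithmetic mentioned above (the textbook Amdahl formula, nothing specific to any particular machine):

    # Amdahl's law: with serial fraction f, the speedup on N processors is
    #   S(N) = 1 / (f + (1 - f) / N),  which tends to 1 / f as N grows.
    def amdahl_speedup(f, n):
        return 1.0 / (f + (1.0 - f) / n)

    f = 0.001                                    # 0.1% of single-CPU time is serial
    for n in (128, 1024, 5120, 1000000):
        print(n, "procs ->", round(amdahl_speedup(f, n), 1), "x speedup")
    print("limit as n -> infinity:", 1.0 / f)    # 1000x, no matter how many CPUs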

  • America has some of the best, fastest, largest, and most powerful supercomputers on earth. Why aren't these being used by the scientific community? A couple of reasons... Any supercomputing installation that receives funding from the Federal government (ie, most installations) is required to allocate some percentage of available runtime (I think it's about 30% or somesuch) to the Department of Defense for classified research. I discovered this when using the University of Alaska Fairbanks's 'triforce' of Crays at the Arctic Regional Supercomputing Center.

    So what is all that time allocated to the DoD spent on? And why are the nation's largest supercomputers not used for general scientific research, such as weather, genetic research, high-energy physics? Because the most important thing for America to be spending its supercomputing cycles on is the modeling of nuclear weapons. Since we're not allowed to blow up small tropical islands anymore because Japanese fishermen don't like being randomly irradiated (you'd think they'd be used to it by now, after all didn't Hiroshima and Nagasaki make the point that they'd forever be our test subjects?), we spend as much of our cycles as possible modeling those explosions in massive chunks of supercomputing time.

    Maybe someday we'll be able to save the Known Universe from invasion by evil aliens because of our advanced understanding of the explosion of nuclear weapons. We'll be happy we didn't waste our valuable supercomputing cycles on something so silly as modeling climatological systems, processing astronomical data, working on cosmological models of the Universe, or dissecting the world of quantum mechanics. After all, having big guns is what makes America great!
  • You imply that the "top 500" web site is complete. As of last check, I knew the location of machines that were not on that list, and there must be ones I don't know about.

    Just because there is a website devoted to something doesn't make it canonical.
  • We know from chaos theory that a butterfly batting its wings in France can eventually cause a hurricane in the Atlantic, so the trick is to harness enough supercomputing power to analyze the flight paths of all the butterflies in the world, and use that to calculate the effects on weather patterns around the globe. Isn't science great?

    --

  • CO2 can hardly be considered a greenhouse gas

    Bollocks. CO2 is the greenhouse gas. Its effect isn't as strong as some others, but it makes up for it by being present (and being produced) in vastly bigger quantities.

  • You can bet there are machines not on that list which should theoretically be on it. Think NSA. Think French A-Bomb simulations.
  • I said French, not USA. I really, really doubt the French would entrust their research to other nation's computers.
  • Hm, interesting approach. Of course, it means that they won't be able to use as detailed a model as the Big Guys, since each one has to terminate in a reasonable amount of time as a slopsucker on a consumer system...
  • I doubt this is ever going to be really useful. Climate simulations are not the kind of problem that can be efficiently parallelized over a network with latency as high and bandwidth as low as the internet's.
  • Exactly. When you have a differential equation which is unstable with respect to the boundary conditions you typically get solutions with exponential divergence.

    Very roughly, that means that you lose one bit of precision per iteration. Therefore, to produce accurate long-term weather forecasts you would need observations made with hundreds of accurate digits after the decimal point, which is quite absurd.
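    If you want to see that one-bit-per-iteration flavor for yourself, any chaotic toy map will do. A minimal sketch (the standard logistic-map illustration, nothing to do with any real forecast model):

    # Two runs of the chaotic logistic map x -> 4x(1-x), started 1e-8 apart.
    # The gap roughly doubles each step, so around 27 steps are enough to erase
    # eight decimal digits of knowledge about the initial condition.
    x, y = 0.40000000, 0.40000001
    for step in range(1, 31):
        x = 4.0 * x * (1.0 - x)
        y = 4.0 * y * (1.0 - y)
        if step % 5 == 0:
            print(step, abs(x - y))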

    Simple ideas for simpletons. Too bad climate isn't so neatly simple. The population of the subcontinent of India EXHALES more CO2 than the US expels in burning fossil fuels. Shall we nuke India, with the added benefit of a mini nuclear winter? That should do wonders for cutting the whole 2-5% of global CO2 emissions that humans are accountable for.

    Did you know CO2 has been building up in the atmosphere for the past 150,000 years? And we're still far below the level of atmospheric CO2 that was present in past significant warm events. From ice core studies, the evidence points to warming occurring BEFORE CO2 rises, not AFTER. The planet is warming, big deal. It was warmer in 1100 AD than it is today. That corresponds, by the way, to an explosion of agriculture worldwide, including in the Southwest US. Global warming is a natural process. Get over it. We'd have to cut CO2 emissions to 18th-century levels to even DENT the computer models. And for what? Questionable conclusions that humans are warming the planet? No thanks.

    Derek
  • Global warming is a natural process. The only consensus among scientists is that warming is occurring. No one believes the worst-case scenario of a 5 degree C change will happen. What IS controversial is whether humans are responsible. The science on that point is highly debatable. Most of the warming of the past century occurred before 1940. The ground-based readings are questionable because of the rapid urbanization of the northern hemisphere coupled with the heat-island effect. Many of the ground-based stations which used to be in rural areas are now surrounded by sprawl. Meanwhile, balloon and satellite observations of the upper atmosphere point to tropospheric *cooling*, which runs counter to the predictions of the models. Fact is, we don't know nearly enough. But we only hear from politicians and special interests, not from the majority of scientists, who are responsible enough to keep their mouths shut until the burden of proof justifies otherwise. We only hear from the same scaremongers who said in the 1980's that we were in for an imminent ICE AGE, in both the private and public sector.

    Here's just one of many who are raising real questions.

    http://www.osu.edu/researchnews/archive/nowarm.htm

    Derek
  • Here is a better link to the story (no reg needed):
    http://archives.nytimes.com/2001/07/03/science/03CLIM.html [nytimes.com]
  • Hell, I'm in England and we've had nearly a fortnight (two weeks) of sunshine. I can only conclude that, yes, it is the beginning of the end.

  • The best supercomputers aren't here because Americans are some superior people. They are here because 1) we have the money to buy them, 2) we need them for our nukes, and 3) American firms dominate the CPU market.

    Oxford is in the U.K. Our last president went there to study. Stanford is in California, which may or may not be a foreign country.

    Check your nationalism at the door when you talk about science. How many of the "American professors" other countries are "begging" for are of non-native extraction? American postwar science was built on the knowledge of refugee scientists. Wernher von Braun? Enrico Fermi? Edward Teller? von Neumann?

  • Actually, these are already in use, in particular to model things like hurricanes or tornadoes where the ROI (region of interest) is moving. I remember reading a DDJ article about this stuff a few years ago. Also, I think this is available in MM5. See http://www.mmm.ucar.edu/mm5/
  • You would expect the vice president of the most powerful country in the world to be driven around in an escort?

    I suddenly find myself wondering what kind of mileage the Pope-mobile(tm) gets...
  • Until now, modelers had been forced to dice the atmosphere into a grid where each box was about 185 miles on a side. The best ocean models right now are composed of cubes about 85 miles across. The Hadley Center is creating a new model that will take the ocean resolution to cubes about 20 miles on a side, which is detailed enough to capture the important eddies that shunt heat and carbon dioxide from the atmosphere into the depths.

    I wrote a research paper in college about global warming. One of the major problems is that all the models were using these 200-mile grid squares, so it is either raining or not raining across an entire 200-mile square.
    No matter what else you do, you have an inaccurate model at that point. Rain simply does not occur in 200-mile squares; it happens in small, localized patches much of the time. Thus, it is really not known how rain would be affected by, or how it would affect, global warming.

  • Picture the Cray T3Es as very large Beowulf clusters, with extremely nice network connections between the nodes.

    The IBM SPs seem like the weird hybrid... with multiple CPUs per "node".

    What made Cray money was their compilers. The auto-vectorizing capability made serial code run fast with no work. There is no "auto-Beowulfing" compiler yet. For work in that area, look for Co-Array Fortran or UPC (Unified Parallel C).

    But compilers are black boxes that I inherently don't trust, because I don't understand them; so if you want to get work done, the code should be handwritten using MPI and OpenMP. Of course, your mileage may vary.

    And the question for the day: which way is the net transport of CO2 at the atmosphere/ocean interface? Is it into, or out of? Last I knew, we didn't even know that!
  • Greenland was green when it got its name, folks

    No it wasn't. This was an early example of false advertising, aimed at persuading more Vikings to settle there.

  • More cars with fewer miles per gallon give you more CO2, which drives the greenhouse effect. Not that complicated, huh? Should be simple enough even for George W :-)
  • To get into the NY Times site without registering:

    Login: cyberphunk

    Password: cyberphunk

  • If complexity of parallel programming is the only factor affecting this important area of research, I think more and more computer scientists and programmers ought to be involved in computational science.

    I think Computational Science has suffered a great deal in the 1990's because of all the smart computer scientists and programmers hitching on to the dot-com bandwagon, and opting for all that showy stuff (which was bound to flop) instead of getting deeper into good solid research such as reducing parallel programming complexity.

    charmer
  • Distributed computing approaches are fine for computing small chunks of independent information. That's why it can work for SETI@home... You don't need to know what's happening in the other data sets to do the calculations. Each data set just needs to be analyzed independent of the others.
    The problem with weather/climate modeling is that, for each time step, a huge amount of data is needed to compute everything, and that local changes end up not being as local as you might think. An action in one part of the atmosphere can affect the atmosphere elsewhere. In other words, the atmosphere is an internally coupled system. You can't pull it apart spatially, at least not in any sort of reliable fashion. So, if you wanted to distribute the computing, you would have to give each client a buttload of previous data for it to do its calculations, on the order of tens to hundreds of megabytes. Then the computing power required to divide up this data, retrieve it, and place it in its proper position would be comparable to that of the original task.
    Essentially, the problem is too interconnected and complex for distributed computing, at this point, to be useful. It's not easily pulled apart into simple, discrete computations. These models are some of the most complex algorithms out there currently.
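    To make the coupling concrete, here is a toy sketch (plain NumPy, hypothetical sizes, no real network code): 1-D diffusion split across two "workers". Every single time step, each worker needs the other's current edge value (a ghost or halo cell) before it can advance, so you pay at least one network round trip per step, which is exactly what a high-latency link like the internet is terrible at.

    import numpy as np

    n, steps, alpha = 16, 5, 0.25
    rng = np.random.default_rng(0)
    field = rng.random(n)

    # Serial reference run: explicit diffusion with edge-replicated boundaries.
    serial = field.copy()
    for _ in range(steps):
        padded = np.concatenate((serial[:1], serial, serial[-1:]))
        serial = serial + alpha * np.diff(padded, 2)

    # "Distributed" run: two halves that must swap edge values every single step.
    left, right = field[:n // 2].copy(), field[n // 2:].copy()
    for _ in range(steps):
        ghost_for_left, ghost_for_right = right[0], left[-1]   # the per-step exchange
        pl = np.concatenate((left[:1], left, [ghost_for_left]))
        pr = np.concatenate(([ghost_for_right], right, right[-1:]))
        left = left + alpha * np.diff(pl, 2)
        right = right + alpha * np.diff(pr, 2)

    # True, but only because the halo was exchanged on every step; skip an
    # exchange and the halves silently drift away from the serial answer.
    print(np.allclose(np.concatenate((left, right)), serial))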

    -Jellisky
  • More specific info on the requirements for the Casino-21 [climateprediction.com] pilot client is here [rl.ac.uk]. No *nix client so far: "The current plan is to begin with a rough Linux port for in-house testing and possibly an alpha-release version. Based on registration results so far, the next port would be to a Windows platform (98 or NT)."
  • Heh... say hi to Agbeli for me... I went to school with him. And yeah, MPI works fine for machines running relatively new weather forecast models. I currently run an MPI-based LES model for stratocumulus research. Long-term climate models vs. weather forecast models, however, are completely different animals... especially when you consider the fact that we've been doing climate models for many, many years - way before the message-passing interface was dreamed up. Like the previous poster (from Argonne) mentioned, converting the old dinosaur code to newer machines is _not_ a SMOP. So MPI might be the way to go with newer weather models, but not okay for climate models. See? Cheers,

  • Bollocks. CO2 is the greenhouse gas. Its effect isn't as strong as some others, but it makes up for it by being present (and being produced) in vastly bigger quantities.

    Bollocks. :) CO2 is a greenhouse gas, but water vapor is by far the greenhouse gas with the greatest impact, insofar as it is present in quantities that make CO2 insignificant, as well as in a radiative-transfer sense. Check out the excellent book "Contemporary Climatology" by Ann Henderson-Sellers and Peter J. Robinson for more info.

    Cheers!

  • I thought that any complex iterative model, no matter how accurate, is no better than a wild guess, according to chaos theory. And I should know, I've seen Jurassic Park.
  • I seem to recall having read about this over a week ago at Kuro5hin [kuro5hin.org], so I'll make the same comment here that I left there:

    Why not make this a distributed computing task, similar to Seti@home? Just a thought...
  • Three of them are DEDICATED to weather or environmental work (Naval Oceanographic Office...

    Don't be so sure that that's for environmental work. Think undersea warfare..

  • Consider that all the critics of climate change are privately funded, often by the big industries with a vested interest in denying global warming. Who is compromised now? In fact, George Dubya acknowledges that global warming is a fact, but since we can't predict it accurately, well, hey, let's just play ostriches.
  • As we now have good models for why CO2 should cause temperature change,

    Do we? Those models aren't based on the greenhouse effect of CO2 (which is easy to predict, and which apparently can't raise world temperatures by more than a fraction of a degree). They're based on an assumed positive feedback from the greenhouse effect of water vapor: the idea being that that fraction of a degree rise from CO2 will increase evaporation from the oceans, which will put more water vapor in the air to cause a *real* greenhouse effect.

    Needless to say, this is a hard theory to quantify; hence the need for all the supercomputers.

    but not the other way round,

    And this is just wrong. Ever opened up a hot can of soda?

    There are currently [colorado.edu] 720 billion tons of carbon in the air. Sound like a lot? It's nothing compared to the 39 trillion tons of carbon in the oceans. And when you heat up the oceans, what happens? Same thing as in your soda: the solubility of carbon dioxide changes, and CO2 is released into the air. If the temperature goes up, the ocean releases CO2. If the temperature goes down, the ocean absorbs it.

    So which is happening? Do temperature changes cause CO2-level changes, or vice versa? Hell if I know. I used to believe the latter; now I'm leaning toward the former. The most convincing piece of evidence I've seen is a paper (published and presumably peer-reviewed by Science, although it's been quoted by quite a few more biased sources since then). "Atmospheric CO2 Concentrations over the Last Glacial Termination" [unibe.ch] has another graph showing those scary CO2/temperature correlations over about 12,000 years... but with some less scary conclusions. It seems in their ice cores, CO2 changes lagged temperature changes by 800 +/- 600 years.
  • Good thing Al "Earth in the Balance" Gore was saving the planet all those years in Congress and the White House by being driven everywhere in a 4-mile-per-gallon limo, eh?
  • I suspect they know a hell of a lot more about nonlinear dynamics (including "chaos theory") than you do. For instance, turbulent flows are generally chaotic, making it impossible to predict the path of a given particle for more than a brief amount of time. On the other hand, the resultant mixing can be easy to study statistically on a larger scale, and it is entirely possible that the longer the model runs the more accurate the aggregate results. Stated differently, it is entirely possible (even likely) that a system that is divergent when modeled on a small scale is convergent when modeled on a large scale.

    Climatological modelers were among the first to realize the implications of Lorenz's results. They, and most other scientists in fields involving nonlinear dynamics, have spent the last three decades digesting and incorporating both the results spawned by his discovery, and earlier "forgotten" results like those of Poincaré. That's not to say all climatologists have a sufficient understanding of chaos, but the community as a whole has long been aware of it.

    As to the politics of the situation: just follow the money -- as usual.

    -Ed
  • I am a physicist with a strong background in chaos theory and I am also one of the team who ported (but not yet optimized) the modified Boulder model on the NEC supercomputer in Lugano/Manno (not Bern, as incorrectly stated).

    So yes, the effort is multidisciplinary and - at least in Switzerland - climatologists are well aware of the implications (many of them are physicists).

    See <http://www.climate.unibe.ch> [unibe.ch] (I hope that NCCR/Climate gets a real website up and running soon) and <http://www.cscs.ch> [www.cscs.ch]

    --
  • The jury is in. The decision is done.


    The above statements are false. As someone who works for a weather company and hears a lot of theories, I can tell you it is still very much in question whether manmade greenhouse gases are causing global warming.



    Our current evidence suggests that increased surface temperatures are more likely caused by increased development (i.e., asphalt) near ground measurement stations. Also, cyclical sunspot activity results in a zipper-pattern of fluctuations in the radiated energy that reaches the earth over a roughly 11-year period. The above two observations *do* have substantial physical evidence to back them up, and are a better explanation for recent global warming than the greenhouse theory.

  • by sharkey ( 16670 )
    It must have already been submitted, since the article is over a day old

    That won't stop a /. editor. Even if it had been posted already today, it still wouldn't stop a /. editor from posting it again.

    --
  • I haven't worked in this field for 5 years, but even then the computers were fast enough to run the models several times with slightly different inputs. This is a standard technique to determine the sensitivity of the model to small errors in the initial values - classic chaos theory stuff.

    What happens is you find the vast bulk of the model output is essentially the same. The variability is in the exact location of fronts, exactly the type of stuff that has always been difficult to predict.

    Given a long enough time frame, everything falls under this uncertainty. So you still can't make long-term forecasts, but you *can* give decent 7-10 day forecasts if you have the flexibility to occasionally say that it's impossible to forecast the weather on some of those days. In the vast majority of cases that's good enough - it allows people to avoid scheduling activities when the weather is likely to be nasty.
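    That perturbed-run idea is easy to demo on a toy chaotic system. A rough sketch (Lorenz-63 with plain Euler steps and parameters of my choosing, not a real forecast model): start a handful of runs from slightly different initial conditions and watch the spread grow; while the members still agree, the forecast is worth something, and once the spread saturates it isn't.

    # Tiny ensemble "forecast" on the Lorenz-63 system: the same model run from
    # twenty slightly perturbed initial conditions. Standard textbook toy.
    import numpy as np

    def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
        x, y, z = state
        return state + dt * np.array([sigma * (y - x),
                                      x * (rho - z) - y,
                                      x * y - beta * z])   # plain Euler, fine for a demo

    rng = np.random.default_rng(1)
    members = np.array([1.0, 1.0, 1.0]) + 1e-4 * rng.standard_normal((20, 3))

    for step in range(1, 2001):
        members = np.array([lorenz_step(m) for m in members])
        if step % 500 == 0:
            spread = float(members.std(axis=0).max())
            print("t =", round(step * 0.01, 1), " ensemble spread =", round(spread, 4))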
  • Wow, could you please give us a brief synopsis of the research papers which back up your theory? You must be published in Science or other scientific journals if your research is that solid. I would love to read your articles, and am curious to know where you got your meteorology degree and what your PhD dissertation was on.

    You must be a very well-informed and well-regarded climatologist to be so absolutely sure of your facts like this. I look forward to reading your articles and resume.

    Thanks.
  • In the middle of last month (JUNE!) it snowed 14 inches in Bozeman, Montana.
  • You would expect the vice president of the most powerful country in the world to be driven around in an escort?

    Idiot.
  • The debate is centered on whether man or natural processes (cycles of flora and fauna, volcanoes) are driving the current trend. I have not seen any convincing evidence to support the existence of an anthropogenic phenomenon, and plenty to support the existence of natural phenomena.

    Care to present any of this evidence? I haven't heard of any natural processes which would significantly raise the amounts of greenhouse gases in the atmosphere over the last 50 years.

    BTW, I thought the debate was mainly about how the climate will change and how screwed we will be when it does. What caused it may make interesting research, but we aren't going to be able to stop it, whether it's anthropogenic or not.

  • OK. You are right.

    And the National Academy of Sciences (US) report on global warming, drafted by the senior US scientists in the field, is wrong. As is every other major collaborative meta-analysis of existing evidence.
  • You point out that the EPA and UN-funded scientists have found evidence of global warming. Notice where their funding comes from. If Exxon was paying the bill, these same guys would no doubt have found the opposite. Government and industry researchers don't get tenure.

    The UN didn't FUND the scientific research, only the report.

    There actually is very little debate on this topic anymore, either in the scientific community or among the VAST majority of world leaders (Dubya being an exception). The northern hemisphere is warming. Human-generated gases are a principal cause. Even with climatic fluctuations, the changes in the last 50 years ring clear as a bell. The situation is similar to the one about 10 years ago, when everyone except the tobacco producers was claiming that nicotine was addictive. Well, guess what - it is the most addictive substance on the planet.

    No one stands to make a tenth the money from global warming as the energy companies stand to make from selling energy. The bottom line is that there is really no way to get out of it without using clean energy or less energy. Clean energy costs more, and less energy stunts growth. Either way there are economic consequences.

    But you have to think about the consequences of failing to act quickly enough. New Orleans is under sea level. So is Amsterdam. Virtually all the US east coast cities south of DC will be under water - places like Virginia Beach, Charleston, Savannah, Miami. And that pales in comparison to what will occur in low lying areas in Asia. (Note: so far Antarctic ice is unaffected. If it continues to be unaffected there will be no oceanic rise of note)

    Besides that - entire ecosystems in the northern hemisphere are shifting climate so rapidly that native species do not have time to migrate further north. The amount of warming in the last 50 years can be translated to a shift in latitude. And in some areas, notably Alaska and Siberia, these shifts are killing native species.

    I think if you do your research a little and read the works of the scientists who study climatic change, you would have very little doubt indeed about whether there is northern hemispheric warming or whether human generated gases are at least substantially to blame.
  • Our current evidence suggests that increased surface temperatures are more likely caused by increased development (i.e., asphalt) near ground measurement stations.


    There are many different ways to come to conclusions regarding things like global warming. That is why Bush asked the National Academy of Sciences to create a report. In this, the NAS gives academic independence to its members who work in the field of climate change. The committee was made up of 11 of the nation's top climate scientists, including seven members of the National Academy of Sciences, one of whom is a Nobel Prize winner. You can note that they do not support Bush's stance that the evidence needs to be further evaluated before taking substantial action, which indicates that the source of funding for the report is not biasing the results.

    http://www4.nationalacademies.org/onpi/webextra.nsf/web/climate?OpenDocument

    They note that greenhouse gases are increasing. CO2 is mostly to blame. It is mostly human generated. One of the most compelling pieces of evidence is the cooling of the stratosphere. Urban warming lacks adequate explanatory power. But don't believe me - read the report.

    Note that this is one way of forming an argument - relying on the consensus opinions of experts in the field. You could similarly rely on the opinion of a rogue in the field that others do not agree with. Sometimes the loner is right, most often the consensus is right.

    Any grassy knoll believers ?
  • Actually that makes me wonder.

    Does the Heisenberg (?) uncertainty principle apply at a macroscopic level?

    If we observe the weather, can we do so without modifying it ?

    fwiw, i've always wanted to somehow work that into my defense whilst protesting a speeding ticket..

    "boy, i clocked you doing 77mph back there"

    "do you know which way i was going ?"

    "yes"

    "then you couldn't possibly have known how fast I was going"

  • My Bluetooth SETI@home comment was really about the problem of data gathering. If people can buy a Bluetooth weather station module and put it on their roof, in their garden, or in the house, we could see a huge increase in sampling. Yes, we still need more sampling at higher levels, but it is about evolving greater sample spaces to match our increasing processing power. As a further response suggests, we could make radical advances in the future with nanoprobe technology (resistance is futile). Either way, I still believe the current "boundary" of predictability will be broken in my lifetime.
  • You are a quitter! Simple as that! I fully expect to see major advances in weather prediction techniques in my lifetime (I suspect I have 25-100 years to go :-). Take Moore's Law, the internet and human wisdom... I have hope. Perhaps they will make a Bluetooth device that lets you contribute to the new SETI@home and monitor your local weather. Perhaps the Newtonian Mozart of the digital age will come up with something way out there that leaps us forward. Chaos theory exists, but where is the threshold of understandability? Can we understand the system well enough to know where it is? Are we sure nothing will make a difference?
  • The IPCC report is a sham. The scientific conclusions were ALTERED to match the political conclusions. This is a fact, which caused much crying of foul in the scientific community. "Nature" was caught up in the controversy when the scientific integrity of the report was questioned. Specifically, sections of the report which clearly stated that the models were imperfect and could not with any certainty predict future climate events were REMOVED from the report.

    The current climate models are not even predictive - they can not even recreate past climate events, let alone predict future ones.

    As for the report put in front of G. Bush, it's an even bigger fraud. The scientists who signed the report DID NOT WRITE IT. It was written by functionaries. Several of the signees were global warming sceptics who do not support anthropogenic climate change hypotheses.

    Derek
  • ...to blame it all on the damned butterflies...

  • Looking at the article:

    http://archives.nytimes.com/2001/07/03/science/03CLIM.html

    I can see that they're standing beside many racks of SGI Origin 3000 gear (whether it's a single machine or many smaller machines depends on the cabled configuration -- O3K uses a mesh of cables rather than a backplane for its third-generation ccNUMA architecture).

    At any rate, I'm curious as to which company they'll be buying their new system from. The only clue I can gather is their mention of 5120 CPUs to churn out 40 TFLOPS... that's 7.8 GFLOPS per CPU, which pretty much rules out SGI or even the latest Pentium/Xeon, Itanium, UltraSPARC III, or Alpha (unless one uses a very Apple-esque SIMD benchmark, but I would imagine they want and need something more flexible for true performance with SISD, SIMD, and MIMD combined, rather than just bragging rights for a high single benchmark and top spot on the Top500). My guess is they're going with either the not-yet-released Cray SV2 (which will combine the parallel strengths of the T3E with the vector strengths of the SV1ex) or NEC's successor to the SX-5 (the SX-5 is marketed by Cray in the USA, but under its creator, NEC, in other countries).
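
    (For what it's worth, that per-CPU figure is just arithmetic on the numbers quoted above; a quick sanity check:)

        # Back-of-the-envelope check of the per-CPU figure quoted above.
        total_tflops = 40.0   # claimed aggregate performance
        cpus = 5120           # claimed CPU count
        print(f"{total_tflops * 1000 / cpus:.1f} GFLOPS per CPU")   # -> 7.8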

    Anyone have additional insight?

  • Nobody had the technology in 1950, and I doubt we do now, to measure the temperature around the globe and get a global, annual average accurate to a tenth of a degree. Forget it.

    Three words, buddy: satellite remote sensing. You're right about one thing - we didn't have that technology back in the '50s. But starting with the TIROS satellites, that all changed. Currently, there exists a plethora of satellite sensing systems which not only measure temperature, but moisture content, cloud cover, and a whole slew of other relevant parameters, all with reasonable accuracy.

    One other thing that bugs the crap out of me about "global warming". NOBODY EVER TALKS ABOUT CONCRETE, STEEL, AND ASPHALT!!! Hasn't ANYBODY ever noticed how hot a street or roof gets in the sun?!? I expect a few temperature measurements in growing cities would be more than enough to throw off their temperature measurements.

    Try this on for size:

    http://directory.google.com/Top/Science/Earth_Sciences/Meteorology/Urban_Climate/

    Just because you're not listening doesn't mean we're not talking about it.

    Greenland was green when it got its name, folks.

    Actually, Erik the Red named the island "Greenland" because he thought people would want to move there if the island had an attractive name. Read "Greenlander's Saga" for more information.

    Considering that plankton, not rain-forests as the greenies would like you to think, fix something like 70-80% of the CO2 in the atmosphere, it would appear that the earth is more than capable of absorbing whatever increase in CO2 we're providing.

    Care to post a reference? IANABiologist, but I would assume that water-borne plankton would absorb CO2 from the ocean - so the ocean (not the plankton) would have to provide the additional uptake of CO2 for the plankton to make a difference. But I could be wrong - I'm not really good with fish. :)

    There are two things to consider when talking about the climate: one, that the climate has definitely warmed in recent years. Two, we have yet to figure out why. Certainly climate models show the impact of CO2, but they're designed to do so. Other climate models use different, more natural phenomena to produce the same warming. Essentially, if you want to say that "X causes global warming", it's possible to create a model that will indeed show that X causes global warming. Climate modelling is just too complicated to come up with a definitive answer. But, given the possible outcome of anthropogenic climate change, do we simply ignore the problem as a Chicken Little scenario, as you tritely suggest?

    Buy my book! I just lost a fortune in tech stocks and I need money!

    Hehehe...did you invest all your money in DrKoop.com? :)



  • IBM has already been doing top notch work with the National Weather Service in forecasting. They worked with the NWS to develop a modeling system called Deep Thunder that could provide highly accurate predictions for a local area (25 miles or so). They apparently used it during the 1996 Olympic games in Atlanta to ensure that the closing ceremony would not get drenched. Read about Deep Thunder in this Wired article [wired.com] and on IBM's web site [ibm.com].
  • I think the moderators have demonstrated quite effectively the reality that modern weather/climate science is almost completely driven by the environmental extremists whose chief research tactic is to shout -- no scream -- down anybody who discovers something that doesn't fit into their neat little global warming agenda.

    I'll start to believe the possibility of significant human contributions to global warming and the so-called greenhouse effect when I see intelligent discussions of viewpoints that disagree with the current wacko political agenda.

    The first question I'd like to see addressed is how exactly the earth emerged from the last ice age without the assistance of the internal combustion engine.

    I was going to post anonymously, but then I got a grip and realized that all I'm going to suffer is the loss of a couple karma points on /. Like I could care.



    The best diplomat I know is a fully activated phaser bank.
  • actually, the first poster is entirely correct - although I cannot recall the exact equation of the uncertainty principle, it still holds true in the macroscopic world; it's just that macroscopic objects (which have wave functions too, a la the Schroedinger equation) are vastly larger than their associated quantum uncertainties. The process of the police observing your speed does change your direction (solid objects also have wave functions, they are just far less significant than for elementary particles), but to such a small extent that it is not appreciable:

    The radar pulse interacts with the molecules of your car, imparting a small amount of energy from a certain direction (like a gentle breeze), which does change the direction of the car, but to such a small degree that it is totally undetectable and only a mathematical curiosity.
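
    To put a number on "totally undetectable", a rough back-of-the-envelope sketch (the car's mass and the 1 cm position uncertainty are just assumed values for illustration):

        # Heisenberg uncertainty for a macroscopic object: dx * dp >= hbar / 2
        hbar = 1.055e-34    # reduced Planck constant, J*s
        mass = 1500.0       # assumed mass of the car, kg
        dx = 0.01           # suppose the radar pins the position down to 1 cm

        dv = hbar / (2 * mass * dx)    # minimum possible velocity uncertainty
        print(f"velocity uncertainty: at least {dv:.1e} m/s")   # ~3.5e-36 m/s

    That is some thirty-plus orders of magnitude below anything a radar gun could resolve, so the speeding-ticket defense is, sadly, a non-starter.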

    -Nano.
  • as if there is scientific consensus

    IMHO, there is already a scientific consensus on both of the things you mention. An overwhelming majority of climate scientists are convinced that we are responsible for the climate change. I believe that the few who disagree are industry lapdogs.

    What we don't have is a political consensus on reducing the emissions. Scientists are not listened to, especially when the right thing to do would be expensive and unpleasant. (I think GW Bush has not nominated a science advisor yet. If he has, please inform me.)

  • George Dubya admits there is global warming, just that we don't know how much. One of the predictions of global warming is that weather patterns will shift, which is what appears to be happening in your case, and also here in Melbourne, Australia. We have had five years of drought in what is normally very wet Melbourne, because the predicted weather shifts are happening. The only thing they didn't expect, according to my weather-scientist friend, is that this would happen so quickly and so soon.

    This has been due to high pressure systems from up North coming down South.

    The other interesting thing is that insurance companies are now very wary of insuring many areas susceptible to tropical storms. Florida actually had to force insurance companies to insure people who insist on building in dangerous areas. In Australia, there are now areas where insurance rates have gone up by hundreds of percent, and some areas just cannot get insurance.

    In fact, Hurricane Andrew drained several billion dollars from Australian re-insurers and sent them broke. Not again!

  • Any that are significant in terms of the list, that is - for example, any that would even make it into the top 100? Yeah, sure, I know of supercomputers that aren't on the list, but they wouldn't even make it (much less rank high enough to make a difference). Now, if you happen to know the location of a teraflop machine that's not on there, then please, enlighten us.
  • Well, let's see, there are a few that are listed at classified locations. HOWEVER, I can tell you where all the big nuclear simulations are done: Los Alamos Labs, on the ASCI computers (White, Red, etc). They do all sorts of things including weather prediction, population models and, of course, nuclear research. It's public knowledge where it happens, they just don't tell you what the results are :)
  • Yes, the Earth is warming in some areas, e.g. Siberia. But, this is totally expected if you look on a geological timescale, vis-a-vis the Ice Age cycle.

    It's warming right here in the USA. I don't need a scientist to tell me this; when I step outside the front door it feels hotter over the last 10 years.

    10 years is not a geological timescale. On a true geological timescale, a century looks like an instant.

    It's likely that 100 million years from now, the effects of the human race plotted against time are going to look indistinguishable from the effects of the comet that wiped out the dinosaurs. Given human nature, I doubt that there's even anything that can be done to avoid that.

    At the very least, we all get to witness one of the biggest events in the earth's history, like being a passenger in a huge slow-motion train wreck.

  • by RobertFisher ( 21116 ) on Thursday July 05, 2001 @02:23PM (#107774) Journal
    (First, a preface. I'm a member of a computational astrophysics research group [berkeley.edu]. We have ported our codes to the kinds of hybrid architectures of the machines discussed here, and have benchmarked their performance. Moreover, we have previously run on vector machines, so we have a fair idea of the pros and cons of the two approaches.)

    While zavyman points out the basic problems inherent in parallelizing any discretized numerical model, the problem of obtaining good performance on hybrid architectures like the IBM SP-2s and SGI Origins which currently top the Top 500 list goes much deeper.

    First, these machines are built around a hybrid architecture. Each node has a few processors (typically between 4 and 16, depending on the model), which utilize shared memory. These nodes connect to one another via an internode interconnect, with relatively modest bandwidth.

    While this hybrid architecture allows supercomputer manufacturers like IBM and SGI to scale into the thousands of processors, it also introduces substantial complexity into the building of high-performance codes. Ideally, one would like to run threads-based parallelization on each node and MPI between nodes, though the reality is that most codes in use rely on MPI alone.

    One can get decent scalability (into the hundreds of processors) when one runs physical models with limited communication -- i.e., those which simulate hyperbolic PDEs like those of hydrodynamics (as zavyman describes above). However, things become more interesting when one considers more varied physics, such as that involved in solving elliptic PDEs (such as Poisson's equation for self-gravity or electrostatics). Because elliptic equations connect everything with everything else on the spatial domain, the communication costs ARE MUCH HIGHER. It is extremely challenging to build a multiphysics code with such varied parallelization demands. Indeed, it is a fair statement that no one has yet achieved excellent performance on anything close to the thousands of processors available on these hybrid machines. For instance, another poster describes a climate model available from another research group. However, if you dig deeper, you find that they state:

    "ForesightWX uses an IBM 12-node system with 52 processors working 24 hours a day. The cluster fits snuggly in a small room. A decade ago the same power would have filled the building."

    52 processors is a far cry from the thousands of processors available to the users of these machines. Since each processor is slower than a vector processor like the Cray (by about a factor of 3 - 5), and assuming ideal speedup, such modest levels of parallelization lead to speedups of about 10-15 relative to a single Cray T90 processor. It is quite evident that there is little net gain over running the same simulation on 8-16 T90 nodes.

    Moreover, due to the hardware constraints described above, IT MAY VERY WELL BE THE CASE THAT WE NEVER SEE EXCELLENT MULTIPHYSICS PERFORMANCE ON THEM.

    (One can get better parallel performance by increasing the problem size, but as the article points out, doubling the resolution of a simulation increases the cost by a factor of 16; hence, simply increasing the problem size may lead to unacceptably long computation times.)
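
    (To spell out where that factor of 16 comes from under one common accounting: doubling the resolution in each of three spatial dimensions gives 8 times as many cells, and the CFL condition roughly halves the allowable timestep, so the total work grows by 2 to the 4th. A trivial sketch under that assumption:)

        # Work needed when refining a 3-D explicit simulation by a factor r:
        # r**3 more cells, and roughly r times more timesteps (CFL condition).
        for r in (1, 2, 4, 8):
            print(f"{r}x resolution -> {r**3 * r}x the work")   # 2x -> 16x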

    I think massively parallel architectures will ultimately be the wave of the future, but there is little getting around the fact that the current generation of IBM SP-2s are dogs in the performance category.

    Bob
  • by zebedee ( 26823 ) on Wednesday July 04, 2001 @11:10PM (#107775)
    This is a Monte Carlo approach (hence the name Casino-21). Each machine will run a completely *independent* climate simulation with no interaction with anyone else's machine. The point being that each simulation is set up with a slightly different set of options on the "control dials" of the model. The big ensemble of results will then help scientists determine the sensitivity of the climate to different effects.
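
    A minimal sketch of that kind of perturbed-parameter ensemble; the "model" and the dial names below are invented placeholders, not anything taken from Casino-21:

        # Each ensemble member is independent, so the work can be farmed out
        # to volunteers' machines with no communication between them at all.
        import random

        def toy_climate_model(co2_sensitivity, cloud_feedback):
            """Stand-in for a full simulation; returns a fake warming figure in C."""
            return 1.5 * co2_sensitivity + 0.8 * cloud_feedback + random.gauss(0, 0.1)

        members = []
        for _ in range(1000):
            dials = {"co2_sensitivity": random.uniform(0.5, 1.5),  # the "control dials"
                     "cloud_feedback": random.uniform(-0.5, 0.5)}
            members.append((dials, toy_climate_model(**dials)))

        warmings = sorted(w for _, w in members)
        print(f"median {warmings[len(warmings) // 2]:+.2f} C, "
              f"range {warmings[0]:+.2f} to {warmings[-1]:+.2f} C")

    The spread of outcomes across the ensemble, rather than any single run, is what tells you how sensitive the climate is to each dial.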
  • by CharlieG ( 34950 ) on Wednesday July 04, 2001 @06:53PM (#107776) Homepage
    OK, Moore's Law will solve a lot of the problems, but not all of them. One of the big problems is gathering input data! We have this huge system to model, and we only have datapoints every few hundred miles. The air column goes up tens of thousands of feet. Even if the government put a ground station on a grid of 10 miles on a side, you would still have to send up weather balloons to get readings of the air column (temperature, humidity, winds aloft, etc.) all the way up.

    So, there are HUGE holes in the data. Makes it hard to make a model
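
    Just to put a rough number on how huge those holes are, count how many surface stations a 10-mile grid would need (areas are approximate, and this ignores the vertical sampling problem entirely):

        # Order-of-magnitude station count for one station per 10 x 10 mile cell.
        earth_surface_sq_mi = 197_000_000   # approximate total surface area of Earth
        us_area_sq_mi = 3_800_000           # approximate total area of the US
        cell_sq_mi = 10 * 10

        print(f"whole Earth: ~{earth_surface_sq_mi // cell_sq_mi:,} stations")
        print(f"US alone:    ~{us_area_sq_mi // cell_sq_mi:,} stations")

    And each of those sites would still need balloon or remote-sensing profiles to fill in the column above it.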
  • by JPMH ( 100614 ) on Thursday July 05, 2001 @03:35AM (#107777)

    This graph [economist.com] is one of the scariest things I have seen in a long time. It's a plot of the temperature variations and CO2 levels over the last 500,000 years measured from ice cores drilled out from Lake Vostok in the Antarctic. The two series track each other incredibly closely.

    As we now have good models for why CO2 should cause temperature change, but not the other way round, it is something to take very seriously.

    The figure was taken from The Economist [economist.com] magazine, a paper not usually associated with extreme anti-business views. Two recent articles gave good summaries of our present state of knowledge about global warming, and how both the data and the models have improved over the last ten years:

    (Titles given are those used in the magazine's index [economist.com] of its environmental stories online.)

    One worrying new possibility is that there may be an abrupt change (bifurcation) in the ecosystem response as the temperature rises. At the moment about 50% of the manmade CO2 emissions are being absorbed by the Amazon rain forest. But the latest Hadley Centre models predict that if the temperature continues to rise, this greatly increases the frequency of much drier weather in this region, causing the forest to dry out, ultimately leading to uncontrollable forest fires. This would release vast amounts more CO2 into the atmosphere if the whole lot went up -- perhaps ten times as much as human activities emit.

    (And that is not even the ultimate nightmare positive-feedback scenario, which is the enormous amount of methane hydrate locked up at the bottom of the ocean and in the Arctic permafrost. The only thing that keeps it stable is the high pressure and low temperature. There is thought to have been a runaway destabilisation 55 million years ago, which raised the temperature 15 degrees C in less than 20 years.)

    I suppose somebody might come up with a techno-fix solution. But the complacency of gambling on that is like playing Russian roulette with five of the six chambers loaded.

  • by nels_tomlinson ( 106413 ) on Wednesday July 04, 2001 @10:43PM (#107778) Homepage
    I looked into this a few years ago. What I found was that the models predict a lot of stuff that just isn't happening; changes in weather patterns, huge increases in daytime high temperatures (up to 5 degrees C!), and so on. That suggests that the models suck, and there seemed to be no reason to think they'd work on the stuff we can't observe, when they don't work on what we can observe.
    I doubt that the situation has changed remarkably since then. One thing that I'm sure hasn't changed is that there is no shortage of really solid data to support both sides: that the temperature really has risen, and that it really hasn't. There are thousands of temperature time series, some direct and some inferred; some are climbing, some are falling, and most aren't changing significantly after controlling for all the relevant sources of variance.


    Globally it is likely that the 1990s was the warmest decade and 1998 the warmest year recorded (since 1861). Certainly this seems to be the case in the northern hemisphere not simply since 1861 but for the last ten centuries.


    Yep, I hope so. We are still coming out of a little ice age, returning to the higher temperatures which were the norm when the Vikings grew grapes in Newfoundland. The scary thought is that we might find out, in 100 years, that the temperatures are really going down.


    You point out that the EPA and UN-funded scientists have found evidence of global warming. Notice where their funding comes from. If Exxon was paying the bill, these same guys would no doubt have found the opposite. Government and industry researchers don't get tenure.


    There are literally thousands [sepp.org] of responsible scientists who work in these fields who believe that any sort of costly action to "avert global warming" is a bad, irresponsible idea. Some of them are Exxon employees, but certainly not all. Here [junkscience.com] and here (loosely related) [mit.edu] are a couple of random links which might help make the point that it isn't a settled issue in the minds of people who understand it and aren't funded by the Government or Greenpeace (HINT: both these groups find it easier to get money from the public if they can claim that the sky is falling.)

    In short, ad hominem arguments are less productive than usual here, since we see the usual suspects on each side of the issue. The energy companies are pushing their issue, Greenpeace is pushing theirs, and so on.
    We need to consider the consequences of being wrong. Seeing the global temperature rise by 1 to 2 degrees C is probably going to make the world a better place to live in the long run. That's the maximum likelihood prediction from most of the models that folks on either side take seriously. The doomsday 5+ degree C scenarios have very low probabilities under most models.
    Consider the cost of "taking action": Millions of people around the world, most of them already desperately poor, will die earlier and more miserably if we do anything to limit energy use. The only thing I can think of to reduce greenhouse gases without causing disaster is replacing coal with nuclear power. That isn't going to happen anytime soon, unfortunately, because of the same agenda that is driving the "it's getting hotter" side of the issue.

  • by kabauze ( 145859 ) on Wednesday July 04, 2001 @06:53PM (#107779)

    I hate it when the press makes it sound like America is the jack-ass backwards donkey of the supercomputing world. This writer implies the Japanese and Europeans have vastly superior computing power. This is clearly the notion of a chucklehead. Take a look at The Top 500 [top500.org]. By its (Linpack) metric, 8 of the top 10 machines are in America. Three of them are DEDICATED to weather or environmental work (Naval Oceanographic Office, National Centers for Environmental Prediction). A fourth one at NERSC is relatively open, compared to defense machines, and I'd be willing to bet weather code is running on it regularly. These are all teraflops machines. Japan has the other two in the top 10. Anybody know the job mix on those two? Europe's fastest machine is the Hitachi in Muenchen. The fastest dedicated European weather machines are the T3Es at the Deutscher Wetterdienst and at the UK Meteorological Office.

    I don't buy these whiny weathermen's complaints. The difference is that the American machines are all massively parallel machines (mostly IBM SP). The Japanese manufacturers all make vector machines, some of which the Europeans use. The Cray T3E is kind of a weird in-between architecture. It takes a good programmer to use an MPP to its full capability. The vector users, on the other hand, have 30 years of old code and practice which keeps them in the game. If the Americans would suck it up and learn to use their amazingly fast IBMs, we would hear whining from the other side of both ponds. If you try to run your old code for the Cray C90 on an IBM SP, you are going to get terrible performance. If you rewrite the code, you may get great performance. But these guys aren't rewriting the code. Take, for example, the machines at NCEP [noaa.gov]. These create the daily production weather models used all over the US. They are IBMs which replaced a Cray that self-immolated about 1.5 years ago. When they brought the new machines up, I wonder whether they rewrote the code beyond just making it run. If you know, enlighten us!

  • by bpowell423 ( 208542 ) on Thursday July 05, 2001 @04:42AM (#107780)
    Nobody had the technology in 1950, and I doubt we do now, to measure the temperature around the globe and get a global, annual average accurate to a tenth of a degree. Forget it.

    One other thing that bugs the crap out of me about "global warming". NOBODY EVER TALKS ABOUT CONCRETE, STEEL, AND ASPHALT!!! Hasn't ANYBODY ever noticed how hot a street or roof gets in the sun?!? I expect a few temperature measurements in growing cities would be more than enough to throw off their temperature measurements.

    Then, there's the well-ignored fact that we're coming out of a mini ice-age, which peaked circa 1850. Greenland was green when it got its name, folks. The earth got colder since then and is warming back up, completely without our assistance.

    And another thing... I saw just the other day that one of NASA's earth-monitoring satellites has recorded a 30% increase in the levels of plankton in the oceans over the last 10 years. That's not a prediction, folks, that's a direct measurement. Considering that plankton, not rain-forests as the greenies would like you to think, fix something like 70-80% of the CO2 in the atmosphere, it would appear that the earth is more than capable of absorbing whatever increase in CO2 we're providing.

    Really, these global warming people sound about as ridiculous as the Y2K people. The sky is falling! The sky is falling! Buy my book! I just lost a fortune in tech stocks and I need money!
  • by Troodon ( 213660 ) on Wednesday July 04, 2001 @06:43PM (#107781) Homepage
    If you're interested in lending a hand to such research into climate change, some folks at the Rutherford Appleton Laboratory [rl.ac.uk] would appreciate your help with their Casino-21 [climateprediction.com] distributed client. It's still in the preparatory stages (i.e. client coming soon [rl.ac.uk]), and requires a significant [rl.ac.uk] investment in terms of commitment compared to such things as SETI@home: "Casino-21 client will most likely require at least 128MB of memory, and 500MB of free disk space".
  • by Somnus ( 46089 ) on Wednesday July 04, 2001 @07:14PM (#107782)

    The article notes the objection of global warming skeptics as if there is scientific consensus that a) the build-up of so-called "greenhouse gases" causes the Greenhouse Effect (probably true) and b) that an increase in the concentration of greenhouse gases is anthropogenic (probably false):

    So even as the evidence grows that earth's climate is warming and that people are responsible for at least part of the change, the toughness of the modeling problem is often cited by those who oppose international action to cut the emissions of heat-trapping gases.

    Yes, the Earth is warming in some areas, e.g. Siberia. But, this is totally expected if you look on a geological timescale, vis-a-vis the Ice Age cycle. The debate is centered on whether or not man or natural processes (cycles of flora and fauna, volcanoes) are driving the current trend. I have not seen any convincing evidence to support the existence of anthropogenic phenomenon, and plenty to support the existence of natural phenomenon.


    *** Proven iconoclast, aspiring epicurean ***

  • by yellowstone ( 62484 ) on Wednesday July 04, 2001 @09:06PM (#107783) Homepage Journal
    <wag-about-the-future>
    Within the next, say, 100 years (prolly a lot sooner than that), we'll have the ability to release millions, even billions of nano-probes into the atmosphere and oceans (c.f. Stephenson's The Diamond Age). The air-borne probes can measure temperature, windspeed, and humidity. The water-borne probes can measure water temperature, currents, evaporation.

    Now imagine all these probes sending their observations back (in real time, perhaps using each other as repeaters to carry the signal) to a centralized data storage and analysis facility.

    Now imagine a massively parallel computer running simulations based on these observations... As another poster observed, there are bound to be limitations on any system that doesn't have perfect observations at infinitely fine granularity. Whatever those limitations are, I suspect we are not too far from finding out what they are.
    </wag-about-the-future>

    (for those who are wondering, "wag" is a technical term used in estimating -- it stands for Wild-@$$ Guess)


    --

  • by Rosco P. Coltrane ( 209368 ) on Wednesday July 04, 2001 @06:07PM (#107784)
    These crazy scientists are going to modify the weather pattern on Earth : as they progress in their weather simulations, they'll need more and more supercomputers, which in turn will run so hot they'll raise the temperature world-wide, which will make it harder for the scientists to simulate the weather, so they'll need more supercomputers ...etc... ARGHH, SOMEBODY STOP THEM !
  • by Anonymous Coward on Wednesday July 04, 2001 @07:48PM (#107785)
    Ok, I don't have a Slashdot account, so probably nobody will read this, but you should know a few things about the Top 500 list. The Top 500 list is not a measure of how fast a machine is at running weather codes. It is a measure of how fast a machine runs a benchmark called ``Linpack''. If you're trying to use this list to compare the capabilities of various computers with respect to predicting climate and weather changes, you will be misled. The main complaint of U.S. climate scientists was that they did not have access to a decent shared-memory vector computer like NEC's SX-5. That may be changing soon. The reason they want these is that they deliver real performance on real applications. The Top 500 list is a better indicator of theoretical performance. What you actually get varies widely depending on the types of tasks you ask the computer to do. Computers like IBM's SP are good at running Linpack benchmarks, but they are less capable at running climate simulations. It is partly a matter of programming, but it is also a limitation of the SP's high interconnect latency and low interconnect bandwidth. I wouldn't call the T3E weird. It's basically the same as the SP (hundreds of commodity CPUs, each with its own memory, connected by a network); however, its interconnect network is more sophisticated. Anyway, I can't explain everything in a little message, but I hope you at least understand that the Top 500 is not the whole story!
  • by zavyman ( 32136 ) on Wednesday July 04, 2001 @08:33PM (#107786)

    If the Americans would suck it up and learn to use their amazingly fast IBMs we would hear whining from the other side of both ponds.

    Great, what the hell do you think we are doing over at Argonne National Labs [anl.gov]? I mean, have you tried to parallelize an atmospheric climate modeler? I don't think so. Coding for a vector-based machine is pretty straightforward. You concentrate on one machine as if it had one processor and one bank of memory, and code away, occasionally noting if your loops are not easily vectorized. The compiler magically does the rest, and your program runs really fast. That's why the Japanese machines are nice.

    On the other hand, imagine designing an atmospheric climate modeler on a large cluster. The current paradigm being used and developed is MPI [anl.gov]. Let's see what you have to worry about. One, since the processors are not sharing memory, messages have to be passed between them to share data. No biggie. But now consider that the whole atmosphere has to be broken up into pieces of a grid. On the boundaries, grid points must be shared by two or more processors. At each timestep, those points must be synchronized. Code must exist that knows which processors border one another and which points to share.
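
    To make the boundary-sharing concrete, here is a minimal halo-exchange sketch for a 1-D decomposition, using mpi4py and NumPy. It is an illustration of the idea only, not the Argonne code; the field, grid size, and update rule are all made up:

        # Minimal halo-exchange sketch: 1-D decomposition of a periodic domain.
        # Hypothetical illustration only -- real climate codes are far more involved.
        from mpi4py import MPI
        import numpy as np

        comm = MPI.COMM_WORLD
        rank = comm.Get_rank()
        size = comm.Get_size()

        nlocal = 100                     # interior grid points owned by this rank
        u = np.zeros(nlocal + 2)         # +2 ghost cells to hold the halos
        u[1:-1] = rank                   # dummy initial data

        left = (rank - 1) % size         # periodic neighbours
        right = (rank + 1) % size

        for step in range(10):           # every timestep: exchange halos, then update
            # send my rightmost interior cell right, receive my left ghost cell
            comm.Sendrecv(sendbuf=u[-2:-1], dest=right,
                          recvbuf=u[0:1], source=left)
            # and the mirror exchange for the other side
            comm.Sendrecv(sendbuf=u[1:2], dest=left,
                          recvbuf=u[-1:], source=right)
            # trivial explicit update, standing in for the real physics
            u[1:-1] = 0.5 * (u[:-2] + u[2:])

    Run it with something like "mpiexec -n 4 python halo.py" (assuming mpi4py is installed). Even in this toy, the exchange happens every timestep, which is exactly why interconnect latency and bandwidth end up dominating.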

    Now what happens if you want to combine different models together, as in the atmosphere, ocean, land, and sea-ice models? This is known as a climate coupler. Well, now you have differing grids for each of the models because they were developed independently. Now your program must handle interpolation of the grid points and must again know which processors border one another so that data is efficiently transferred. Finally, there must be decent load balancing so that each processor is doing its fair share of the work.

    I'm now working on the climate coupler project here at Argonne. Vector machines are quite easy to program, but we do know that they will lose out to massively parallelized clusters. It's just that the programming is much more difficult, since the messaging is always the bottleneck. Communications performance improves at a rate similar to Moore's Law, but with a longer doubling time. For maximum efficiency, the programmer must handle the messaging model directly.

    The modeling group [anl.gov] here at Argonne understands the issue, and we are working on a general climate system to run quickly on parallelized machines. Now you know why you can't just do a simple code rewrite. You need to redesign the whole system.

    Accelerated Climate Prediction Initiative. [anl.gov]

  • by blakestah ( 91866 ) <blakestah@gmail.com> on Wednesday July 04, 2001 @08:35PM (#107787) Homepage
    The debate is centered on whether or not man or natural processes (cycles of flora and fauna, volcanoes) are driving the current trend. I have not seen any convincing evidence to support the existence of anthropogenic phenomenon, and plenty to support the existence of natural phenomenon.

    There have been about a dozen articles published in Science in the last year in which model after model of climate has been tested. Time after time the models have converged on one and only one solution: increases in greenhouse gases are responsible, and the increases parallel those produced by man. The jury is in. The decision is done. The only issue left is whether mankind can do anything about it, and whether we can live with it.

    Seriously, see the EPA's opinion: http://www.epa.gov/globalwarming/climate/index.html

    Also see the scientists commissioned by the UN to look into the problem - they also concur you are wrong

    http://www.ipcc.ch/

    To paraphrase, as at http://www.uic.com.au/nip24.htm:

    * Over the 20th century the global average surface temperature has increased by about 0.6 degrees C, more than was estimated up to 1994. This appears to be the largest increase in any of the last ten centuries.

    * Globally it is likely that the 1990s was the warmest decade and 1998 the warmest year recorded (since 1861). Certainly this seems to be the case in the northern hemisphere not simply since 1861 but for the last ten centuries.

    * On average, between 1950 and 1993, night time daily minimum air temperatures over land increased by about 0.2 degrees C per decade, lengthening the freeze-free period in many mid to high latitudes.

    * Since the 1950s the lower part of the atmosphere has warmed at about 0.1 degrees C per decade, as snow and ice cover have decreased in extent by about 10%, and Arctic sea ice thickness by more than this.

    * However, some important aspects of climate appear not to have changed, including storm frequency and intensity and the extent of Antarctic sea ice.

  • by s20451 ( 410424 ) on Wednesday July 04, 2001 @06:10PM (#107788) Journal

    as it applies to modeling the weather --- that is to say, modeling the planet

    The article's a little misleading. It starts with a discussion of the weather, then moves on to discuss modelling of the climate. It's basically impossible to predict the weather -- meaning the exact temperature, rainfall, cloud cover, etc. -- more than a week in advance, because you have to specify the model with essentially infinite precision or chaotic effects take over. In fact weather prediction was one of the earliest manifestations of chaos theory. The climate -- meaning long term averages -- can exhibit stable behaviour that is possible to model in the long term. Don't look for this technology to dramatically improve weather prediction.
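
    A toy illustration of why the chaotic effects win: the classic Lorenz system with its textbook parameters (this is not a weather model, just the standard demonstration of sensitive dependence on initial conditions):

        # Two nearly identical initial conditions diverge exponentially fast.
        import numpy as np

        def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0/3.0):
            x, y, z = state
            dxdt = sigma * (y - x)
            dydt = x * (rho - z) - y
            dzdt = x * y - beta * z
            return state + dt * np.array([dxdt, dydt, dzdt])   # forward Euler

        a = np.array([1.0, 1.0, 1.0])
        b = a + np.array([1e-9, 0.0, 0.0])    # differ by one part in a billion

        for i in range(5001):
            if i % 1000 == 0:
                print(f"t = {i * 0.01:4.0f}   separation = {np.linalg.norm(a - b):.2e}")
            a = lorenz_step(a)
            b = lorenz_step(b)

    Within a few tens of model time units the two runs bear no resemblance to each other, even though the system is perfectly deterministic. Averages over many such runs (the climate) can still be stable and predictable, which is the article's point.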

"No matter where you go, there you are..." -- Buckaroo Banzai
