
NASA Upgrades Weather Research Supercomputer

Cowards Anonymous writes "NASA's Center for Computational Sciences is nearly tripling the performance of a supercomputer it uses to simulate Earth's climate and weather, and our planet's relationship with the Sun. NASA is deploying a 67-teraflop machine that takes advantage of IBM's iDataPlex servers, new rack-mount products originally developed to serve heavily trafficked social networking sites."

Comments:
  • Big Question: (Score:4, Interesting)

    by Penguinisto ( 415985 ) on Wednesday September 24, 2008 @07:57PM (#25145193) Journal

    ...what are they doing to improve the algorithms used to calculate the results? And if they're transparent (i.e., open for public inspection) - bonus!

    (yes, I know that there are only a few folks in the human race who would even know how to read the things. That said, it would be nice to have something educational, and at the same time open to public scrutiny, so as to avoid political accusations, you know?)

    /P

  • GPGPU (Score:3, Interesting)

    by sdemjanenko ( 1296903 ) on Wednesday September 24, 2008 @08:29PM (#25145497) Homepage
    Seeing as 67 teraflops is going to be the new processing power for this machine, I wonder if an NVIDIA CUDA implementation has been considered. Their Tesla systems are designed for exactly this kind of high-performance computing, offer a significant amount of processing power, and are relatively easy to parallelize code for. I know that oil companies use these high-powered systems to find oil deposits, but I guess it's less likely for weather forecasting since there is less money in it. Still, it would be interesting to see these cards used for modelling hurricanes and determining their expected strength and path of travel more accurately.
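    For illustration only, here is a minimal CUDA sketch of the kind of data-parallel grid update a weather model repeats constantly: one explicit time step of a 2D diffusion-style stencil. The field name, grid size, and constants are made up, and this is not NASA's actual model code.

        // One explicit time step of a 2D diffusion-style stencil, the sort of
        // grid update a climate/weather dynamical core repeats billions of times.
        // Grid size, field name, and constants are invented for illustration.
        #include <cuda_runtime.h>
        #include <cstdio>

        #define NX 512
        #define NY 512

        __global__ void diffuse_step(const float* temp, float* temp_next, float alpha)
        {
            int i = blockIdx.x * blockDim.x + threadIdx.x;
            int j = blockIdx.y * blockDim.y + threadIdx.y;
            if (i < 1 || j < 1 || i >= NX - 1 || j >= NY - 1) return;  // leave the halo alone

            int idx = j * NX + i;
            // 5-point stencil: every interior cell updates independently,
            // which is exactly the data parallelism a Tesla card is built for.
            float lap = temp[idx - 1] + temp[idx + 1] + temp[idx - NX] + temp[idx + NX]
                      - 4.0f * temp[idx];
            temp_next[idx] = temp[idx] + alpha * lap;
        }

        int main()
        {
            size_t bytes = NX * NY * sizeof(float);
            float *d_temp, *d_next;
            cudaMalloc(&d_temp, bytes);
            cudaMalloc(&d_next, bytes);
            cudaMemset(d_temp, 0, bytes);

            dim3 block(16, 16);
            dim3 grid((NX + block.x - 1) / block.x, (NY + block.y - 1) / block.y);
            for (int step = 0; step < 1000; ++step) {
                diffuse_step<<<grid, block>>>(d_temp, d_next, 0.1f);
                float* swap = d_temp; d_temp = d_next; d_next = swap;  // ping-pong buffers
            }
            cudaDeviceSynchronize();
            printf("done\n");
            cudaFree(d_temp);
            cudaFree(d_next);
            return 0;
        }

    Each cell is independent within a time step, which is why this style of code maps so well onto a GPU. The real models are vastly bigger, of course, and also have to handle halo exchange between nodes.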
  • by EmbeddedJanitor ( 597831 ) on Wednesday September 24, 2008 @08:39PM (#25145591)
    These are the models predicting global warming, etc. They need to be open to peer review because of the significant impact of getting these models wrong.

    Faster does not mean better. I'd rather have fewer iterations per day on a good model than many on a crap model.

  • by lysergic.acid ( 845423 ) on Wednesday September 24, 2008 @09:45PM (#25146085) Homepage

    Perhaps NASA conducts so much peripheral research because there's no dedicated government agency for general scientific research.

    I know that we have NOAA for atmospheric research, but perhaps there needs to be an overarching government agency for scientific research in general. NASA, NOAA, and probably NIST would be branches or departments under such an agency, and all research that is pertinent to our societal advancement, but does not have a dedicated agency such as NASA or NOAA, would be conducted under this umbrella agency.

    After all, we should be funding public research into general science, not just space/weather/nuclear energy. If we want to continue to be scientifically & technologically relevant, then we need a broader scientific research strategy, as well as a government agency to coordinate that strategy among the various existing research agencies.

  • by Junta ( 36770 ) on Wednesday September 24, 2008 @11:15PM (#25146795)

    When an xhpl score says '67 teraflops' and NVIDIA/AMD GPUs spout off about the ludicrous number of gigaflops they have, it simply isn't the same thing.

    For example, the PS3 variant of the Cell processor claims 410 gigaflops. Its HPL score, however, would be about 6-9 gigaflops. Even the newer Cell processors 'only' get 200 gigaflops by xhpl count.

    32-bit precision scores aren't directly comparable to 64-bit operations.
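    As a back-of-the-envelope illustration, just dividing the numbers above (plain host code, not a benchmark of anything):

        #include <cstdio>

        int main()
        {
            // Marketing figure for the PS3 Cell: single-precision peak.
            double peak_sp_gflops = 410.0;
            // Rough midpoint of the 6-9 gigaflop HPL range mentioned above.
            double hpl_dp_gflops = 7.5;

            // xhpl runs 64-bit dense linear algebra, so it only exercises the
            // double-precision units, and only as fast as memory and the
            // interconnect can keep them fed.
            printf("HPL score as a fraction of single-precision peak: %.1f%%\n",
                   100.0 * hpl_dp_gflops / peak_sp_gflops);
            return 0;
        }

    A Linpack-style '67 teraflop' number already has all of that overhead baked in, which is why it can't be stacked up against a GPU's theoretical single-precision peak.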
