
Supercomputers Help Researchers Improve Severe Hail Storm Forecasts (nsf.gov) 23

aarondubrow writes: Researchers working on the Severe Hail Analysis, Representation and Prediction (SHARP) project at the University of Oklahoma used the Stampede supercomputer to gain a better understanding of the conditions that cause severe hail to form, and to produce hail forecasts with far greater accuracy than those currently used operationally. The model the team used is six times more finely resolved than the National Weather Service's highest-resolution forecasts and applies machine learning algorithms to improve its predictions. The researchers will publish their results in an upcoming issue of the American Meteorological Society journal Weather and Forecasting.
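
As a rough picture of what "applies machine learning algorithms" can mean here, the following is a toy sketch only: the features, training data, and model choice are invented for illustration and are not the SHARP team's published method. It shows the general idea of a learned post-processor that maps storm properties extracted from model output to a hail-size estimate.

    # Toy illustration only: features, data, and model choice are assumptions,
    # not the SHARP project's actual method.
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(0)

    # Hypothetical per-storm predictors pulled from forecast model output:
    # max updraft speed (m/s), hail mixing ratio aloft (g/kg), 0-6 km shear (m/s)
    X = rng.uniform([10.0, 0.5, 5.0], [60.0, 8.0, 40.0], size=(500, 3))
    # Synthetic "observed" maximum hail diameter (mm), used only to make the demo run
    y = 0.5 * X[:, 0] + 4.0 * X[:, 1] + 0.2 * X[:, 2] + rng.normal(0.0, 3.0, 500)

    model = RandomForestRegressor(n_estimators=200, random_state=0)
    model.fit(X, y)

    new_storm = np.array([[45.0, 5.0, 30.0]])  # one forecast storm's features
    print(model.predict(new_storm))            # predicted max hail size, mm
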
  • 'bout time.
  • by rmdingler ( 1955220 ) on Thursday March 24, 2016 @08:43PM (#51773569) Journal
    It's a great era to be alive.

    A hurricane named Ike hit Galveston in 2008. With plenty of forewarning, 37 total deaths are generally attributed to the storm's wrath. The usual number of folks decided not to leave despite advice to the contrary.

    Another storm hit Galveston in 1900 [wikipedia.org] with no warning. People were sitting on the beach when a wall of water hit that was taller than their houses.

    • Ike was pretty nuts. He sure did beat the shit outta Houston too. No power for months on account of some 200+ power lines being the only thing stopping the trees from falling on the streets. Then someone siphoned my gas.

      • Ike was pretty nuts. He sure did beat the shit outta Houston too. No power for months on account of some 200+ power lines being the only thing stopping the trees from falling on the streets. Then someone siphoned my gas.

        Yikes... I was pretty sure you were going to say, "Ike was pretty nuts. He sure did beat the shit outta Tina, too."

      • Re: (Score:2, Informative)

        by Rei ( 128717 )

        And the thing is, Ike could have been so much worse for Galveston. If landfall had been ~10 kilometers to the southwest (which in hurricane terms is just a wobble), he would have done to Galveston what he did to the Bolivar Peninsula [google.com].

        I was so mad at the mayor of Galveston, who kept playing down the building storm until the last minute out of fear of driving away tourist money. The NHC was taking the storm very seriously and giving warnings about its size, its growth potential and the potential range o

  • Deliver to their Senator Inhofe so he can throw it on the Senate floor.
  • by friedmud ( 512466 ) on Thursday March 24, 2016 @10:13PM (#51773915)

    Good story to read while I'm waiting on my current job on 9600 procs on my local supercomputer :-)

    Currently working on scalability of a new algorithm that will ultimately get applied to nuclear reactor simulation.

    As for this story... being a graduate of the University of Texas, I do have to snicker a bit that Oklahoma has to use our supercomputer ;-)

    • Just went and read the article... looks like some great work! Very apt work too... my parents (who live in the Dallas area) just shared a video of HUGE hail that hit them tonight. Incredibly dangerous storms... so it's great to see that good, predictive simulation is being developed to forecast it!

      • by Anonymous Coward on Friday March 25, 2016 @01:21AM (#51774457)

        I'm a bit jealous of your supercomputers. I'm three states to the north of you, in the Big Ten Conference, and I run jobs on the supercomputers at my school. I'm limited to 2,000 cores on each of two clusters. And I fuss about the queue to no end. I'm a meteorologist who runs lots of numerical simulations as well as doing many other things.

        Part of warn-on-forecasting is also trying to get better data into our weather models. We have lots of data right at the surface, but not a whole lot above the ground. That's very important, especially in the first 1,500 meters or so. One of the things I'm working on is using drones to collect data around severe storms within the first 1,000 meters above the ground and using that data to improve forecasts.

        There are some big projects going on in the meteorological community, with one of our big goals being the ability to reliably issue warnings with 30-120 minutes of lead time (warn-on-forecast). Not directly related, but the field phase of PECAN (Plains Elevated Convection At Night) just wrapped up in July 2015. That's for the purpose of better understanding nocturnal storms that produce heavy rainfall and severe weather. There's also VORTEX-SE (Verification of the Origin of Rotation in Tornadoes Experiment - Southeast) based out of Huntsville, Alabama, which is going on right now. It's similar to VORTEX (1994/1995), VORTEX99 (smaller campaign in 1999) and VORTEX2 (2009/2010), but targeting storms in the Southeast, which frequently produce tornadoes but in very different environments compared with their counterparts in the Plains.

        • You're posting anonymously, but if you can figure out how to contact me I can set you up.

        • My advice: partner with one of the national labs. In the nuclear energy area we have NEUP proposals that give small research grants to universities... the other major thing it does is allow them to access the national labs' supercomputing facilities.

          I'm sure that the Department of Energy has programs related to weather forecasting that would allow you to get access to the national labs' computing facilities. Feel free to message me and I'll see if I can help find some resources. The resources definitely e

  • by Anonymous Coward on Friday March 25, 2016 @12:48AM (#51774375)

    The article is fairly light on the details of what they're doing. It's also misleading.

    Warn-on-forecasting is highly unlikely to result in severe thunderstorm warnings issued hours in advance. Right now, warnings are generally issued when any two of these three things occur: 1) the atmosphere is favorable for severe or tornadic storms; 2) radar indicates that severe weather is probably occurring; 3) storm spotter reports indicate that severe weather is occurring. If spotter reports are credible enough, warnings may be issued on the basis of those even if the other two conditions are marginal. Warnings are issued based on observations of something that's happening now. Warn-on-forecasting will involve issuing warnings because numerical models of a storm indicate there is a high probability a thunderstorm will become severe or tornadic in the future. These warnings will still be based on actual storms that have already formed and are likely to be for lead times of an hour or two.
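
    For what it's worth, that "two of three" rule of thumb can be written down in a few lines. This is a toy sketch of the informal rule described above, not actual NWS warning criteria; the condition names are made up:

        # Toy version of the informal rule above; not NWS policy.
        def would_warn(env_favorable, radar_severe, spotter_report, spotter_credible=False):
            # Highly credible spotter reports can justify a warning on their own.
            if spotter_report and spotter_credible:
                return True
            # Otherwise require at least two of the three indicators.
            return sum([env_favorable, radar_severe, spotter_report]) >= 2

        print(would_warn(True, True, False))                          # environment + radar
        print(would_warn(False, False, True, spotter_credible=True))  # credible spotters alone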

    We already have predictions for when storms are likely to be severe hours or days in advance. The Storm Prediction Center issues convective outlooks as much as eight days in advance, and it also issues severe thunderstorm and tornado watches hours in advance. These products indicate areas where there's a high probability of severe weather, but they're not forecasts for individual storms; they're more general in nature.

    What's really going on is that researchers have data mined the characteristics of a very large number of thunderstorms to determine what characteristics are the best indicators of hail size. They've developed another algorithm for identifying and tracking individual thunderstorms, so they can track those characteristics over time. It seems like the algorithm may be an improvement over prior storm identification and tracking algorithms, though a huge number of these algorithms exist. They're using the algorithm to track storms in both observations and forecasts that come out of numerical models. They've also improved the grid spacing of the models, going from 3 km horizontal (the resolution of the HRRR model) to 500 m horizontal. Presumably they've added more vertical levels. When you increase the resolution, you also need shorter time steps in the model, otherwise you'll get what are called CFL errors and the model won't run (or won't run properly). I believe dt for the HRRR is around 18 seconds, so you'd probably have to lower dt to 3 seconds (maybe lower) for a 500 m model. Also, instead of running the model once, they're running an ensemble, meaning that they have 50 or 100 different model simulations with slightly different initial conditions so they can determine the probabilities.
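
    To make the time-step arithmetic above concrete, here is a minimal sketch of the CFL constraint. The maximum signal speed and Courant number are assumed values chosen only so the numbers line up with the 18 s and ~3 s figures quoted above; real models use split and adaptive time-stepping, so treat this as a scaling argument, not an HRRR configuration:

        # CFL scaling sketch: u_max * dt / dx <= C_max  =>  dt_max = C_max * dx / u_max.
        # u_max and c_max below are illustrative, not the HRRR's actual settings.
        def cfl_time_step(dx_m, u_max=100.0, c_max=0.6):
            """Largest stable time step (s) for horizontal grid spacing dx_m (m)."""
            return c_max * dx_m / u_max

        print(cfl_time_step(3000.0))  # 18.0 s at 3 km grid spacing
        print(cfl_time_step(500.0))   # 3.0 s at 500 m: 6x the steps per forecast hour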

    What's missing from this are three things, though those researchers may be working on them anyway. One is that forecasts are only as good as the observations going into the model and its initial conditions. Getting good initial conditions is a challenge for thunderstorms because you need good observations of the storm structure (radar can help with this, but more observations are needed) and good data on the storm environment. We have lots of surface observations, but few observations above the surface, and the first 1,500 meters or so above ground level are very important in how strong or tornadic storms become. Second, we need better data assimilation techniques so those observations translate to better initial conditions for our models. Finally, we can improve the models themselves in ways other than data assimilation and increased resolution. We don't directly simulate hail in models; there simply isn't the computing power to simulate individual hydrometeors (e.g., hail stones, raindrops, ice crystals, and cloud droplets). We parameterize those using microphysics schemes. We've improved the microphysics schemes a lot over time, adding more classes of hydrometeors and adding the ability to predict their size distribution and other data about them. Ge
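
    As a rough illustration of what "parameterizing" hydrometeors means: a bulk microphysics scheme doesn't track individual hailstones; it assumes a size distribution and predicts its parameters. The exponential form and the N0/lambda values below are generic textbook-style choices, not any operational scheme's settings:

        import numpy as np

        # Assumed exponential (Marshall-Palmer-style) size distribution for hail:
        #   N(D) = N0 * exp(-lambda * D)   [number per volume per unit diameter]
        # N0 and lam are illustrative values, not taken from a real scheme.
        D = np.linspace(1e-3, 50e-3, 500)   # hailstone diameters, m
        N0 = 4.0e4                          # intercept parameter, m^-4
        lam = 200.0                         # slope parameter, m^-1
        N = N0 * np.exp(-lam * D)           # size distribution, m^-4

        # Bulk schemes carry moments of this distribution, e.g. total number
        # concentration (0th moment); a numeric sum approximates the integral.
        dD = D[1] - D[0]
        total_number = float(np.sum(N) * dD)  # stones per m^3 in this size range
        print(total_number)                   # full 0-to-infinity integral would be N0/lam = 200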
