Meteorological Methodology and You
The hot button topic so far in this first week of meteorological spring has been the devastating tornadoes that have impacted the Southeast. Meteorologists drew praise from their peers but criticism from the general public over the warning processes they deployed to try to convey the seriousness of the situation. It was not all praise from the meteorological community, however. I observed some dialogue on Twitter in which certain NWS meteorologists questioned the Storm Prediction Center's use of the [for lack of a better term] "microscale" discussion. These discussions focused on individual supercell thunderstorms and extrapolated potential tornado strength. The criticism revolved around the SPC overstepping its boundaries and essentially issuing tornado warnings via these "microscale" discussions. The purpose of this excerpt isn't to discuss the pros and cons of that argument, but to look into a program I have been closely following that relates directly to these "microscale" discussions. Some examples of the MDs are shown below:
I have always been drawn to severe weather forecasting. My original motivation was to become a more successful storm chaser; over the past eight years, the focus has shifted squarely to improving lead time and awareness for the general population. To date, I have almost a quarter of a million subscribers and at any given time could have a couple of million social media interactions. Needless to say, being involved in any program that helps extend warning lead time and refine what is conveyed to the general public is something I take very seriously. Society today is thoroughly technological: the baby boomer generation is now as likely to have a smartphone as a seven-year-old. With the boom in technological innovation, the way we convey critical severe weather information needs to stay in lockstep. The days of "it hit without warning" are coming to a close. The media will need to come up with another faux headline when disaster strikes.
The "Warn on Forecast" [WoF] and "Forecasting a Continuum of Environmental Threats" [FACETs] projects hosted by the National Severe Storms Laboratory are continually testing the boundaries of advanced severe weather prediction and are on the verge of something that could be a game changer on how the general public receives warnings. Imagine the opportunity to receive an alert on your smartphone/device telling you that baseball size hail was going to hit your location in 90 minutes AND have it be over 50% accurate. Now picture getting an alert saying your specific location has an 80% chance of being impacted by a tornado in the next 30-60 minutes. Examples of each are shown below.
We are a long way off from this truly becoming reality, but the minds at work, along with the social science innovation and group participation from WAS*IS, are making that journey shorter by the day. The limitations as they stand are too great to ignore. Take the 2013 El Reno tornado as an example (shown below). This tornado was the widest on record and arguably had some of the strongest winds ever recorded by a mobile radar. It was also the tornado that claimed the lives of TWISTEX's core contributors. Using the current "obsolete" technology (WSR-88D), the progged track of the mesocyclone was displaced by tens of miles. If that data were communicated as is, the anticipated tornado track and subsequent warning would carry as high a false alarm ratio as our current warnings. The new multi-function phased array radar (MPAR) will be able to do several things not currently possible with the WSR-88D. Of greatest benefit is the capability to get data within a minute versus the three to five minutes currently available. Some of the experiments performed in the Hazardous Weather Testbed (HWT) show just how valuable this data is. Twelve forecasters were selected to recreate a warning scenario in which low-end tornadoes occurred. Using WSR-88D data (updating every 5 minutes), the average lead time for tornado warnings was 12 and a half minutes. Using the same scenario with MPAR data, the lead time jumped to 21 minutes. This type of data IS available to the NWS now, but due to the extreme cost of installing this radar, only the NWS in Norman utilizes it.
Until obsolete radars are updated, these projects aren't close to reality. This is not something I believe the NSSL is rushing to complete; why incorporate something whose data will only make marginal improvements? The vast amount of data that will be disseminated to the local NWS offices before and during severe weather events will be too much for one forecaster to handle. For those familiar with the HRRR, RAP, and other hi-res models, the incoming innovations, showing not only radar data but percentage-based probabilities, are going to blow minds. From an outsider looking in, I am very envious of those who get to work on and improve these projects on a daily basis.
Until we get the proper upgrades, sufficient funding, and scientific data, we're going to be stuck with reality. Quite honestly, the real world of severe weather watches and warnings isn't bad. The average lead time for tornado warnings is 13 minutes, which gives someone enough time to access, analyze, and act. The warning, however, is your last resort; for many, it is their first clue that bad weather is imminent. We need to get to a point in society where severe-weather-prone areas have a healthy respect for the power of Nature's Fury. The last event in Alabama once again showed that even the best forecasts from the Storm Prediction Center, the National Weather Service, and local media aren't good enough to command the attention of all the residents in the path of these killer storms. Maybe the next wave of weather innovation, such as WoF/FACETs, will make it easy enough for anyone to understand and take the proper action without questioning its validity.