Models Part 1
Those Pagosa Weather folks are always talking about weather models.
What is a weather model?
Weather models are mathematically derived projections of what future weather conditions will be.
A little history
Before computers were available to do the computations, forecasters used linear rules of thumb to forecast. For example, an area of low pressure will move southeast at XX% of the 500-millibar (mb) wind speed, blah, blah…
“The first one-day, nonlinear weather prediction was made in April, 1950. Its completion required the round-the-clock services of the modelers… took more than 24 hours to execute.” “The first regional system concentrating on North America, called the Limited Fine Mesh (LFM) model, was implemented in 1971. The first global model became operational in 1974.”
We were still using the LFM when I joined the Air Force in 1987, but it was slowly being replaced by the Nested Grid Model, or NGM. We received the two-dimensional model charts over a giant fax machine and would hand-color and analyze them.
Now there are over 50 weather models!
There are regional, global, short range, medium range, and long range weather models. Some weather models are built for specific needs such as hurricanes.
As a forecaster, it can be a bit overwhelming. With the assistance of computers, the analysis is done for us and we can view the output as a three-dimensional image.
But how do they work?
First, real-time data has to be ingested by a computer. Throughout the world, weather observations are taken manually and automatically at the top of every hour, 24 hours a day, 365 days a year, then encoded and sent to weather data centers. In addition, twice a day, at 00 Zulu (z) and 12z, weather balloons (yes, the ones that have been confused with alien aircraft for years) are sent up throughout the world.
The issue is that there are big data gaps over low-population areas and the oceans. Data from water buoys, aircraft nose sensors, wind arrays, etc., are also added to the ingested data. In addition, we can derive data from weather satellite imagery – a rapidly evolving technology. Once all the data is ingested, the weather models build horizontal and vertical grids. Each of these grid cells, or packets of atmosphere, has an estimated temperature, moisture, wind, and pressure.
Then the computer performs literally trillions of computations to estimate how all these packets of atmosphere will interact with each other, how they will move, and whether it will snow in Pagosa or not!
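To give a feel for what those computations look like, here is a toy sketch in Python. It is nothing like a real weather model – real models solve the full equations of fluid motion in three dimensions – but it shows the basic idea: split the atmosphere into grid cells, give each a value, and step forward in time as neighboring cells influence each other. The temperatures and wind value are made up for illustration.

```python
# Toy illustration of grid-based forecasting (NOT a real weather model):
# a 1-D row of grid cells, each with a temperature, under a steady west wind.
# Each time step, air from the upstream (western) cell is carried downstream.

def step(temps, wind, dt=1.0, dx=1.0):
    """Advance the row of cells one time step using simple upwind advection."""
    new = temps[:]
    for i in range(1, len(temps)):
        # the west wind carries the upstream cell's temperature into this cell
        new[i] = temps[i] - wind * dt / dx * (temps[i] - temps[i - 1])
    return new

# Five cells with a cold pocket in the middle; wind of 0.5 cells per step.
temps = [10.0, 10.0, -5.0, 10.0, 10.0]
for _ in range(3):
    temps = step(temps, wind=0.5)
print([round(t, 1) for t in temps])  # the cold pocket drifts east and smears out
```

Even in this toy, you can see why small errors in the starting data matter: every step feeds on the values from the step before, so an error compounds with time.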
Why do we need so many different weather models and how is it that they can be so wrong?
To keep things simple, I’ll discuss the two weather models we refer to most often in our discussions.
The Global Forecast System (GFS) is a global numerical weather prediction system run by the United States’ National Weather Service.
The European Centre for Medium-Range Weather Forecasts (ECMWF) or EURO is an independent intergovernmental organization supported by most of the nations of Europe and is based at Shinfield Park, Reading, United Kingdom. The Euro uses four-dimensional variational data assimilation (4D-VAR), while the GFS uses “3D-VAR”. For the Euro, that extra dimension costs 10 TIMES as much to run the computations, but the model also tends to be more accurate. The extra cost to produce the Euro generally means you have to pay to access most of the data.
Every model uses different topography. The finer the model’s resolution, the better the result, but the longer it takes to run the computations. The same is true for the overall resolution of the product. High-resolution 1km and 2km weather models require a huge number of computations and take more time to produce. For this reason, they tend to be short-range weather models that go out less than 4 days. It’s a primary reason we avoid forecasting specific storm conditions until we’re in that 4-day window.
I’ll use the GFS to give you an idea of how the model makes its computations.
The GFS is a “spectral” 13 kilometer (km) product for the first ten days. That means it has a data point every 8 miles and averages the weather conditions across the entire 8-mile grid (see picture 1). These points are in the horizontal as well as the vertical. But what happens if the coldest spot is between the data points? What if high winds funneling through a mountain pass are not ingested in the initial data? Then the algorithms the computer runs will be off. The error then gets amplified over time. This is a case of “garbage in; garbage out”.
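Here’s a quick way to picture what that averaging does to a small feature. The numbers below are hypothetical: imagine a 13 km grid cell sampled at thirteen points, with one very cold valley floor hiding among otherwise mild readings.

```python
# Hypothetical temperatures (F) at 1 km spacing across one 13 km grid cell.
# One point is a cold valley floor; the rest of the terrain is mild.
point_temps_f = [34, 33, 33, 32, 31, 12, 31, 32, 33, 33, 34, 34, 35]

# The model carries a single averaged value for the whole cell.
cell_value = sum(point_temps_f) / len(point_temps_f)
print(f"model sees {cell_value:.1f}F; coldest real spot is {min(point_temps_f)}F")
```

The averaged cell is roughly 31°F while the real valley floor sits at 12°F – the model never “sees” the cold spot, and any forecast built on that cell starts out wrong for the valley.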
After 10 days, the GFS becomes a 27km/17 mile resolution product. To put this in perspective, this means the model assigns the same atmospheric conditions to the town of Pagosa as it does to Pagosa Peak! It’s one of the reasons we call long-range weather models “fantasy land”.
That’s where we come in – we are the “Forecaster in the Loop”. It is our responsibility to first initialize the weather models.
Is the initial time period accurate? How does it differ from actual conditions?
Then we need to verify that the weather models have been handling the current system accurately.
Below I’ve posted some examples of levels we look at in this verification process. The first is the 300mb, which tends to average around 30,000 feet. We look at 300mb to determine where the Polar Front Jet is located. Next is the 500mb chart, which is usually around 18,000 feet. This chart includes a vorticity analysis, or spinning of the air. Another level we look at is 700mb, or roughly 10,000 feet, with a moisture overlay that is a good representation of mid-level moisture. Next we might look at the 850mb, or 5,000 feet, for temps and moisture. Now bear in mind, these heights are relative to mean sea level, so the 850mb is just an estimate because it is below Pagosa! Our surface is somewhere between 850mb and 700mb.
In summary, the models did not verify well. Verification is an ongoing, 24-hour-a-day process. Models are tweaked and updated regularly. That is why, at Pagosa Weather, we rely on the team at the National Weather Service for verification of the models. Essentially, we are tailoring their forecast specifically for Pagosa Country.
We’ll archive the information from this system and other systems as we go forward into a Terminal Forecast Reference Notebook (TFRN) for Pagosa. This will help us in the future when we see similar patterns and the models are struggling. We can say, “Hey, remember the Christmas storm of 2019!”
Even better than models, the best weather forecasters are folks such as old ranchers who are always watching the sky, have lived in an area for many years, and have seen weather patterns repeated again and again. We look forward to living in Pagosa Country for many years and someday I’ll be able to trust my old knee when the snow is coming.
– Arleen Prochazka
The charts I’ve posted are for Christmas Day, 2019. So on Christmas Day we will check how this particular model run verifies with this particular system. Finally, we look at the consistency of the model. If it shows the same amount of snow on each run, for example, with the same temperature and winds at a distant point in time, day after day, it is consistent.
However, if I look at an event 10 days out and the model shows snow, and then two days later it shows no snow for the same day, and then two days later it shows snow again, then that model is not being consistent and my confidence goes down. I have to investigate why it’s flip-flopping so much.
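That kind of consistency check can be sketched as a simple comparison across runs. The snowfall numbers and the 1-inch threshold below are made up purely for illustration; real forecasters weigh many more factors than spread alone.

```python
# Toy "consistency check": forecast snowfall (inches) for the same target day
# from four consecutive model runs (hypothetical values).
runs_consistent = [4.0, 4.5, 4.0, 4.2]   # model tells the same story each run
runs_flip_flop = [6.0, 0.0, 5.5, 0.5]    # model can't make up its mind

def spread(forecasts):
    """Run-to-run spread: the gap between the biggest and smallest forecast."""
    return max(forecasts) - min(forecasts)

for name, runs in [("consistent", runs_consistent), ("flip-flop", runs_flip_flop)]:
    verdict = "high" if spread(runs) <= 1.0 else "low"  # illustrative threshold
    print(f"{name}: spread {spread(runs):.1f} in -> {verdict} confidence")
```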
Another issue occurs with what we call the “warm” model and the “cold” model. Weather models run at 00z and 12z. Zulu is the time in Greenwich, England, because the prime meridian passes through there. It ensures that forecasters across the planet are all on the same time. At 00z, it is 5pm in Pagosa (standard time). That is the warm part of the day as compared to 5am (12z). The data at 5am (or cold model) includes a lot of inversions or anomalous data. The results from the 00z and 12z models can look quite different.
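If you ever want to convert model run times to Pagosa time yourself, Python’s standard library can do it. This assumes Mountain Standard Time (UTC−7), i.e. winter, with no daylight saving in effect; the December 25 date is just an example.

```python
# Convert Zulu (UTC) model run times to Pagosa local time.
from datetime import datetime, timezone, timedelta

# Mountain Standard Time is 7 hours behind UTC (winter, no daylight saving).
MST = timezone(timedelta(hours=-7), name="MST")

run_00z = datetime(2019, 12, 25, 0, 0, tzinfo=timezone.utc)
run_12z = datetime(2019, 12, 25, 12, 0, tzinfo=timezone.utc)

print(run_00z.astimezone(MST).strftime("%I %p"))  # 5 PM the previous evening
print(run_12z.astimezone(MST).strftime("%I %p"))  # 5 AM
```

Note that 00z actually lands on the *previous* calendar day in Pagosa – 00z December 25 is 5pm on Christmas Eve locally.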
Again, it is important to discern whether the models are initializing and verifying properly. To do it well can be a time-consuming process that occurs 24/7. That’s why a team of forecasters who do it around the clock is so important to the process – thank you, National Weather Service!
Today there are a lot of smartphone apps and internet sites that provide weather data. Since I got into the weather business 32 years ago, the changes in weather data have been mind-boggling. The data frequency and accuracy have increased tremendously, and that trend is likely to continue.
Want to read more about Weather Models? Check out this great blog discussion.
WEATHER MODELS Part II
As weather forecasters, it’s important for us, after a major system moves through, to look back and see how the models performed and how our forecast panned out, or verified.
A week ago I posted an article on what weather models are and how we use them. With that post I attached some graphics that were valid for Christmas day.
This post is about verification of the model for the 5-day outlook and Christmas Day. Overall, the forecast for Christmas Day was weak. A week ago, in our five-day outlook, we said we would get 2-5” of snow. We actually got 2-12” in that time frame (plus an additional 2-4” overnight). In addition, if the temperature in the lowest layers had been cooler, in-town locations would have received more snow.
First, look at the chart comparison of 300mb (30,000 feet). This is the level at which we look for jets. There is a jet max poking into southern California and pushing into southwest Colorado with southwest flow.