Although you can get a weather forecast from a great many different outlets, those forecasts all originate with a handful of sources: national weather services that run large numerical weather prediction models on their computer systems. Two of the major suppliers are the US National Weather Service’s [NWS] Global Forecast System [GFS] (the source for most US forecasts), and the European Centre for Medium-Range Weather Forecasts [ECMWF], located in Reading, England. Over the last few years, there has been a growing feeling that the US effort was not keeping up with the progress being made at ECMWF. The criticism became considerably more pointed in the aftermath of last year’s Hurricane Sandy. Initial forecasts from the GFS projected that the storm would head away from the US East Coast into the open Atlantic; the ECMWF models correctly predicted that Sandy would make a left turn and strike the coast in the New Jersey / New York region.
According to a story in Monday’s Washington Post, and a post on the paper’s “Capital Weather Gang” blog, at least one good thing will come out of this rather embarrassing forecasting error: the NWS is expected to receive additional appropriated funds to update its computers and the models they run.
Congress has approved large parts of NOAA’s spending plan under the Disaster Relief Appropriations Act of 2013 that will direct $23.7 million (or $25 million before sequestration), a “Sandy supplemental,” to the NWS for forecasting equipment and computer infrastructure.
This should go a long way toward addressing one of the most pressing needs for the GFS: more computing horsepower.
Computer power is vital to modern weather forecasting, most of which is done using mathematical models of the Earth’s climatic systems. These models represent various weather features, such as winds, heat transfer, solar radiation, and relative humidity, as a system of partial differential equations. (A fundamental set of these is called the primitive equations.) The equations are typically highly nonlinear; moreover, except for a few special cases, they have no analytic solutions and must be solved by numerical methods.
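To give a flavor of what these equations look like (this is a simplified textbook form, not the exact formulation used by the GFS or ECMWF models), the zonal (east-west) momentum equation from the primitive equations can be written as:

```latex
% Simplified zonal momentum equation from the primitive equations.
% u: east-west wind, rho: air density, p: pressure, f: Coriolis parameter,
% v: north-south wind, F_x: friction and other forcing terms.
\[
  \frac{Du}{Dt} = -\frac{1}{\rho}\,\frac{\partial p}{\partial x} + f v + F_x
\]
```

Each forecast variable appears in several coupled equations of this kind, which is why the system as a whole has to be stepped forward numerically rather than solved in closed form.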
The standard techniques for numerical solution of equations of this type involve approximating the differential equations with difference equations on a grid of points. This is somewhat analogous to approximating a curve with a number of line segments; as we increase the number of segments and decrease their length, the approximation gets closer to the true curve. Similarly, in weather models, increasing the resolution of the grid (that is, decreasing the distance between points) allows better modeling of smaller-scale phenomena. But increasing the resolution means that correspondingly more data must be processed and more sets of equations solved, all of which takes computer power. Numerical weather prediction, although it had been worked on for some years, only began to be practical in the 1950s, with the advent of digital computers, and the early weather models had to incorporate sizable simplifications to be at all practical. (It is not too useful to have a forecasting model, no matter how accurate, that requires more than 24 hours to produce a forecast for tomorrow.)
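To make the line-segment analogy concrete, here is a minimal sketch in Python of the finite-difference idea, applied to the one-dimensional linear advection equation (the grid size, domain length, and wind speed are arbitrary illustration values, not anything taken from an operational model):

```python
# Minimal finite-difference sketch: solve du/dt + c * du/dx = 0 on a 1-D grid
# with a simple upwind scheme. Purely illustrative; real models are 3-D and
# far more elaborate.
import numpy as np

c = 10.0            # advection speed (m/s), a stand-in for the wind
length = 100_000.0  # domain length (m)
n = 200             # number of grid points; more points = finer resolution
dx = length / n     # grid spacing
dt = 0.5 * dx / c   # time step chosen to satisfy the CFL stability condition

x = np.linspace(0.0, length, n, endpoint=False)
u = np.exp(-((x - length / 4) / 5_000.0) ** 2)   # initial "blob" being advected

for _ in range(500):
    # Upwind difference: du/dx is approximated by (u[i] - u[i-1]) / dx,
    # with periodic boundary conditions via np.roll.
    u = u - c * dt / dx * (u - np.roll(u, 1))

print(u.max())   # peak of the blob, reduced somewhat by the scheme's numerical diffusion
```

Note the coupling between space and time: halving the grid spacing also forces a smaller time step, so doubling the horizontal resolution of a real model multiplies the work by roughly a factor of eight (twice the points in each horizontal direction, plus twice as many time steps), even before any other changes.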
The computational problem is made worse by the difficulties inherent in data acquisition. For this type of numerical analysis, the three-dimensional grid would ideally consist of evenly spaced points, covering the surface of the Earth and extending upwards into the atmosphere. Clearly, this ideal is unlikely to be achieved in practice; getting observations from the center of Antarctica, or the mid-Pacific Ocean, is not terribly convenient. There are also ordinary measurement errors to deal with, of course. This means that a good deal of data pre-processing and massaging is required, in addition to running the model itself, adding even more to the computing resources needed.
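As a toy illustration of that pre-processing step (this is not the assimilation scheme any weather service actually uses), here is a sketch of mapping irregularly spaced observations onto a regular analysis grid with inverse-distance weighting:

```python
# Toy example: interpolate scattered "station" observations onto a regular
# grid using inverse-distance weighting. Illustrative only; operational
# systems use far more sophisticated data assimilation.
import numpy as np

rng = np.random.default_rng(0)

# Fake observations: 50 random station locations (x, y in km) and temperatures.
obs_xy = rng.uniform(0.0, 1000.0, size=(50, 2))
obs_temp = 15.0 + 0.01 * obs_xy[:, 0] + rng.normal(0.0, 0.5, size=50)

# Regular analysis grid with 25 km spacing.
grid_x, grid_y = np.meshgrid(np.arange(0.0, 1000.0, 25.0),
                             np.arange(0.0, 1000.0, 25.0))

def idw(gx, gy, power=2.0, eps=1e-6):
    """Inverse-distance-weighted estimate of temperature at one grid point."""
    d = np.hypot(obs_xy[:, 0] - gx, obs_xy[:, 1] - gy) + eps
    w = 1.0 / d ** power
    return np.sum(w * obs_temp) / np.sum(w)

analysis = np.vectorize(idw)(grid_x, grid_y)
print(analysis.shape)   # a (40, 40) gridded field, ready to initialize a model
```

Real systems do a great deal more than this: quality control, bias correction, and data assimilation schemes that blend new observations with the model’s own previous forecast, all of which adds to the computing load.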
Many observers point to the GFS’s limited computer power as one of the chief weaknesses in the US effort. (For example, see this blog post by Cliff Mass, Professor of Atmospheric Sciences at the University of Washington, or this post by Richard Rood, Professor at the University of Michigan in the Department of Atmospheric, Oceanic and Space Sciences.) The processing speed of the current GFS system is rated at 213 teraflops (1 teraflop = 1 × 10¹² floating point operations per second); the current ECMWF system is rated at 754 teraflops (and is listed at number 38 in the most recent Top 500 supercomputer list, released in November 2012, while the GFS system does not make the top 100).
The projected improvements to the GFS system will raise its capacity to approximately 2600 teraflops; in terms of the most recent Top 500 list, that would place it between the current 8th and 9th entries. (Over the same period, the ECMWF system is projected to speed up to about 2200 teraflops.) This will enable the resolution of the GFS to be increased.
The NWS projects the Sandy supplemental funds will help enhance the horizontal resolution of the GFS model by around a factor of 3 by FY2015, enough to rival the ECMWF.
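A factor-of-3 improvement in horizontal resolution implies far more than a factor-of-3 increase in computation. As a back-of-the-envelope sketch (my own rough estimate, not an official NWS figure), assuming the time step must shrink in proportion to the grid spacing:

```python
# Rough scaling estimate for a 3x horizontal resolution increase, assuming
# the time step shrinks in proportion to the grid spacing (a CFL-type
# constraint). Not an official NWS figure.
resolution_factor = 3

horizontal_points = resolution_factor ** 2   # 3x more points in each horizontal direction
time_steps = resolution_factor               # proportionally more (smaller) time steps

total = horizontal_points * time_steps
print(f"Roughly {total}x more computation for the same forecast")   # ~27x
```

The exact factor depends on how vertical resolution, model physics, and code efficiency change as well, so this is only a rough guide; but it shows why the resolution increase and the hardware upgrade have to go together.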
There are also plans to make other improvements in the model’s physics, and in its associated data acquisition and processing systems.
These improvements are worth having. The projected $25 million cost is a tiny fraction of the total Federal budget: roughly 0.0007 percent of the approximately $3.6 trillion spent in fiscal 2012. As we are reminded all too often, extreme weather events can come with a very large price tag, especially when they are unexpected. Better forecasts have the potential to save money and lives.