Why prediction is important

Numerical weather forecasting

What will the weather be like tomorrow? Computers have been used to answer this eternal question for decades. The forecasting process is complex: measured data on wind, temperature and other variables from around the world feed into the forecast. The extensive software of the weather forecast models is based on the fundamental laws of physics. With it, meteorologists use their supercomputers to calculate the weather for the next few days.

In principle, the forecast rests on the conservation laws of physics: in a closed system, the energy, momentum and mass of air and water remain constant over time. These laws form the basis for the equations of the weather models - mathematically speaking, partial differential equations in space and time. The equations describe not only how temperature, humidity, pressure and the horizontal and vertical wind components change in the atmosphere. The uppermost layers of the ground must also be taken into account; there it suffices to consider temperature and moisture.
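As a sketch, the simplest of these conservation laws, the conservation of mass, can be written as the continuity equation for air of density ρ moving with the wind vector **v**:

```latex
% Continuity equation: conservation of mass for air of density \rho
% moving with the three-dimensional wind vector \mathbf{v}
\frac{\partial \rho}{\partial t} + \nabla \cdot (\rho \mathbf{v}) = 0
```

The momentum and energy equations have the same overall structure: a local change in time balanced against transport and source terms.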

Discretization

It would be very practical if the solutions to the model equations could be found analytically, that is, by rearranging the formulas until something like “temperature on Thursday = temperature on Wednesday + air pressure squared times X” resulted. But that is mathematically impossible. Meteorologists therefore use a process called discretization: they “cut up” the atmosphere horizontally and vertically into manageable pieces. Using a mathematical grid, they define thousands of virtual boxes, and what the computer models actually calculate is the change over time of the average weather parameters in these boxes. The regional weather forecast model COSMO-DE of the German Meteorological Service, for example, has 50 vertical layers, whose thickness grows from 10 meters near the ground to several kilometers in the stratosphere; the horizontal grid spacing is 2.8 kilometers. Time is discretized as well: one calculation step corresponds to a time interval of 25 seconds. In 2016, a switch to the new model COSMO-D2 is planned, which will reduce the horizontal grid spacing to 2.2 kilometers and increase the number of vertical layers to 65.
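To make the idea concrete, here is a minimal sketch of discretization in one dimension: temperature values in a row of grid boxes are advanced by one time step with a simple upwind finite-difference scheme. The grid spacing and time step echo the COSMO-DE numbers above, but the scheme itself is a deliberately simplified stand-in for a real model's numerics.

```python
def advect_step(temps, wind, dx, dt):
    """One upwind finite-difference step for dT/dt = -u * dT/dx (u > 0):
    each box's change depends on the difference to its upwind neighbour."""
    new = temps[:]
    for i in range(1, len(temps)):
        new[i] = temps[i] - wind * dt / dx * (temps[i] - temps[i - 1])
    return new

# Five grid boxes 2800 m apart (COSMO-DE-like spacing), 25 s time step,
# a 10 m/s wind, and a warm anomaly in the middle box (illustrative values)
temps = [10.0, 10.0, 20.0, 10.0, 10.0]          # degrees Celsius
result = advect_step(temps, wind=10.0, dx=2800.0, dt=25.0)
```

Note that the time step cannot be chosen freely: for stability, air must not cross more than one grid box per step, which is one reason finer grids also demand shorter time steps.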

Parameterization

(Figure: Grid of the ICON weather model)

However, not all weather processes can be represented realistically in a computer model. Trouble comes from processes on the scale of the grid spacing or smaller - they fall through the mesh of the grid. A thundercloud, for example, may look huge in the sky, yet it is usually too small for the weather model. The experts therefore “parameterize” such processes: what they cannot calculate directly, they express with the help of the resolved variables. This is a science in itself and consumes a considerable share of the forecast's computing time. Besides convective clouds such as thunderclouds, stratiform clouds, turbulence and short-wave and long-wave radiation processes are also parameterized, as are the heat and moisture fluxes at the surface, which in turn require a separate model for the soil layers.
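A classic textbook example of this "express it through the resolved variables" idea is diagnosing sub-grid cloud cover from the grid box's mean relative humidity, in the style of Sundqvist-type schemes. The sketch below is illustrative only; the critical humidity `rh_crit` is a tuning parameter, not an official model value.

```python
import math

def cloud_fraction(rel_humidity, rh_crit=0.8):
    """Toy Sundqvist-style parameterization: the cloud fraction of a grid
    box is diagnosed from its mean relative humidity (0..1). Below the
    critical humidity the box is cloud-free; at saturation it is overcast."""
    if rel_humidity <= rh_crit:
        return 0.0
    if rel_humidity >= 1.0:
        return 1.0
    return 1.0 - math.sqrt((1.0 - rel_humidity) / (1.0 - rh_crit))
```

The point is not the exact formula but the pattern: a process too small for the grid (individual clouds) is replaced by a statistical relationship to quantities the model does resolve.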

Assimilation

To calculate the development of the weather, the computer models must of course start from some initial state, and for that they need measured weather data. These are recorded at the weather stations of the national meteorological services, but also by satellites, radars, buoys, weather balloons and aircraft during take-off and landing. Thousands of stations around the world take part in regular weather measurements. However, one cannot simply take the measured data and start calculating. Some data are always missing somewhere. Moreover, the measuring stations are not necessarily at the center of the virtual model boxes, but often at their edges. And for the model calculations it is essential that the weather data be physically consistent with one another. Otherwise very strange, unrealistic things happen in the computer model: spurious atmospheric waves form, and then it rains in the wrong places.

To determine an initial state for the forecast calculation that is as correct as possible, meteorologists have devised so-called assimilation methods. One starts from the computer model's last forecast and nudges the model toward the measured data. This tedious calculation requires an effort similar to that of the forecast itself. It is therefore hardly surprising that the largest non-military computers in the world are used for weather and climate simulations. The computer of the German Weather Service (DWD), for example, has a theoretical peak computing power of 560 teraflops (as of September 2015) - that is, 560 trillion floating-point operations per second. With the COSMO-DE model, the weather is calculated eight times a day, 27 hours ahead. How important precise initial conditions can be was shown by the forecast of winter storm Lothar, which caused great damage in Germany on December 26, 1999. Because incorrect data had been transmitted two days earlier from a weather balloon near Nova Scotia (Canada), the severe deepening of the storm low could not initially be foreseen. Only later simulation runs clearly showed the danger posed by Lothar.
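The "nudging the model toward the data" idea can be sketched in a few lines: blend the model's last forecast (the background) with whatever observations exist. Real assimilation schemes such as 3D-Var or ensemble Kalman filters weight this blend by the error statistics of model and instruments; the fixed weight here is purely illustrative.

```python
def assimilate(background, observations, weight=0.5):
    """Toy assimilation step: move each background value partway toward
    its observation. Where an observation is missing (None), keep the
    model's own value - a common situation with real station data."""
    analysis = []
    for bg, obs in zip(background, observations):
        if obs is None:
            analysis.append(bg)
        else:
            analysis.append(bg + weight * (obs - bg))
    return analysis

# Three grid boxes: the middle one has no measurement (illustrative values)
analysis = assimilate([15.0, 16.0, 14.0], [14.0, None, 15.0])
```

The resulting analysis, not the raw observations, is what the forecast run actually starts from, which is why gaps and inconsistencies in the measurements do not derail the model.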

Boundary conditions

In addition to the initial conditions, every weather model also needs so-called boundary conditions. These are not just the weather conditions at the edge of a limited-area model (such as COSMO-DE), but also important, slowly changing variables such as the sea-surface temperature or the sea-ice cover. While the conditions at the edge are provided by global weather forecast models, the temperature and ice cover of the sea can be measured by satellites; they enter the calculation as constants.
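One common way to feed the global model's values into a limited-area model is a relaxation zone at the lateral boundary (a Davies-type approach): near the edge, the regional field is blended toward the driving global model, with the blending weight fading to zero toward the interior. The zone width and linear weights below are illustrative, not any model's actual settings.

```python
def relax_boundary(field, driving, n_relax=3):
    """Sketch of lateral boundary relaxation: within n_relax boxes of
    either edge, blend the regional field toward the driving model's
    values; the weight is 1 at the edge and 0 in the interior."""
    out = field[:]
    n = len(field)
    for i in range(n):
        dist = min(i, n - 1 - i)        # distance to the nearest edge
        if dist < n_relax:
            w = 1.0 - dist / n_relax
            out[i] = (1.0 - w) * field[i] + w * driving[i]
    return out

# Regional field of zeros, driving model of fives (illustrative values)
edges = relax_boundary([0.0] * 10, [5.0] * 10)
```

This gradual blending avoids the spurious waves that an abrupt jump at the boundary would otherwise radiate into the model interior.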

Postprocessing

(Figure: Weather forecasts are becoming more accurate)

When the computer's numerical forecast is available, the meteorologists' work is far from over, because then the postprocessing starts: numerous weather maps and other visualizations are created automatically from the digital data. To fulfill its legal mandate of providing services of general interest, the DWD derives warnings from the forecasts, for example for technical relief organizations, fire brigades and many public authorities. Naturally, the media and private weather services also obtain data and basic forecast texts from the DWD, while farmers and foresters expect their own kinds of specific forecast information.

Forecast quality

Finally, the weather forecasts are checked every day - not only by the public, but also by the DWD staff. The improvement in the forecasts is clearly visible: today, a three-day forecast of air temperature is more reliable than a next-day forecast was 25 years ago. The hardest task, however, remains predicting rain and snow. But even here progress continues: the finer the resolution of the numerical weather models, the better they can calculate the complicated processes that trigger a summer downpour.
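Such daily checking is called forecast verification, and one of its standard scores is the root-mean-square error (RMSE) between forecast and observed values. The sketch below computes it for a handful of temperature values; the numbers are invented for illustration.

```python
def rmse(forecasts, observations):
    """Root-mean-square error, a standard forecast verification score:
    the typical size of the forecast error, in the variable's own units."""
    n = len(forecasts)
    return (sum((f - o) ** 2 for f, o in zip(forecasts, observations)) / n) ** 0.5

# Illustrative only: three forecast temperatures vs. what was observed
err = rmse([18.0, 21.0, 19.5], [17.5, 22.0, 19.0])   # degrees Celsius
```

Tracking scores like this over years is exactly how the claim above - that a three-day forecast today beats a one-day forecast from 25 years ago - can be backed up with numbers.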