What is the future of weather forecasting?


The weather forecast is not only an elementary part of our everyday lives; it also brings immense economic and ecological benefits to society: farmers use it to plan cultivation and harvest, and without reliable forecasts it will be difficult for Germany "to achieve a share of renewable energies in electricity consumption of 65 percent in 2030". [1] Weather forecasts also protect us from danger. Around the world they warn of floods, fires, hurricanes and other extreme events that we face again and again, and whose effects became apparent in the summers of 2018 and 2019, when drought and heat waves led to crop failures in Europe and cost numerous lives. [2]

The weather forecast is of course not a new invention. Many are familiar with old farmers' rules such as "red sky at sunset promises fair weather, red sky at dawn threatens rain", a sometimes surprisingly accurate short-term forecast, or the less reliable long-range rule: "If January is bright and white, summer tends to be hot". Today, modern technologies form the basis for a large number of differently visualized weather forecasts and warnings. The task of the meteorologist is (still) to provide information about the future weather situation in the target area and thus to facilitate decision-making. The basis of this meteorological interpretation is usually the numerical (computer-aided) weather forecast.

Numerical weather forecasts use the equations of fluid dynamics and thermodynamics to make statements about the future state of the Earth's fluid envelope. The starting point for the forecast is a snapshot of its current initial state. The atmosphere, the hydrosphere and biosphere, the oceans and ice masses, the influence of the sun and many other factors are taken into account and statistically processed. On the basis of this assimilation of several million weather measurements from all over the world, the future state of the geosystem is calculated; in addition to precipitation, temperature, cloud formation and soil moisture, even the height of sea waves can be forecast. As is well known, however, forecasts are never absolutely reliable, and the weather forecast is no exception. One reason for this uncertainty is that we have an incomplete picture of the initial state of the geosystem: it is not possible to measure everything everywhere, and models are always only approximations of reality. Due to the chaotic nature of the atmosphere, even small errors in the initial state can change the model results. The American mathematician and meteorologist Edward Lorenz put these uncertainties in a nutshell in the early 1970s with the famous metaphor that "the flapping of a butterfly's wings in Brazil (...) can trigger a tornado in Texas". The forecasting of extreme weather events with a lead time of several weeks or months, for example, is therefore extremely uncertain and hampered by our limited knowledge of atmospheric processes on these time scales.
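Lorenz's butterfly effect can be made tangible with a few lines of code. The following sketch (a toy illustration, not an operational forecast model) integrates Lorenz's classic three-variable convection system twice, with initial states differing by one part in a million, and tracks how far the two trajectories drift apart:

```python
import math

def lorenz_step(state, dt=0.002, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz (1963) convection model."""
    x, y, z = state
    return (x + dt * sigma * (y - x),
            y + dt * (x * (rho - z) - y),
            z + dt * (x * y - beta * z))

def distance(a, b):
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(a, b)))

# Two runs whose initial states differ by one part in a million.
a = (1.0, 1.0, 1.0)
b = (1.0 + 1e-6, 1.0, 1.0)

max_sep = 0.0
for step in range(1, 15001):       # 15,000 steps = 30 model time units
    a, b = lorenz_step(a), lorenz_step(b)
    max_sep = max(max_sep, distance(a, b))
    if step == 500:                # after 1 time unit
        early_sep = distance(a, b)

print(early_sep)   # still tiny: the runs are practically identical
print(max_sep)     # eventually of the order of the attractor itself
```

After one model time unit the two runs are still indistinguishable; over a few dozen time units they diverge to the full extent of the attractor, which is exactly why small initial-state errors limit how far ahead the weather can be predicted.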

Modern weather forecasts, such as those produced by the European Centre for Medium-Range Weather Forecasts (ECMWF), therefore always simulate several future scenarios. These ensembles start from slightly different initial states of the atmosphere, evolve independently and arrive at slightly different results. They enable meteorologists to capture uncertainty in the forecast itself. This information can be used for decisions that are economically more advantageous than decisions based on a single future scenario.

"Silent revolution"

Today we can, on average, predict the weather three days in advance as precisely as next-day forecasts did 30 years ago. The key to this is, among other things, the explosive increase in satellite data, supplemented by ground-level measurements from numerous government and private observation networks. Researchers therefore speak of a "silent revolution" in weather forecasting. [3] The measurement data come from ground stations, ships, airplanes, weather balloons and radiosondes, coordinated by the World Meteorological Organization (WMO) and following global standards it has set. The WMO also plays a central role in telecommunication systems such as the Global Telecommunication System, through which measurement data are collected and distributed. Common standards ensure that all WMO members have access to all meteorological data, enabling information to be exchanged in real time.

This makes forecasts possible for every location on our planet, which is particularly important for predicting tropical cyclones, since these have enormous destructive power and regularly cause the most severe floods and storm surges. At the end of August 2017, the tropical cyclone "Irma" formed near the Cape Verde Islands. It swept over Cuba and other Caribbean islands before reaching the southern tip of Florida (see illustration). Wind speeds of over 285 kilometers per hour were measured. [4] Without satellite observations, a forecast of Irma would only have shown a very weak atmospheric disturbance moving westward across the Atlantic. Nothing would have indicated the development of a full-blown tropical cyclone, [5] because detecting the increased convective activity, in this case the rotating flow characteristic of tropical cyclones, requires humidity data from an altitude of around 3,000 meters.

Course of the tropical cyclones Jose and Irma and satellite-based precipitation measurements from September 5 to 12, 2017 (© NASA Earth Observatory image / Joshua Stevens; own caption)

The global measurements enabled the ECMWF to predict the cyclogenesis of Irma one week in advance, and the corresponding model also indicated which islands and coastal areas would be affected, something that would have been impossible just a few decades ago. However, the strength of the cyclone could not be predicted correctly. This can be explained at least in part by the fact that the grid cells into which ECMWF meteorologists divide the earth are not yet small enough to correctly capture certain properties of tropical cyclones.

The model resolution forms the basis for solving the weather equations and has a major influence on forecast quality. A distinction is made between vertical and horizontal resolution: the vertical resolution divides the atmosphere into layers, while the horizontal resolution indicates the size of the grid cells. A global forecasting model such as the ECMWF's numerical system today works with up to 128 vertical levels, compared to 16 levels in the 1980s. The highest horizontal resolution of the ECMWF's forecasts is currently around 9 kilometers; in the 1990s it was 300 kilometers. The considerable scientific and technical effort invested in higher-resolution models has, for example, significantly improved flood forecasts.
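The jump from a 300-kilometer to a 9-kilometer grid can be put into perspective with simple arithmetic: the number of horizontal grid cells grows with the square of the resolution factor. A quick back-of-the-envelope calculation, using the Earth's surface area of roughly 510 million square kilometers:

```python
EARTH_SURFACE_KM2 = 510_000_000  # approximate surface area of the Earth

def horizontal_cells(grid_km):
    """Rough number of grid cells needed to tile the globe."""
    return EARTH_SURFACE_KM2 / (grid_km ** 2)

cells_1990s = horizontal_cells(300)   # roughly 5.7 thousand cells
cells_today = horizontal_cells(9)     # roughly 6.3 million cells
print(round(cells_1990s))
print(round(cells_today))
print(round(cells_today / cells_1990s))  # a factor of about 1,100
```

And this counts only the horizontal cells: the higher vertical resolution and the shorter time steps that finer grids require multiply the computational cost further.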

Topography also illustrates how important the step to a higher resolution is. It steers and blocks the atmospheric flow and influences precipitation and local temperatures, which is central to the strength of weather phenomena such as the foehn wind. With a grid size of 300 by 300 kilometers, the highest point in the Alps would be 906 meters; at 9 by 9 kilometers, it is 3,063 meters.
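Why a coarse grid flattens mountains is easy to see in code: each grid cell carries only one elevation value, effectively an average over the whole cell. The following sketch (with made-up elevations, not real Alpine data) coarse-grains a small terrain array and shows the peak shrinking:

```python
def coarse_grain(elev, block):
    """Replace each block x block patch by its mean elevation (crude regridding)."""
    out = []
    for i in range(0, len(elev), block):
        row = []
        for j in range(0, len(elev[0]), block):
            patch = [elev[a][b]
                     for a in range(i, i + block)
                     for b in range(j, j + block)]
            row.append(sum(patch) / len(patch))
        out.append(row)
    return out

# A 6x6 toy terrain: one 4,000 m peak surrounded by 500 m foothills.
terrain = [[500] * 6 for _ in range(6)]
terrain[2][3] = 4000

fine_max = max(max(row) for row in terrain)
coarse = coarse_grain(terrain, 3)            # 3x3 fine cells per coarse cell
coarse_max = max(max(row) for row in coarse)
print(fine_max, coarse_max)   # the peak all but disappears on the coarse grid
```

Averaged over a large cell, the isolated peak contributes almost nothing, which is why a 300-kilometer model sees the Alps as gentle hills and misses the flow blocking and foehn effects that the real mountains produce.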

In addition to the model resolution, improvements in the mathematical representation of physical processes also play an important role. Increases in model resolution must go hand in hand with improvements in the model physics in order to noticeably improve forecasts of devastating events such as floods. Extreme precipitation and flash floods are responsible for around 85 percent of floods worldwide, but flash floods in particular are confined to small areas and short periods of time, which makes them particularly difficult to forecast. [6]

Flash floods are often caused by a sequence of several thunderstorms that lead to deep, moist convection: moist air rises in response to instabilities in the atmosphere. Since global models with the current grid size of 9 kilometers cannot simulate this small-scale convection, it is usually represented by a scheme that approximates the vertical mass transport in each grid cell on the basis of empirically derived parameters. Higher-resolution models, or models that resolve convection, can do without this parameterization. [7] With these "nowcasts", atmospheric convection can be mapped with sufficient accuracy and precipitation and cloud cover can accordingly be forecast better, leading to more reliable predictions of individual storms and other small-scale events than numerical weather models provide. Since the publication of divergent forecasts by warning services could confuse the public, the German Weather Service (DWD), for example, has launched a pilot project with the integrated forecast system "Sinfony", which is intended to bring the forecast models together. [8]

Machine learning (ML) offers another way to improve precipitation forecasts: archived forecasts and error statistics are compared with the actual measurements. ML, which is now also used by many commercial providers, is in principle suitable for every weather variable, for example temperature, wind speed or cloud cover. Most methods use forecasts from several centers as input, since the differences between them better span the range of possible outcomes. A given forecast model may be superior to others on average, but the daily quality of the models varies and also depends on the target area. The ensemble method of the United States National Oceanic and Atmospheric Administration, for example, differs from that of the ECMWF. ML methods can exploit these differences and individual systematic errors by analyzing historical data sets and giving higher weight to forecasts that perform better than others on average.
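A deliberately simple stand-in for the learned weighting described above is inverse-error weighting: each center's forecast is weighted by the inverse of its historical mean squared error, so that the model with the best track record gets the largest say. All figures below are made up for illustration:

```python
def inverse_mse_weights(errors):
    """Weight each model by the inverse of its historical mean squared error."""
    inv = [1.0 / e for e in errors]
    total = sum(inv)
    return [x / total for x in inv]

def blend(forecasts, weights):
    """Weighted combination of several models' forecasts."""
    return sum(f * w for f, w in zip(forecasts, weights))

# Hypothetical historical errors of three forecast centers for one target area.
hist_mse = [1.0, 2.0, 4.0]
w = inverse_mse_weights(hist_mse)
print([round(x, 3) for x in w])  # the historically best model weighs most

# Blending today's 2 m temperature forecasts (in degrees C) from the three models.
print(round(blend([14.0, 15.0, 17.0], w), 2))
```

Real ML post-processing goes further, learning weights that depend on season, region and weather regime, but the principle is the same: systematic differences between models become usable information.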

Many weather apps use ML for their forecasts, which are significantly better than raw model output. As the European flood warning system showed in October 2007 for the floods in Romania, [9] these methods are also suitable for flood forecasting. The Point Rainfall Project of the ECMWF and the Model Output Statistics method of the DWD are two further ML-based approaches which, among other things, statistically combine different forecast models, ground and radar measurements and lightning strikes in order to predict flash floods. [10] ML would also be very useful for forecasting quantities such as wind speed, wave height, water quality and effects on human health.

Future prospects

The quality of numerical weather forecasts will continue to improve. The ECMWF, for example, has set itself the goal of predicting weather events such as extreme precipitation two weeks in advance (instead of the current one week), and large-scale weather phenomena or changes in the general weather situation, which are particularly important with a view to heat waves and droughts, four weeks in advance (instead of the current two weeks). At the global level, large-scale anomalies such as La Niña or El Niño, which influence the weather around the world every three to seven years through periodic changes in the sea surface temperature of the equatorial Pacific and are regularly accompanied by natural disasters, should be forecast up to one year in advance (instead of the current six to nine months). However, success depends on progress in several areas.

Modeling the earth
Older numerical weather forecasting systems focused largely on modeling the atmosphere. With this concentration on the atmosphere, the effects of aerosols, among other things, were neglected. These solid or liquid particles can, depending on their chemical composition, reflect and absorb solar radiation, which in turn can change the distribution of heat in the atmosphere or influence photosynthesis. In addition, oceans and land areas were treated as almost constant quantities. The geosystem, however, is more complex: its components (oceans, land surfaces, sea ice, atmosphere, biosphere and humans) interact in intricate ways. It has already been shown that directly coupling the oceans to atmospheric models can improve predictions on medium-range and seasonal time scales. [11] Further improvement of numerical prediction requires mapping the interactions of all components of the geosystem in sufficient detail, which in turn demands greater efforts to represent the underlying geophysics. In particular, a number of additional natural factors must be included. For example, the calving of icebergs, floodplain forests and the development of the coastal shelf, but also the effects of urban traffic and the built environment on the atmosphere, should be integrated more closely into the modeling of the earth. That requires billions of additional measurements for data assimilation.

Computer science and scalability
Further improvement of the weather forecast is closely linked to increases in computing power: every noticeable gain depends to a large extent on the availability of suitable high-performance computers and data processing systems. Since advances in computer technology appear to be slowing while forecasting systems continue to become more complex, future improvements are unlikely to be achieved in the current way. In particular, energy consumption could become a problem, as it will soon be impossible with current computer architectures to produce forecasts at reasonable cost. This makes new concepts for numerical weather forecasting necessary. Further gains are expected less from increases in individual processor performance than from the massive use of specialized processors. This paradigm shift requires a complete rethink in numerical weather forecasting.

Some weather forecast models and other numerical systems have already taken this step and increasingly base their computing power on graphics processors. Future exascale computer systems will depend even more on the interplay of different disciplines. Cooperation between weather model developers, computer scientists and hardware providers is therefore necessary so that numerical weather forecasting can benefit from further advances in computer technology. [12] Quantum computing, though still young, is one of many exciting opportunities that may open up in the future.

Machine Learning and Artificial Intelligence (AI)
ML and AI have proven more efficient than humans in many areas. Their ability to extract knowledge from complex, extensive databases (the current ECMWF archive holds 466 petabytes, corresponding to around 60 million films) makes them particularly suitable for numerical weather forecasting. Such methods can also play a role in the preparation of observations, the optimization of work processes and the modeling of complex physical processes, and can thus accelerate the calculations. They are already being used to model the human body's reaction to atmospheric conditions [13] and to estimate soil moisture in data assimilation. [14] These methods and their practical application are still in their infancy, but they are already showing great potential. At the same time, the spectacular rise of AI calls the role of the meteorologist into question and could make adjustments necessary, since many tasks that are carried out by hand today will be performed automatically in the future.

Internet of Things
Numerical weather forecasting is an extremely data-hungry discipline. As a rule of thumb: the more data, the better the forecast, because the measurement data determine the initial conditions of the geomodel. Typically, measurements come from a limited number of sources, and relatively little data is available, especially from on-site observations. That should change soon. Modern smartphones can measure air pressure and temperature, more and more weather enthusiasts are building their own inexpensive weather stations, modern cars can sense environmental conditions such as road icing, and the amount of precipitation can be derived from the speed of windshield wipers. [15]

These data are gradually being incorporated into the weather forecast. The Norwegian weather service, for example, recently began using measurements from a public observation network and was thus able to significantly improve its forecasts of near-ground temperatures. [16] Environmental data that can increasingly be made available via the Internet of Things form the basis for the next "silent revolution" in weather forecasting. It will make our everyday lives even safer, whether the question is for or against an umbrella, the use of wind energy or the prevention of flood disasters.