As the world grapples with rising natural catastrophes, a combination of old forecasting methods and new prediction tools is helping experts prepare better and faster for events including floods, earthquakes, and hurricanes. Weather prediction tools, which warn governments and industry specialists about the likelihood of incoming disasters, have relied for years on collating and analysing large data sets through supercomputers, a task that new tools based on Generative AI are learning to complete within minutes and with enhanced accuracy.
Last year saw a sharp increase in extreme weather events, which experts have ranked among the main global risks for the fourth year in a row. The increase resulted from global temperatures about 1.55 °C above pre-industrial levels, which made 2024 the hottest year on record, according to the World Meteorological Organisation (WMO).
Global disasters, which threaten societies, economies, and ecosystems, are becoming “more frequent and more intense,” according to the National Aeronautics and Space Administration (NASA), which lists eight types of them: heavy precipitation, floods, high-tide flooding, extreme heat, marine heat waves, wildfires, droughts, and tropical cyclones.
The damages are piling up.
The International Chamber of Commerce (ICC) says that over the last 10-year period, economic losses from extreme weather amounted to $2 trillion, adjusted to 2023 prices. In 2022 and 2023 alone, worldwide economic damages reached $451 billion, a 19% rise over the yearly average of the decade’s first eight years.
Reinsurance research firm AON says that the damage from global natural disasters in 2024 reached “at least” $368 billion, of which 60% was not covered by insurance.
In the US, where seasonal hurricanes can cause extensive localised damage leading to billions of dollars in claims from a single weather event, the National Centers for Environmental Information, citing its Billion-Dollar Weather and Climate Disasters dataset, registered 27 confirmed climate disasters in 2024, each with losses exceeding $1 billion.
To save lives and money, industry experts have called for better preparation.
In the United Kingdom, after the catastrophic floods of summer 2007 caused insured losses then reported at about £3.2 billion, Sir Michael Pitt’s Review recommended closer collaboration between the Met Office, the national meteorological service, and the Environment Agency, the body responsible for issuing flood warnings. This led to the establishment of the Flood Forecasting Centre (FFC), a successful partnership between the Environment Agency, the Met Office, and Natural Resources Wales, a Welsh Government Sponsored Body.
Hydrologists and meteorologists work closely together, using a risk-matrix approach to estimate rainfall amounts and their potential consequences. They then refine those estimates by comparing them against satellite images and weather observations.
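The risk-matrix idea can be sketched in a few lines of code: likelihood and potential impact are each graded on a small scale, and their combination maps to an overall risk level. This is a minimal illustration only; the category names, scales, and thresholds below are assumptions, not the FFC's actual definitions.

```python
# Illustrative risk matrix in the spirit of the FFC's approach.
# Scales and thresholds are invented for this sketch.

def flood_risk(likelihood: int, impact: int) -> str:
    """Map likelihood (0-3) and potential impact (0-3) to an overall risk level."""
    score = likelihood * impact  # combined score, 0..9
    if score >= 6:
        return "high"
    if score >= 3:
        return "medium"
    if score >= 1:
        return "low"
    return "very low"

# Heavy rainfall judged likely (3) with moderate potential impact (2):
print(flood_risk(likelihood=3, impact=2))  # high
```

In practice the matrix is assessed by forecasters rather than computed mechanically, but the structure, likelihood crossed with impact, is the same.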
But the sheer quantity of data that must be analysed to achieve those results has made forecasting one of the most challenging problems in physics and mathematics, only partly tamed in recent decades by significant improvements, mainly through the use of supercomputers.
Until recently, two models have dominated the market. In the United States, the Global Forecast System (GFS) of the National Centers for Environmental Prediction (NCEP) is a forecast model that experts generally regard as one of the most reliable. It uses sets of data gathered by the NCEP from five main sources.
Information is gathered daily from radiosonde stations through upper-air atmospheric soundings: helium- or hydrogen-filled latex balloons connected to a radiosonde by 80 feet of string. The balloons, which reach altitudes of 30,000 metres and drift as far as 320 kilometres from their starting point, track temperature, air pressure, and humidity. Airborne data also include measurements gathered by planes in the Aircraft Meteorological Data Relay system and sent to the WMO.
Simultaneously, the National Weather Service (NWS) collects readings from the Next Generation Weather Radar (NEXRAD) network, 160 high-resolution S-band Doppler weather radars it operates jointly with the Federal Aviation Administration (FAA) and the U.S. Air Force. Each radar emits bursts of energy that help detect precipitation and wind in the sky.
On the ground, the Automated Surface Observing Systems (ASOS), a joint effort by the NWS, the FAA, and the Department of Defense (DOD), record changes and flag when certain weather thresholds are crossed. Satellite images and daily precipitation readings recorded by volunteers around the US complete the NCEP’s dataset.
In Europe, similar tasks are performed by the European Centre for Medium-Range Weather Forecasts (ECMWF), an independent intergovernmental organisation and one of the world’s leaders in weather prediction. From Italy, it operates one of the largest supercomputer complexes in Europe alongside the world’s largest archive of numerical weather prediction data. The centre’s Integrated Forecasting System (IFS), with techniques similar to the GFS’s, combines weather observations and recent forecasts to produce a “digital twin of the Earth.”
In 2023, the ECMWF complemented its IFS model with a new machine-learning-based Artificial Intelligence Forecasting System (AIFS).
The model aims to simplify, or eliminate entirely, the step of data assimilation, a complex process that combines weather observations with a previous short-term forecast to produce an initial state of the Earth. Forecasts would be based on “observations alone,” says Florence Rabier, Director-General of the ECMWF, avoiding the issues associated with data assimilation: potential errors in both the observations and the previous forecast, and the need for dense observation coverage that is often expensive or impossible to maintain. The tests, which are promising so far, use only physical quantities measured directly by meteorological observing systems.
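The core of data assimilation, the step AIFS hopes to bypass, is blending a prior forecast with a new observation, weighting each by how much it can be trusted. A toy scalar version of that blend (the "optimal interpolation" or Kalman-style update) looks like this; the temperature values and variances are invented for illustration.

```python
# Toy illustration of a single data-assimilation update:
# blend a prior short-term forecast with a new observation,
# weighting each by the inverse of its error variance.

def assimilate(forecast, obs, var_forecast, var_obs):
    """Return the analysis value and its variance after blending."""
    gain = var_forecast / (var_forecast + var_obs)  # Kalman-style gain in [0, 1]
    analysis = forecast + gain * (obs - forecast)   # pulled toward the observation
    var_analysis = (1 - gain) * var_forecast        # uncertainty shrinks after the update
    return analysis, var_analysis

# Prior forecast: 15.0 C with variance 4. A thermometer reads 17.0 C with variance 1.
analysis, var = assimilate(15.0, 17.0, 4.0, 1.0)
print(analysis, var)  # 16.6 0.8
```

Real assimilation systems do this over millions of coupled variables at once, which is exactly why the errors and costs Rabier describes accumulate.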
The centre runs two such models, AIFS with CRPS-based training and AIFS with diffusion-based training, both built on an ‘encoder–processor–decoder’ architecture. Both are trained on 40 years of data from Copernicus, Europe’s Earth observation programme.
America’s GFS and Europe’s AIFS models have sometimes yielded different results because of the data and equations they use.
The development of new machine learning models, like Google DeepMind's AI-driven weather prediction tool GenCast, may change that.
A revolutionary AI-based probabilistic ensemble forecasting system, GenCast, designed by Google DeepMind in the UK, promises to predict possible weather events and their likelihood in just eight minutes, as opposed to several hours.
In tests, it performed 20% better than the world-leading system. Instead of providing a single best weather forecast like its predecessors, this ensemble model, which builds on Google DeepMind’s earlier AI weather models, uses machine learning to produce more than 50 individual forecasts.
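Turning those 50-plus forecasts into something actionable is straightforward: the fraction of ensemble members in which an event occurs serves as its probability. The wind-gust numbers below are invented; a real GenCast member is a full gridded weather state, not a single value.

```python
# From an ensemble of forecasts to an event probability:
# the fraction of members in which the event occurs.

def event_probability(members, threshold):
    """Fraction of ensemble members exceeding the threshold."""
    return sum(1 for m in members if m > threshold) / len(members)

# 50 hypothetical wind-gust forecasts (km/h) at one location.
gusts = [62, 71, 80, 55, 90] * 10
print(event_probability(gusts, threshold=75))  # 0.4
```

A 40% chance of damaging gusts is far more useful to an emergency planner than one deterministic forecast that either shows the gust or does not.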
While GenCast may miss rare events absent from its training data, ECMWF records ranging from 1979 to 2018, it is a game changer in terms of predicting extreme weather events.
AI will no doubt dramatically improve weather prediction accuracy, enabling better planning for extreme weather events, saving millions of lives and, consequently, reducing financial and personal losses.
But because AI itself requires energy-intensive computing power, some scientists doubt the planet has the resources to keep using this technology for long.