The new AI race: the models revolutionizing weather forecasting

Forecasting the weather accurately, and as far ahead as possible, is essential not only for determining the availability of renewable energy, planning routes, or managing agriculture and infrastructure investments effectively. Lives are also at stake. Floods have affected 2.5 billion people in two decades, killing a quarter of a million of them and causing damage estimated at $936 billion, according to the Global Database on Natural Disasters (EM-DAT) at the Catholic University of Louvain (Brussels). Tech giants are competing to develop the best prediction model. Nature reports this Wednesday on the development of Aurora, the artificial intelligence (AI) model created by Microsoft in collaboration with the Universities of Pennsylvania, Cambridge, and Amsterdam, among others. The goal is greater precision and effective anticipation, in a race that also includes IBM and ESA (TerraMind) and Google (GraphCast).
Predicting the weather is one of the most complex processes there is. Hundreds of erratic factors intervene in the evolution of the atmosphere, and an alteration in any one of them can ruin a forecast. For this reason, the most reliable predictions don't extend beyond three days. But the investment in breaking this limit is worthwhile: it would save lives and prevent economic losses that a study published in Nature Communications estimates at $143 billion (€128 billion) annually.
Microsoft claims that Aurora enables more accurate and efficient high-resolution forecasting of the weather, as well as of air quality, tropical cyclone tracks, and ocean wave dynamics. The program has been trained on more than a million hours of diverse Earth system data, and the model has been tuned to "outperform" several existing operational systems in speed and accuracy, according to Microsoft. The company claims its model has yielded "better results than state-of-the-art numerical models for 92% of objectives and improved performance in extreme events."
Traditional numerical models draw on decades of data and demand significant computational power: some of these factors take days to collect and process and require supercomputers and sophisticated equipment. AI has recently reduced this expenditure of resources. In Aurora's case, training took only eight weeks, according to the authors of the research, compared with the years required by conventional systems. The researchers note that the Aurora model could also serve as the basis for analyzing other climate factors not involved in its current development.
Paris Perdikaris, an associate professor of mechanical engineering at the University of Pennsylvania and co-author of the research, says Aurora has been “a challenge,” because the goal was not only to design and develop reliable and accurate forecasting tools, but also to make them “accessible to everyone” while requiring “very few computational resources.”
“Aurora doesn't directly use physical principles, but rather relies on observations and information and learns from a very diverse set of geophysical data, including forecasts, observations, analyses, and reanalyses—essentially, a reconstruction of historical weather patterns,” Perdikaris explains.
To support the system's accuracy, the researcher points out: "For the first time, we demonstrated that an AI system can outperform all operational hurricane prediction centers. Using historical data alone, [Aurora] was able to correctly forecast all hurricanes in 2023 with greater accuracy than operational centers." Perdikaris also highlights that the model demonstrates "the potential to accurately resolve storms and extreme events at very local scales."
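Comparisons like the one Perdikaris describes are usually made in terms of track error: the great-circle distance between the forecast position of a storm's center and the position actually observed at each lead time. Below is a minimal sketch of that metric in Python, with hypothetical coordinates rather than figures from the study:

```python
import math

def track_error_km(lat_pred, lon_pred, lat_obs, lon_obs, radius_km=6371.0):
    """Great-circle (haversine) distance, in kilometres, between a forecast
    storm-center position and the observed one. Illustrative only; not the
    study's evaluation code."""
    p1, p2 = math.radians(lat_pred), math.radians(lat_obs)
    dphi = math.radians(lat_obs - lat_pred)
    dlmb = math.radians(lon_obs - lon_pred)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

# Hypothetical 72-hour forecast position vs. the observed best-track position.
print(round(track_error_km(25.3, -71.2, 25.9, -70.4), 1), "km")
```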
TerraMind, from IBM and ESA
Other giants are also in this race. IBM has published on the open repository arXiv the results of TerraMind, a model developed by the European Space Agency (ESA) and the multinational that, in a nutshell, consists of giving a brain to Earth observation satellite systems (the eyes of the Earth) and to decades of information on the behavior of the atmosphere.
“We've introduced new modalities. In addition to radar data, we're adding others, such as the differential vegetation index, which helps us understand life on the Earth's surface; elevation profiles, to understand everything in three dimensions; geocoordinates… If we have a visual satellite image, our system can generate all the other modalities,” explains Juan Bernabé-Moreno, director of IBM's research division for Ireland and the United Kingdom and head of the Accelerated Discovery Strategy for Climate and Sustainability.
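The vegetation index Bernabé-Moreno mentions is presumably the normalized difference vegetation index (NDVI), computed pixel by pixel from the red and near-infrared bands of a satellite image. A minimal sketch of that calculation, using two hypothetical reflectance patches (illustrative only, not IBM's code):

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Normalized difference vegetation index, computed pixel by pixel.
    Values near +1 indicate dense, healthy vegetation; values near 0 indicate
    bare soil or built-up areas; negative values usually mean water or clouds."""
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    denom = nir + red
    # Guard against division by zero where both bands are completely dark.
    return np.divide(nir - red, denom, out=np.zeros_like(denom), where=denom != 0)

# Hypothetical 3x3 reflectance patches from a satellite scene.
nir_band = np.array([[0.45, 0.50, 0.48], [0.30, 0.05, 0.40], [0.42, 0.44, 0.47]])
red_band = np.array([[0.10, 0.08, 0.09], [0.25, 0.04, 0.12], [0.11, 0.10, 0.09]])
print(ndvi(nir_band, red_band).round(2))
```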
The system can not only show what's happening beneath the clouds that hinder the satellites' work. It can also detect, for example, ocean pollution, fleet movements, the recovery of an area affected by a fire, invasive species in an ecosystem, or the degradation of biodiversity or soil; follow the evolution of a phenomenon; or generate predictions from simulations with historical data. "You can apply it to anything that manifests itself in the atmosphere; it adds a level of understanding of the planet that wasn't available before," Bernabé-Moreno summarizes.
TerraMind was created using open source code and with very limited memory requirements (1.5 gigabytes) so that it is accessible to anyone without sophisticated equipment. "It's very important for us that the community adopts and uses it," the scientist argues. Future versions will be supported by artificial intelligence systems that allow users, such as ranchers who want to learn about their natural resources and their potential evolution, to interact with the program through dialogue.
The Spanish company Xoople is also working in this field, which it describes as "the collection and analysis of terrestrial data to enable a systematic understanding of physical changes on the Earth's surface." It has just secured €115 million in funding thanks to the support of AXIS, the venture capital manager of the Official Credit Institute (ICO), and the CDTI, an entity of the Spanish Ministry of Science, Innovation and Universities, which has designated it as a Strategic Company. The goal is to apply AI to recognize patterns, detect changes, and provide predictive analytics on common platforms.
Google DeepMind, the artificial intelligence division of the U.S. technology giant, was the first to demonstrate, in Science, a machine learning-based weather forecasting model that provides 10-day forecasts that are “better, faster, and more accessible than existing approaches,” according to the study. The model, called GraphCast, outperformed traditional systems in 90% of the cases tested.
The reference Google used was the European Centre for Medium-Range Weather Forecasts (ECMWF), which has a supercomputer in Bologna, Italy, with around one million processors and a capacity of 30 petaflops (30,000 trillion calculations per second). This center, which applies artificial intelligence in its AIFS forecasting system and offers long-term forecasts of weather events, predicted the torrential rains that hit Central Europe last September.
GraphCast doesn't require these capabilities and uses machine learning trained on historical data to provide an accurate 10-day forecast in less than a minute. "We believe this marks a turning point in weather prediction," say the authors, led by DeepMind scientist Remi Lam.
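Models of this kind typically produce the 10-day forecast autoregressively: the network predicts the state of the atmosphere a few hours ahead and then feeds that prediction back in as its next input, step after step, which is why a complete forecast can be generated in under a minute. Below is a minimal sketch of such a rollout loop, with a stand-in for the trained network (hypothetical names; GraphCast itself works on a 0.25° grid in 6-hour steps):

```python
import numpy as np

STEP_HOURS = 6                                # time advanced per model step
FORECAST_DAYS = 10
N_STEPS = FORECAST_DAYS * 24 // STEP_HOURS    # 40 steps for a 10-day forecast

def step_model(state: np.ndarray) -> np.ndarray:
    """Stand-in for the trained network: maps the current gridded atmospheric
    state to the state STEP_HOURS later. Here it is only a placeholder
    (identity plus small noise) to show the control flow, not a real model."""
    return state + 0.01 * np.random.randn(*state.shape)

def rollout(initial_state: np.ndarray) -> list[np.ndarray]:
    """Autoregressive forecast: each prediction becomes the next input."""
    states = [initial_state]
    for _ in range(N_STEPS):
        states.append(step_model(states[-1]))
    return states

# Hypothetical initial condition: a coarse 1-degree lat-lon grid, 5 variables.
initial = np.zeros((181, 360, 5))
forecast = rollout(initial)
print(f"{len(forecast) - 1} steps of {STEP_HOURS} h = {FORECAST_DAYS}-day forecast")
```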
EL PAÍS