The secret to better weather forecasts may be a dash of AI

Climate and weather modeling has long been a staple of high-performance computing, but as meteorologists look to improve the speed and resolution of forecasts, machine learning is increasingly finding its way into the mix.

In a paper published in the journal Nature this week, a team from Google and the European Centre for Medium-Range Weather Forecasts (ECMWF) detailed a novel approach that uses machine learning to overcome limitations in existing climate models, with the aim of generating forecasts faster and more accurately than current methods.

Dubbed NeuralGCM, the model was developed using historical weather data gathered by ECMWF, and uses neural networks to augment more traditional HPC-style physics simulations.

As Stephan Hoyer, one of the crew behind NeuralGCM, wrote in a recent report, most climate models today make predictions by dividing the globe up into cubes 50-100 kilometers on a side and then simulating how air and moisture move within them based on known laws of physics.
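
To picture what that physics core is doing, here's a deliberately tiny sketch in JAX (the framework NeuralGCM itself is built on - more on that below): it advects a blob of moisture around a coarse periodic grid with a simple upwind step. The grid size, wind field, and time step are made-up numbers for illustration, not anything from ECMWF's or Google's actual code.

```python
# Illustrative only: a toy advection step on a coarse global grid,
# not ECMWF's or NeuralGCM's actual dynamics.
import jax.numpy as jnp

NLAT, NLON = 90, 180          # ~2-degree cells; real models use 50-100 km cubes
DT, DX = 600.0, 200_000.0     # 10-minute step, ~200 km cell width (rough numbers)

def advect(moisture, u_wind, v_wind):
    """One explicit upwind step: move moisture along with the wind field."""
    # Gradients on a periodic grid (jnp.roll wraps around the globe).
    dq_dx = (moisture - jnp.roll(moisture, 1, axis=1)) / DX
    dq_dy = (moisture - jnp.roll(moisture, 1, axis=0)) / DX
    return moisture - DT * (u_wind * dq_dx + v_wind * dq_dy)

# A blob of moisture pushed eastward by a uniform 10 m/s wind.
q = jnp.zeros((NLAT, NLON)).at[40:50, 80:90].set(1.0)
u = jnp.full((NLAT, NLON), 10.0)
v = jnp.zeros((NLAT, NLON))
for _ in range(144):          # simulate one day in 10-minute steps
    q = advect(q, u, v)
```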

NeuralGCM works in a similar fashion, but the added machine learning is used to track climate processes that aren't necessarily as well understood or which take place at smaller scales.

"Many important climate processes, including clouds and precision, vary over much smaller scales (millimeters to kilometers) than the cube dimensions used in current models and therefore cannot be calculated based on physics," Hoyer wrote.

Traditionally, these smaller-scale phenomena have been tracked using a series of simpler secondary models, called parameterizations, Hoyer explained. Adding to the problem, he noted, is that "these simplified approximations inherently limit the accuracy of physics-based climate models."

In other words, those parameterizations aren't always the most reliable and can degrade the overall accuracy of the model.

NeuralGCM works by swapping these parameterizations for a neural network. Three versions of the model, at 0.7, 1.4, and 2.8 degrees of resolution, were trained on weather data gathered by ECMWF between 1979 and 2019.
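
The swap can be sketched schematically: a coarse physics step supplies the resolved tendencies, and a small neural network adds a learned correction in place of the hand-tuned parameterizations. The tiny MLP, the toy physics_tendency function, and the eight-variable column state below are placeholders for illustration, not NeuralGCM's actual architecture.

```python
# Schematic of a hybrid step: resolved physics plus a learned sub-grid
# correction. The tiny MLP and `physics_tendency` are placeholders,
# not NeuralGCM's real components.
import jax
import jax.numpy as jnp

def physics_tendency(state):
    """Stand-in for the resolved dynamics (advection, pressure forces, ...)."""
    return -0.01 * state                      # toy damping, purely illustrative

def init_mlp(key, n_in, n_hidden, n_out):
    k1, k2 = jax.random.split(key)
    return {
        "w1": jax.random.normal(k1, (n_in, n_hidden)) * 0.01, "b1": jnp.zeros(n_hidden),
        "w2": jax.random.normal(k2, (n_hidden, n_out)) * 0.01, "b2": jnp.zeros(n_out),
    }

def learned_tendency(params, state):
    """Neural network standing in for cloud/precipitation parameterizations."""
    h = jnp.tanh(state @ params["w1"] + params["b1"])
    return h @ params["w2"] + params["b2"]

def hybrid_step(params, state, dt=600.0):
    """Advance the column state: known physics plus learned sub-grid correction."""
    return state + dt * (physics_tendency(state) + learned_tendency(params, state))

params = init_mlp(jax.random.PRNGKey(0), n_in=8, n_hidden=32, n_out=8)
state = jnp.ones(8)                           # e.g. temperature/humidity per layer
state = hybrid_step(params, state)
```

In practice the network's parameters would be fitted against decades of reanalysis data, such as the 1979-2019 ECMWF record mentioned above; the ones here are untrained.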

According to the paper, the results are promising. Using Google's WeatherBench2 framework, the team say NeuralGCM achieved parity with existing state-of-the-art forecast models out to five days at 0.7 degrees of resolution, while at 1.4 degrees its forecasts at five to 15 days were even more accurate.
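
Claims like "parity out to five days" come from scoring forecasts against the verifying analysis at increasing lead times. The snippet below computes a plain root-mean-square error per lead time for two hypothetical models on made-up data; it's a generic illustration of that kind of scoring, not the actual WeatherBench2 API.

```python
# Generic lead-time scoring, not the WeatherBench2 API: compare two models'
# forecasts against the verifying analysis at each lead time.
import jax
import jax.numpy as jnp

def rmse_by_lead_time(forecast, truth):
    """forecast, truth: arrays of shape (lead_time, lat, lon)."""
    return jnp.sqrt(jnp.mean((forecast - truth) ** 2, axis=(1, 2)))

# Made-up example data: 15 daily lead times on a small grid.
truth = jax.random.normal(jax.random.PRNGKey(0), (15, 32, 64))
model_a = truth + 0.10 * jax.random.normal(jax.random.PRNGKey(1), (15, 32, 64))
model_b = truth + 0.12 * jax.random.normal(jax.random.PRNGKey(2), (15, 32, 64))

print(rmse_by_lead_time(model_a, truth))   # lower is better, per lead time
print(rmse_by_lead_time(model_b, truth))
```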

Meanwhile, at 2.8 degrees the team found the model was able to predict the average global temperature between 1980 and 2020 with an average error rate one third that of existing atmospheric-only models.

NeuralGCM also proved quite competitive against more targeted models like X-SHiELD, which Hoyer explained offers much higher resolution at the cost of being more computationally intensive.

Against X-SHiELD, the boffins found NeuralGCM's 1.4 degree model was able to predict humidity and temperatures from 2020 with 15-20 percent fewer errors. During the same test, it forecast tropical cyclone patterns that matched the number and intensity of those observed that year.

Accelerating forecasting

The team didn't just replace these parameterizations with neural networks. The entirety of NeuralGCM was written in JAX, Google's machine learning framework for transforming numerical functions written in Python.

According to Hoyer, moving to JAX had a number of benefits, including greater numerical stability during training and the ability to run the model on TPUs or GPUs. By contrast, weather models have traditionally run on CPUs, though GPUs are being used more and more - we'll touch on that a bit later.
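
That hardware flexibility falls out of how JAX works: the same function is compiled once with XLA and dispatched to whichever backend is present. A minimal illustration (the device list printed will depend on your machine; the step function is a stand-in):

```python
# JAX dispatches the same compiled function to CPU, GPU, or TPU,
# whichever backend is available on the machine.
import jax
import jax.numpy as jnp

print(jax.devices())             # e.g. a CPU device, CUDA GPUs, or TPU cores

@jax.jit                         # compile once with XLA, then reuse
def step(state):
    return state - 0.01 * state  # stand-in for a model time step

state = jnp.ones((256, 512))
state = step(state)              # runs on the default accelerator
```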

Because NeuralGCM runs natively on accelerators, Google claims its system is orders of magnitude faster and cheaper to run.

"Our 1.4 degree model is more than 3,500-times faster than X-SHiELD, meaning if researchers simulated the atmosphere for a year with X-SHiELD, it would take them 20 days compared to just eight minutes with NeuralGCM," Hoyer wrote.

What's more, Hoyer claims that simulation can be done on a single TPU as opposed to the 13,000 CPUs necessary to run X-SHiELD, and you could even run NeuralGCM on a laptop if you want.

While promising, it's important to note that NeuralGCM is only a starting point, with Hoyer freely admitting that it isn't a full climate model. However, that does appear to be the long-term goal.

"We hope to eventually include other aspects of Earth's climate system, such as oceans and the carbon cycle, into the model. By doing so, we'll allow NeuralGCM to make predictions on longer timescales, going beyond predicting weather over days and weeks to making forecasts on climate timescales," Hoyer wrote.

To support these efforts, the model's source code and weights have been released to the public on GitHub under a non-commercial license. Amateur meteorologists can have some fun with it.

ML gains momentum in climate modeling

This isn't the first time we've seen machine learning used in climate modeling. Nvidia's Earth-2 climate model is another example of how AI and HPC can be combined to not only improve the accuracy of forecasts, but accelerate them as well.

Announced at GTC this spring, Earth-2 is essentially a massive digital twin designed to use a combination of HPC and AI models to generate high-res simulations down to two kilometers of resolution.

This is possible in part thanks to a model called CorrDiff, a diffusion model which Nvidia says is able to generate images of weather patterns at 12.5x the resolution, and as much as 1,000x faster than other numerical models. The result is a model that's fast and accurate enough to pique the interest of Taiwan, which is eyeing the platform to improve its typhoon forecasts.

Meanwhile, more climate research centers have begun adopting GPU-accelerated systems. Climate research is one of several areas of study targeted for the 200 petaFLOPS (FP64) Isambard-AI system being deployed at the University of Bristol.

Earlier this year, the Euro-Mediterranean Center on Climate Change in Lecce, Italy, tapped Lenovo for its new Cassandra super, which will be powered by Intel's Xeon Max CPUs and a small complement of Nvidia H100 GPUs that the lab aims to put to use running a variety of AI-based climate sims. ®

 

https://www.theregister.com//2024/07/27/google_ai_weather/
