ACE: A fast, skillful learned global atmospheric model for climate prediction (Papers Track) Spotlight
Oliver Watt-Meyer (Allen Institute for AI); Gideon Dresdner (Allen Institute for AI Climate Science); Jeremy McGibbon (Allen Institute for AI); Spencer K Clark (Allen Institute for Artificial Intelligence); James Duncan (University of California, Berkeley); Brian Henn (Allen Institute for AI); Matthew Peters (AI2); Noah D Brenowitz (NVIDIA); Karthik Kashinath (NVIDIA); Mike Pritchard (NVIDIA); Boris Bonev (NVIDIA); Christopher Bretherton (Allen Institute for AI)
Abstract
Existing ML-based atmospheric models are not suitable for climate prediction, which requires long-term stability and physical consistency. We present ACE (AI2 Climate Emulator), a 200M-parameter, autoregressive machine learning emulator of an existing comprehensive 100-km-resolution global atmospheric model. The formulation of ACE allows evaluation of physical laws such as the conservation of mass and moisture. The emulator is stable for 100 years, nearly conserves column moisture without explicit constraints, and faithfully reproduces the reference model's climate, outperforming a challenging baseline on over 90% of tracked variables. ACE requires nearly 100x less wall-clock time and is 100x more energy efficient than the reference model when run on typically available resources. Without fine-tuning, ACE can stably generalize to a previously unseen historical sea surface temperature dataset.
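To illustrate the kind of budget such a formulation makes checkable, consider the standard column-moisture budget (written here as an assumed illustrative form, not necessarily the paper's exact diagnostic), with total water path $\mathrm{TWP} = \int_0^{p_s} q \, \frac{dp}{g}$:

\[
\frac{\partial\, \mathrm{TWP}}{\partial t} \;=\; E \;-\; P \;-\; \nabla \cdot \int_0^{p_s} q\,\mathbf{v}\,\frac{dp}{g},
\]

where $q$ is specific humidity, $\mathbf{v}$ horizontal wind, $E$ evaporation, $P$ precipitation, $p_s$ surface pressure, and $g$ gravitational acceleration. In the global mean the flux-divergence term vanishes, so near-conservation of column moisture can be assessed by how closely the emulated global-mean tendency of TWP tracks $E - P$.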