Exploring Randomly Wired Neural Networks for Climate Model Emulation (Papers Track)
William J Yik (Harvey Mudd College); Sam J Silva (The University of Southern California); Andrew Geiss (Pacific Northwest National Laboratory); Duncan Watson-Parris (University of Oxford)
Abstract
Exploring the climate impacts of various anthropogenic emissions scenarios is key to making informed decisions for climate change mitigation and adaptation. State-of-the-art Earth system models can provide detailed insight into these impacts, but carry a large computational cost on a per-scenario basis. This computational burden has driven recent interest in developing cheap machine learning models for the task of climate model emulation. In this manuscript, we explore the efficacy of randomly wired neural networks for this task. We describe how they can be constructed and compare them to their standard feedforward counterparts using the ClimateBench dataset. Specifically, we replace the dense layers in multilayer perceptrons, convolutional neural networks, and convolutional long short-term memory networks with randomly wired ones and assess the impact on model performance for models with 1 million and 10 million parameters. We find average performance improvements of 4.2% across model complexities and prediction tasks, with improvements of up to 16.4% in some cases. Furthermore, we find no significant difference in prediction speed between networks with standard feedforward dense layers and those with randomly wired layers. These findings indicate that randomly wired neural networks may be suitable direct replacements for traditional dense layers in many standard models.
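To make the "drop-in replacement of a dense layer" idea concrete, the sketch below builds a small randomly wired block with the Keras functional API: a random graph is generated, oriented into a DAG, and each node becomes a small Dense layer whose inputs are the outputs of its predecessors. This is an illustrative sketch only; the graph generator (Watts-Strogatz), node count, per-node width, and aggregation choices here are assumptions for demonstration, not the exact construction used in the paper.

```python
import networkx as nx
import tensorflow as tf
from tensorflow.keras import layers


def random_wiring(num_nodes=8, k=4, p=0.5, seed=0):
    """Generate a small random graph and orient edges from lower to higher
    node index so the result is a DAG. (Illustrative choice of generator.)"""
    g = nx.connected_watts_strogatz_graph(num_nodes, k, p, seed=seed)
    dag = nx.DiGraph()
    dag.add_nodes_from(g.nodes)
    dag.add_edges_from((min(u, v), max(u, v)) for u, v in g.edges)
    return dag


def randomly_wired_block(x, dag, units=32):
    """Replace a single Dense layer with a block of small Dense nodes wired
    according to `dag`. Hypothetical sketch, not the authors' exact code."""
    outputs = {}
    for node in nx.topological_sort(dag):
        preds = list(dag.predecessors(node))
        if preds:
            # Internal nodes sum the outputs of their predecessors.
            inp = layers.Add()([outputs[q] for q in preds]) if len(preds) > 1 else outputs[preds[0]]
        else:
            # Source nodes read directly from the block input.
            inp = x
        outputs[node] = layers.Dense(units, activation="relu")(inp)
    # Aggregate the outputs of all sink nodes (nodes with no successors).
    sinks = [outputs[n] for n in dag.nodes if dag.out_degree(n) == 0]
    return layers.Add()(sinks) if len(sinks) > 1 else sinks[0]


# Minimal usage: an MLP whose hidden dense layer is replaced by a randomly wired block.
inputs = tf.keras.Input(shape=(12,))
hidden = randomly_wired_block(inputs, random_wiring(), units=32)
output = layers.Dense(1)(hidden)
model = tf.keras.Model(inputs, output)
```

Because the block consumes and produces ordinary Keras tensors, the same pattern could, in principle, stand in for the dense layers of the MLP, CNN, and ConvLSTM emulators described above without changing the rest of the architecture.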