Extending Two Explainable Artificial Intelligence Methods for Deep Climate Emulators (Papers Track)
Wei Xu (Brookhaven National Laboratory); Rui Qiu (The Ohio State University); Xihaier Luo (Brookhaven National Laboratory); Yihui Ren (Brookhaven National Laboratory); Balu Nadiga (Los Alamos National Laboratory); Luke Van Roekel (Los Alamos National Laboratory); Han-Wei Shen (The Ohio State University); Shinjae Yoo (Brookhaven National Laboratory)
Abstract
Climate change presents a complex and critical challenge, spanning tasks from forecasting across various temporal horizons to reconstructing physical fields from sparsely distributed sensors. Recent advances in deep learning have shown promising results in emulating complex climate dynamics and reconstructing physical fields from real-time measurements. Given the data-driven nature of these deep emulators, it is crucial to investigate how they learn and represent the underlying physics. This paper addresses these concerns by employing Explainable Artificial Intelligence (XAI) techniques, focusing on two methods, feature attribution and influence functions, and demonstrating how they can explain a cutting-edge implicit neural network that learns a continuous and reliable representation from sparsely sampled climate data.
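As a rough illustration of the first method, the sketch below computes a simple gradient-based feature attribution (saliency) for a toy coordinate-based implicit network in PyTorch. The `ImplicitEmulator` architecture, the input coordinates, and the choice of plain input gradients as the attribution are all assumptions for illustration; the paper's actual emulator and attribution variant are not specified here.

```python
import torch
import torch.nn as nn

# Hypothetical coordinate-based implicit network: maps (x, y, t)
# coordinates to a scalar climate field value (e.g., temperature).
# This toy architecture stands in for the paper's emulator.
class ImplicitEmulator(nn.Module):
    def __init__(self, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(3, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
            nn.Linear(hidden, 1),
        )

    def forward(self, coords: torch.Tensor) -> torch.Tensor:
        return self.net(coords)

model = ImplicitEmulator()

# Gradient-based feature attribution: the gradient of the predicted
# field value with respect to each input coordinate measures how
# sensitive the prediction is to that coordinate.
coords = torch.tensor([[0.5, -0.2, 0.1]], requires_grad=True)
pred = model(coords)
pred.backward()
attribution = coords.grad  # one saliency score per input dimension
print(attribution)
```

Influence functions, the second method, instead trace a prediction back to individual training samples via Hessian-vector products over the training loss, and are typically approximated rather than computed exactly for deep networks.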