Lightweight, Pre-trained Transformers for Remote Sensing Timeseries (Papers Track) Spotlight

Gabriel Tseng (NASA Harvest); Ruben Cartuyvels (KULeuven); Ivan Zvonkov (University of Maryland); Mirali Purohit (Arizona State University (ASU)); David Rolnick (McGill University, Mila); Hannah R Kerner (Arizona State University)

Topics: Earth Observation & Monitoring · Unsupervised & Semi-Supervised Learning

Abstract

Machine learning models for parsing remote sensing data have a wide range of societally relevant applications, but the labels used to train these models can be difficult or impossible to acquire. This challenge has spurred research into self-supervised learning for remote sensing data. Current self-supervised approaches draw significant inspiration from techniques developed for natural images. However, remote sensing data differs from natural images in important ways: for example, the temporal dimension is critical for many tasks, and data is collected from many complementary sensors. We show that by designing architectures and self-supervised training techniques specifically for remote sensing data, we can create significantly smaller yet performant models. We introduce the Pretrained Remote Sensing Transformer (Presto), a transformer-based model pre-trained on remote sensing pixel-timeseries data. Presto excels at a wide variety of globally distributed remote sensing tasks and performs competitively with much larger models while requiring far less compute. Presto can be used for transfer learning or as a feature extractor for simple models, enabling efficient deployment at scale.
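
As a rough illustration of the "feature extractor for simple models" workflow the abstract describes, the sketch below embeds pixel-timeseries with a frozen pre-trained encoder and fits a scikit-learn classifier on the embeddings. The `presto` import and the `load_pretrained`/`encode` names are hypothetical placeholders for illustration, not the released library's confirmed API; array shapes are likewise illustrative.

```python
# Sketch: Presto as a frozen feature extractor for a simple downstream model.
# NOTE: the `presto` import and the `load_pretrained`/`encode` names are
# hypothetical placeholders for illustration, not a confirmed API.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

from presto import Presto  # hypothetical import

# Pixel-timeseries input: (num_pixels, num_timesteps, num_bands),
# e.g. 12 monthly composites of multi-sensor bands per pixel.
x = np.random.rand(1000, 12, 18).astype(np.float32)
y = np.random.randint(0, 2, size=1000)  # e.g. crop / non-crop labels

encoder = Presto.load_pretrained().encoder  # hypothetical loader
embeddings = encoder.encode(x)              # -> (num_pixels, embedding_dim)

# Train a lightweight classifier on the frozen embeddings.
x_tr, x_te, y_tr, y_te = train_test_split(embeddings, y, test_size=0.2)
clf = RandomForestClassifier(n_estimators=100).fit(x_tr, y_tr)
print("held-out accuracy:", clf.score(x_te, y_te))
```

Because the transformer stays frozen, only the small downstream classifier is trained per task, which is what makes deployment at scale efficient.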
