Improving Contrail Detection via Diffusion-Based Data Augmentation Framework (Papers Track)

Yejun Lee (UNIST); Jaejun Yoo (UNIST)

Generative Modeling Computer Vision & Remote Sensing

Abstract

Contrails, the ice clouds formed from aircraft exhaust, contribute significantly to Earth’s radiative forcing and have been the subject of extensive research aimed at mitigation. With the advancement of deep learning, efforts to detect contrails continuously have intensified. However, developing high-performance contrail detection models remains challenging due to data scarcity and severe class imbalance. In this paper, we propose a diffusion model-based data augmentation framework to tackle these challenges. Our framework consists of two stages: 1) Contrail Mask Generation and 2) Mask-conditioned Contrail Scene Generation. The proposed framework can generate diverse contrail shapes not present in the original dataset and synthesize contrail scenes with various backgrounds conditioned on these masks. Through extensive experiments, we demonstrate that the framework effectively generates diverse and novel scenes, and that the generated data significantly improves performance on downstream detection tasks. To the best of our knowledge, this is the first study to apply diffusion model-based data augmentation to contrail detection.
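The two-stage pipeline can be pictured with a short sketch. The snippet below is a minimal, hypothetical illustration using Hugging Face diffusers: an unconditional diffusion model samples a new contrail mask, and a second diffusion model synthesizes a matching scene with the mask concatenated as an extra input channel. The model sizes, channel layout, and conditioning-by-concatenation choice are assumptions made for illustration; the abstract does not describe the paper's exact architectures or conditioning mechanism.

```python
# Minimal sketch of the two-stage augmentation pipeline (assumed details).
import torch
from diffusers import UNet2DModel, DDPMScheduler

device = "cuda" if torch.cuda.is_available() else "cpu"

# Stage 1: unconditional diffusion model that generates binary contrail masks.
mask_model = UNet2DModel(
    sample_size=64, in_channels=1, out_channels=1,
    block_out_channels=(64, 128, 256),
    down_block_types=("DownBlock2D", "DownBlock2D", "AttnDownBlock2D"),
    up_block_types=("AttnUpBlock2D", "UpBlock2D", "UpBlock2D"),
).to(device)

# Stage 2: mask-conditioned scene model; the mask is concatenated to the
# noisy scene as an extra input channel (an assumed conditioning scheme).
scene_model = UNet2DModel(
    sample_size=64, in_channels=3 + 1, out_channels=3,
    block_out_channels=(64, 128, 256),
    down_block_types=("DownBlock2D", "DownBlock2D", "AttnDownBlock2D"),
    up_block_types=("AttnUpBlock2D", "UpBlock2D", "UpBlock2D"),
).to(device)

scheduler = DDPMScheduler(num_train_timesteps=1000)
scheduler.set_timesteps(50)  # shortened reverse process for illustration


@torch.no_grad()
def sample(model, shape, condition=None):
    """Run reverse diffusion, optionally concatenating a conditioning mask."""
    x = torch.randn(shape, device=device)
    for t in scheduler.timesteps:
        inp = x if condition is None else torch.cat([x, condition], dim=1)
        noise_pred = model(inp, t).sample                  # predict added noise
        x = scheduler.step(noise_pred, t, x).prev_sample   # denoise one step
    return x


# Stage 1: synthesize a novel contrail mask, then binarize it.
mask = (sample(mask_model, (1, 1, 64, 64)) > 0).float()
# Stage 2: synthesize a contrail scene consistent with that mask.
scene = sample(scene_model, (1, 3, 64, 64), condition=mask)
print(mask.shape, scene.shape)  # an augmented (mask, image) pair for training
```

In this sketch, each generated mask and its conditioned scene form a synthetic (image, mask) pair that can be added to the training set of a downstream contrail segmentation model.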