Poster

Precipitation Downscaling with Diffusion Models Conditioned on the Prithvi Weather Foundation Model

Abstract

Precipitation downscaling is essential for applications such as climate modeling, hydrological simulations, and renewable energy planning. Stochastic downscaling methods, which aim to generate realistic fine-scale precipitation patterns from coarse atmospheric predictors, are particularly important due to the high spatial and temporal variability of rainfall. Despite significant advances in modeling techniques and computational power, precipitation downscaling still remains less accurate and reliable than that of most other meteorological variables, highlighting the need for continued research.

In this work, we explore how AI Foundation Models can improve downscaling, under the hypothesis that these very large models better encode the physical aspects of weather phenomena. We investigate precipitation downscaling at 0.1° resolution using diffusion models conditioned on ERA5 atmospheric variables and ERA5-Land static fields via embeddings from the Prithvi Weather Foundation Model. The conditioning data comprise surface and vertical (pressure-level) variables from ERA5 aggregated to 1° daily resolution, combined with ERA5-Land static fields. The target dataset is ERA5-Land daily precipitation at 0.1°.
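To make the setup concrete, the sketch below shows one conditional denoising diffusion training step in which the high-resolution precipitation target is noised and a denoiser is asked to recover the noise given the coarse predictors. This is a minimal illustration only: the tensor shapes, channel counts, noise schedule, and the stand-in denoiser are assumptions for exposition and are not taken from the abstract.

```python
import torch
import torch.nn.functional as F

# Illustrative shapes for one training batch (assumed, not from the abstract):
#   x0   : (B, 1, 320, 320)  ERA5-Land daily precipitation target at 0.1 deg
#   cond : (B, C, 32, 32)    ERA5 surface + pressure-level predictors at 1 deg,
#                            together with ERA5-Land static fields
B, C = 8, 20
x0 = torch.randn(B, 1, 320, 320)   # stand-in for the normalized target
cond = torch.randn(B, C, 32, 32)   # stand-in for the coarse conditioners

T = 1000
betas = torch.linspace(1e-4, 2e-2, T)              # standard DDPM-style schedule (assumed)
alphas_cumprod = torch.cumprod(1.0 - betas, dim=0)


class TinyDenoiser(torch.nn.Module):
    """Stand-in for the actual denoising network; only the interface matters here."""
    def __init__(self, in_ch):
        super().__init__()
        self.net = torch.nn.Conv2d(in_ch, 1, kernel_size=3, padding=1)

    def forward(self, x, t):
        return self.net(x)


def ddpm_training_step(denoiser, x0, cond):
    """One conditional diffusion training step with simple channel concatenation."""
    t = torch.randint(0, T, (x0.shape[0],))
    a_bar = alphas_cumprod[t].view(-1, 1, 1, 1)
    noise = torch.randn_like(x0)
    x_t = a_bar.sqrt() * x0 + (1.0 - a_bar).sqrt() * noise   # forward process q(x_t | x_0)

    # Simplest conditioning: upsample the 1 deg predictors to the 0.1 deg grid
    # and concatenate them with the noisy target along the channel dimension.
    cond_up = F.interpolate(cond, size=x_t.shape[-2:], mode="bilinear", align_corners=False)
    pred_noise = denoiser(torch.cat([x_t, cond_up], dim=1), t)
    return F.mse_loss(pred_noise, noise)


loss = ddpm_training_step(TinyDenoiser(1 + C), x0, cond)
```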

Unlike other approaches, we do not use ERA5 precipitation at coarse resolution; instead, we generate precipitation from noise guided by context variables, an approach referred to as channel synthesis. We explore multiple conditioning strategies, including concatenation, cross-attention with a convolutional encoder, and feature embeddings derived from the Prithvi model, as sketched below. Experimental results show that diffusion models conditioned on Prithvi embeddings improve high-resolution precipitation generation compared to baseline conditioning models.
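As an illustration of the cross-attention strategy, the sketch below lets denoiser feature maps (queries) attend to context tokens produced by a small convolutional encoder over the coarse predictors; the same block could instead consume Prithvi feature embeddings as its tokens. All module names, dimensions, and shapes are hypothetical and chosen only to make the sketch self-contained.

```python
import torch
import torch.nn as nn


class CrossAttentionCondition(nn.Module):
    """Sketch of cross-attention conditioning: denoiser feature maps attend to
    context tokens derived from the coarse predictors. Replacing the encoder
    output with Prithvi embeddings would give the foundation-model variant."""

    def __init__(self, feat_ch, cond_ch, dim=128, heads=4):
        super().__init__()
        # Small convolutional encoder turning the 1 deg predictors into tokens.
        self.encoder = nn.Sequential(
            nn.Conv2d(cond_ch, dim, 3, stride=2, padding=1), nn.GELU(),
            nn.Conv2d(dim, dim, 3, stride=2, padding=1),
        )
        self.to_q = nn.Conv2d(feat_ch, dim, 1)
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.to_out = nn.Conv2d(dim, feat_ch, 1)

    def forward(self, feat, cond):
        B, _, H, W = feat.shape
        tokens = self.encoder(cond).flatten(2).transpose(1, 2)  # (B, N, dim) context tokens
        q = self.to_q(feat).flatten(2).transpose(1, 2)          # (B, H*W, dim) queries
        attended, _ = self.attn(q, tokens, tokens)
        attended = attended.transpose(1, 2).view(B, -1, H, W)
        return feat + self.to_out(attended)                     # residual injection


# Illustrative usage with made-up shapes.
block = CrossAttentionCondition(feat_ch=64, cond_ch=20)
feat = torch.randn(2, 64, 40, 40)   # intermediate denoiser feature map
cond = torch.randn(2, 20, 32, 32)   # coarse ERA5 predictors + static fields
out = block(feat, cond)             # same shape as feat
```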