Published May 23, 2022 | Version v1
Poster

Filling Cloud Gaps on Optical Time-Series through Optical and SAR Data Fusion for Cropland Monitoring

Description

Cloud cover is a common issue with optical satellite imagery, severely affecting the quality and spatiotemporal availability of surface observations. Clouds can critically reduce the utility of an image by completely obscuring the ground below or by distorting the measurements collected. The standard practice is to identify clouds and discard the affected areas, but there is growing interest in filling the resulting gaps with powerful data-driven deep learning methods. Images produced by the Sentinel-2 mission suffer from such cloud occlusions, and a common alternative is Synthetic Aperture Radar (SAR) data, which are nearly independent of atmospheric conditions and solar illumination. However, SAR data have entirely different characteristics from optical data. Although Sentinel-1 provides continuous, day-and-night observations and can see through most adverse weather and poor air quality (e.g., clouds, rain, fog and smoke), the information it captures is less descriptive and harder to interpret than that of optical images. To bridge this gap, Generative Adversarial Networks (GANs) have been used to translate SAR into optical imagery, and many researchers have recently proposed fusing SAR (Sentinel-1) and optical (Sentinel-2) images with various motivations.
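To illustrate the gap-filling idea in the abstract, the sketch below estimates missing optical reflectance under a cloud from co-registered SAR backscatter. A simple per-pixel linear regression stands in for the deep generative models (GANs) discussed in the poster, and all arrays are synthetic; this is an illustrative assumption, not the authors' method.

```python
import numpy as np

# Minimal sketch of optical/SAR fusion for cloud gap filling.
# A linear SAR->optical mapping replaces the GAN-based translation
# described in the poster; the data below are entirely synthetic.

rng = np.random.default_rng(0)

# Synthetic co-registered scene: a SAR backscatter channel
# (e.g., Sentinel-1 VV, dB-like values) and one optical band
# (e.g., a Sentinel-2 reflectance) linearly related plus noise.
sar = rng.normal(-12.0, 3.0, size=(64, 64))
optical = 0.02 * sar + 0.5 + rng.normal(0.0, 0.01, size=sar.shape)

# Simulated cloud mask: True where optical pixels are unusable.
cloud = np.zeros_like(optical, dtype=bool)
cloud[20:40, 20:40] = True

# Fit the SAR->optical mapping on cloud-free pixels only.
slope, intercept = np.polyfit(sar[~cloud], optical[~cloud], deg=1)

# Fill the cloud gap with values regressed from SAR.
filled = optical.copy()
filled[cloud] = slope * sar[cloud] + intercept
```

In a real pipeline the regression would be replaced by a conditional GAN trained on paired Sentinel-1/Sentinel-2 patches, but the structure is the same: learn a SAR-to-optical mapping on clear observations and apply it under the cloud mask.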

 

Files

Filling_Cloud_Gaps_LPS_2022.pdf (784.8 kB)
md5:9077ef3c8bec68b67d700842c6709dfc

Additional details

Funding

CALLISTO – Copernicus Artificial Intelligence Services and data fusion with other distributed data sources and processing at the edge to support DIAS and HPC infrastructures (European Commission, grant no. 101004152)