Published March 30, 2021 | Version v1
Preprint · Open Access

Seasonal Contrast: Unsupervised Pre-Training from Uncurated Remote Sensing Data

  • 1. ElementAI / ServiceNow
  • 2. Universitat Politècnica de Catalunya

Description

Remote sensing and automatic Earth monitoring are key to solving global-scale challenges such as disaster prevention, land use monitoring, and tackling climate change. Although vast amounts of remote sensing data exist, most of them remain unlabeled and are thus inaccessible to supervised learning algorithms. Transfer learning can reduce the data requirements of deep learning algorithms; however, most such methods are pre-trained on ImageNet, and their generalization to remote sensing imagery is not guaranteed due to the domain gap. In this work, we propose Seasonal Contrast (SeCo), an effective pipeline for leveraging unlabeled data for in-domain pre-training of remote sensing representations. The SeCo pipeline is composed of two parts: first, a principled procedure for gathering a large-scale, unlabeled, and uncurated remote sensing dataset containing images of multiple Earth locations at different timestamps; second, a self-supervised algorithm that takes advantage of time and position invariance to learn transferable representations for remote sensing applications. We empirically show that models trained with SeCo achieve better performance than their ImageNet pre-trained counterparts and state-of-the-art self-supervised learning methods on multiple downstream tasks. The SeCo datasets and models will be made public to facilitate transfer learning and enable rapid progress in remote sensing applications.
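The self-supervised part of the pipeline treats images of the same location captured at different timestamps as positive pairs in a contrastive objective. A minimal NumPy sketch of such an InfoNCE-style loss is shown below; this is an illustration of the general technique under our own assumptions (function names, embedding dimensions, and the toy data are ours), not the paper's actual implementation:

```python
import numpy as np

def info_nce(anchor, positive, negatives, tau=0.1):
    """InfoNCE loss for one anchor embedding.

    `positive` would be the embedding of the same location at a
    different timestamp; `negatives` are embeddings of other locations.
    """
    def cos(a, b):
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

    # Similarities of the anchor to the positive and to each negative.
    logits = np.array([cos(anchor, positive)] + [cos(anchor, n) for n in negatives]) / tau
    logits -= logits.max()  # numerical stability
    probs = np.exp(logits) / np.exp(logits).sum()
    return -np.log(probs[0])  # cross-entropy with the positive as the target

# Toy example: the positive is a slightly perturbed copy of the anchor
# (standing in for a same-location, different-season view).
rng = np.random.default_rng(0)
anchor = rng.normal(size=128)
positive = anchor + 0.1 * rng.normal(size=128)
negatives = [rng.normal(size=128) for _ in range(8)]
loss = info_nce(anchor, positive, negatives)
```

Minimizing this loss pulls representations of the same location across seasons together while pushing other locations away, which is the time-invariance property the description refers to.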

Files (44.9 GB)

Name            Size       MD5
seco_100k.zip   7.3 GB     ebf2d5e03adc6e657f9a69a20ad863e0
(unnamed)       36.3 GB    187963d852d4d3ce6637743ec3a4bd9e
(unnamed)       171.3 MB   dcf336be31f6c6b0e77dcb6cc958fca8
(unnamed)       171.3 MB   53d5c41d0f479bdfd31d6746ad4126db
(unnamed)       468.5 MB   9672c303f6334ef816494c13b9d05753
(unnamed)       468.5 MB   7b09c54aed33c0c988b425c54f4ef948

Additional details

Related works

Is supplement to
Preprint: https://arxiv.org/abs/2103.16607 (URL)