Dataset for downscaling, used in "Downscaled Spatiotemporal Precipitation Model Based on a Transformer Attention Mechanism"
Description
In this research, we introduce a novel method that leverages the Transformer architecture to generate high-fidelity precipitation model outputs. The technique emulates the statistical characteristics of high-resolution datasets while substantially lowering computational cost. The core idea is to use a combination of coarse- and fine-resolution simulated precipitation data, spanning diverse spatial resolutions and geospatial distributions, to train the neural network to learn the coarse-to-fine transformation. We design an ST-Transformer encoder module that dynamically attends to different regions, allocating greater attention to critical spatial zones or sectors. This module helps the model generate outputs that are both more realistic and more consistent with physical laws: it captures the temporal and spatial variability of precipitation and represents extreme events such as heavy and long-lasting storms. The effectiveness and advantages of the proposed approach are demonstrated through comparison with several state-of-the-art forecasting techniques on two datasets produced by regional climate model simulations covering four months. The datasets differ in spatial resolution, one at 50 km and the other at 12 km, both generated with the Weather Research and Forecasting (WRF) Model.
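To make the idea of a spatiotemporal Transformer encoder for downscaling concrete, the sketch below shows one minimal PyTorch realization: coarse precipitation fields are flattened into space-time tokens, a Transformer encoder applies attention jointly over space and time (so it can weight critical regions more heavily), and a per-token head expands each coarse cell into a block of fine-resolution cells. All names, dimensions, and the upsampling head are illustrative assumptions, not the exact architecture or hyperparameters used to produce this dataset.

```python
# Minimal sketch of a spatiotemporal Transformer downscaler (assumed architecture).
# Class name, token layout, and the block-upsampling head are illustrative only.
import torch
import torch.nn as nn


class STTransformerDownscaler(nn.Module):
    """Encode coarse precipitation fields with space-time attention, then upsample."""

    def __init__(self, t_steps=4, coarse_hw=16, scale=4, d_model=128, n_heads=4, n_layers=4):
        super().__init__()
        self.scale = scale                      # e.g. roughly a 50 km -> 12 km grid refinement
        n_tokens = t_steps * coarse_hw * coarse_hw
        self.embed = nn.Linear(1, d_model)      # one token per coarse grid cell and time step
        self.pos = nn.Parameter(torch.zeros(1, n_tokens, d_model))  # learned space-time positions
        layer = nn.TransformerEncoderLayer(d_model, n_heads,
                                           dim_feedforward=4 * d_model,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        # Per-token head predicts a (scale x scale) block of fine-resolution cells.
        self.head = nn.Linear(d_model, scale * scale)

    def forward(self, x):
        # x: (batch, time, H, W) coarse precipitation
        b, t, h, w = x.shape
        tokens = self.embed(x.reshape(b, t * h * w, 1)) + self.pos
        tokens = self.encoder(tokens)           # attention over all space-time tokens
        blocks = self.head(tokens).reshape(b, t, h, w, self.scale, self.scale)
        # Rearrange the per-cell blocks into the fine grid: (batch, time, H*scale, W*scale)
        fine = blocks.permute(0, 1, 2, 4, 3, 5).reshape(b, t, h * self.scale, w * self.scale)
        return fine


if __name__ == "__main__":
    model = STTransformerDownscaler()
    coarse = torch.rand(2, 4, 16, 16)           # synthetic coarse precipitation fields
    print(model(coarse).shape)                  # torch.Size([2, 4, 64, 64])
```

In practice such a model would be trained on paired coarse (50 km) and fine (12 km) WRF fields, with the fine field as the regression target; the synthetic tensors above only demonstrate the tensor shapes involved.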
Files
(399.7 MB)

| Name | Size |
|---|---|
| md5:fc821362db9443e5a7c4e2ee763f3cb2 | 399.7 MB |
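After downloading, the archive can be checked against the md5 listed above. The snippet below is a generic integrity check; `dataset.tar.gz` is a placeholder, since the record's actual file name is not shown here.

```python
# Verify a downloaded copy against the md5 listed in the file table.
# "dataset.tar.gz" is a hypothetical file name.
import hashlib

EXPECTED_MD5 = "fc821362db9443e5a7c4e2ee763f3cb2"


def md5sum(path, chunk_size=1 << 20):
    digest = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()


if __name__ == "__main__":
    print(md5sum("dataset.tar.gz") == EXPECTED_MD5)
```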