Using the NCI Gadi Supercomputer to revolutionise processing of MT time series data: results from the GeoDeVL experiment
Creators
- 1. National Computational Infrastructure, Australian National University, Canberra, ACT, nigel.rees@anu.edu.au.
- 2. Research School of Earth Sciences, Australian National University, Canberra, ACT.
- 3. National Computational Infrastructure, Australian National University, Canberra, ACT.
- 4. National Computational Infrastructure and Research School of Earth Sciences, Australian National University, Canberra, ACT.
- 5. AuScope, School of Earth Sciences, University of Melbourne, Victoria.
- 6. OPM Consulting, Canberra, ACT.
Description
Magnetotelluric (MT) time series datasets are expensive to acquire, can be high volume (hundreds of terabytes), and often take more than two years to publish, measured from collection to release. Time series datasets have also been notoriously hard to access: most data providers only make derivative MT transfer functions (EDI files) and model outputs available online. MT practitioners are therefore often reliant on others to process the raw data, and that processing may or may not meet their target depth or processing requirements. There is a growing demand for time series datasets to be more accessible, so that alternative processing methods can be applied, particularly on HPC infrastructures, which enable processing of time series datasets at full resolution and the running of larger models with more ensemble members and uncertainty quantification. To address these issues, the GeoDeVL project experimented with a rapid, open and transparent field-to-desktop-to-publication workflow to process and publish MT time series datasets using the new 15 petaflop Gadi supercomputer at NCI. Parallelised codes were developed to automate the generation of Level 0 to Level 1 time series data, following the general pattern sketched below. Creating these time series data levels for 95 Earth Data Logger stations now takes minutes, compared with the days to weeks required by more traditional processing methods. The workflow developed under the GeoDeVL project shows how geophysicists can now work with less processed data and transparently develop their own derivative products tuned to the specific parameters of their use case. Further, as new processing methodologies and/or higher-capacity computers become available, the rawer forms of earlier surveys remain available for reprocessing. Comparable trials in HPC processing decades ago led to widespread use of HPC in the petroleum exploration industry: will these results lead to a similar uptake of HPC in the minerals exploration industry?
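The GeoDeVL codes themselves are not included in this record. As a rough illustration only, the minimal sketch below shows the kind of embarrassingly parallel Level 0 to Level 1 conversion described above: each station's raw logger files are processed independently, so the work spreads trivially across the cores allocated on a Gadi compute node. All paths, file formats, the placeholder calibration factor and the use of the PBS_NCPUS environment variable are assumptions for illustration, not the project's implementation.

```python
# Hypothetical sketch: per-station Level 0 -> Level 1 conversion run in parallel.
# Paths, file layout and the calibration step are illustrative assumptions only.
from multiprocessing import Pool
from pathlib import Path
import os

import numpy as np

RAW_ROOT = Path("/g/data/my_project/mt/level0")   # assumed layout: one directory per station
OUT_ROOT = Path("/g/data/my_project/mt/level1")

def convert_station(station_dir: Path) -> str:
    """Read one station's raw channel files and write a single Level 1 archive."""
    channels = {}
    for raw_file in sorted(station_dir.glob("*.dat")):
        # Assumption: raw files are plain-text columns of counts; a real Earth Data
        # Logger workflow would use the instrument-specific reader and calibration.
        counts = np.loadtxt(raw_file)
        channels[raw_file.stem] = counts.astype(np.float64) * 1.0e-3  # placeholder scaling

    OUT_ROOT.mkdir(parents=True, exist_ok=True)
    out_file = OUT_ROOT / f"{station_dir.name}_level1.npz"
    np.savez_compressed(out_file, **channels)
    return station_dir.name

if __name__ == "__main__":
    stations = sorted(p for p in RAW_ROOT.iterdir() if p.is_dir())
    # Use the cores granted by the scheduler (PBS_NCPUS on Gadi), else all local cores.
    ncpus = int(os.environ.get("PBS_NCPUS", os.cpu_count() or 1))
    with Pool(processes=ncpus) as pool:
        for name in pool.imap_unordered(convert_station, stations):
            print(f"finished {name}")
```

Because each station is independent, the same pattern scales from a single node to an array of PBS jobs simply by partitioning the station list.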
Files
| Name | Size | md5 |
|---|---|---|
| ID136.pdf | 384.6 kB | 27b8c603649499401eed7b47bff8f35c |