Published May 9, 2019 | Version v1
Report | Open Access

Next Generation Disaster Data Infrastructure White Paper (DRAFT)

  • 1. Tonkin and Taylor International, New Zealand
  • 2. CODATA Task Group on Linked Open Data for Global Disaster Risk Research
  • 3. Institute of Remote Sensing and Digital Earth, Chinese Academy of Sciences, China
  • 4. National Disaster Management Authority, India
  • 5. Arab Urban Development Institute, Kingdom of Saudi Arabia

Description

The work of the Linked Open Data for Global Disaster Risk Research (LODGD) task group of CODATA is an increasingly important activity linking four global milestones – the Sendai Framework for Disaster Risk Reduction (SFDRR), the Sustainable Development Goals (SDGs), the Paris Agreement on climate change and the New Urban Agenda (NUA)–Habitat III. The Sendai Framework recognises this need in its guiding principles: ‘Disaster risk reduction requires a multi-hazard approach and inclusive risk-informed decision-making based on the open exchange and dissemination of disaggregated data, including by sex, age and disability, as well as on easily accessible, up-to-date, comprehensible, science-based, non-sensitive risk information….’ (Sendai Framework 2015, paragraph 19g). However, assessment processes are challenging, as they require collaboration and participation across multiple sectors, data integration and interpretation, as well as the establishment of a mechanism to share data within and across UN member states, the UN system and other stakeholders. LODGD has produced a series of white papers that provide policy guidance and technical understanding of data and disaster science, inform readers concisely about complex issues, and offer gap analyses of data interconnectivity, data infrastructure and data-driven policies for disaster risk reduction.

In this regard, the first version of draft white paper 2, "Next generation disaster data infrastructure", has been structured and drafted for comments and review. This white paper proposes the next generation disaster data infrastructure, comprising the novel and most essential information systems and services that a country or region can depend on in practice to successfully gather, process and display disaster data, and to reduce the impact of natural disasters. Fundamental requirements of the disaster data infrastructure include (1) effective multi-source big disaster data collection, (2) efficient big disaster data fusion, exchange and query, (3) strict big disaster data quality control and standard construction, (4) real-time big data analysis and decision making, and (5) user-friendly big data visualization. The rest of the paper is organized as follows: first, several future scenarios of disaster management are developed based on existing disaster management systems and communication technology. Second, the fundamental requirements of a next generation disaster data infrastructure inspired by the proposed scenarios are discussed. Following that, research questions and issues are highlighted. Finally, suggestions and conclusions are given at the end of the paper.
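For readers who want a concrete, if greatly simplified, picture of how these five requirements fit together, the short Python sketch below (not part of the white paper; every class, function and value in it is hypothetical) walks a toy set of observations through collection, fusion, quality control, analysis and visualization stages.

# Illustrative sketch only: a toy pipeline mirroring the five requirements
# listed above. All names and data are hypothetical.
from dataclasses import dataclass
from typing import Dict, Iterable, List

@dataclass
class Observation:
    source: str      # e.g. satellite imagery, sensor network, social media
    hazard: str      # e.g. "flood"
    value: float     # hazard intensity on an arbitrary scale
    timestamp: str   # ISO 8601 timestamp

def collect(sources: Dict[str, Iterable[Observation]]) -> List[Observation]:
    """(1) Multi-source collection: pull observations from every source."""
    return [obs for stream in sources.values() for obs in stream]

def fuse(observations: List[Observation]) -> Dict[str, List[Observation]]:
    """(2) Fusion and query: index observations by hazard type."""
    fused: Dict[str, List[Observation]] = {}
    for obs in observations:
        fused.setdefault(obs.hazard, []).append(obs)
    return fused

def quality_control(fused: Dict[str, List[Observation]]) -> Dict[str, List[Observation]]:
    """(3) Quality control: drop records that fail a simple validity check."""
    return {hazard: [o for o in obs if o.value >= 0] for hazard, obs in fused.items()}

def analyse(clean: Dict[str, List[Observation]]) -> Dict[str, float]:
    """(4) Analysis and decision support: mean intensity per hazard."""
    return {hazard: sum(o.value for o in obs) / len(obs)
            for hazard, obs in clean.items() if obs}

def visualise(summary: Dict[str, float]) -> None:
    """(5) Visualization: a plain-text report stands in for a dashboard."""
    for hazard, mean in summary.items():
        print(f"{hazard}: mean intensity {mean:.2f}")

if __name__ == "__main__":
    demo = {
        "satellite": [Observation("satellite", "flood", 3.2, "2019-05-09T00:00:00Z")],
        "sensors": [Observation("sensors", "flood", 2.8, "2019-05-09T00:05:00Z")],
    }
    visualise(analyse(quality_control(fuse(collect(demo)))))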

The draft of white paper 2 is released for comments to coincide with the upcoming Global Platform in Geneva, 13-17 May 2019. The draft white paper can be accessed here as Word or PDF, or from Google Drive: https://docs.google.com/document/d/1qG4EyjuA6i2rEIjnVX2f7b33YJ5x3FYrdZJtG_pK2cI/edit?usp=sharing. We invite comments, constructive criticism and suggestions for improvement by 30 June 2019. Please add comments and suggestions to the Google Doc or send them to Edward Chu <edwardchu@yuntech.edu.tw> and Bapon Fakhruddin <BFakhruddin@tonkintaylor.co.nz>. The report will be refined and improved thanks to your inputs and suggestions.

Files (3.2 MB)

white paper beta_0.1.pdf

  • md5:d7d8cc25d5a5f9fdae98016b5cb009c6 – 754.6 kB
  • md5:ed0551ad841c1e2f31a33aeedbf23317 – 2.5 MB