Report Open Access
Fakhruddin, Bapon; Chu, Edward; Li, Guoqing; Xie, Jibo; Lu, Kun-Lin; Sharma, Kunal; Chen, I-Ching; Eltinay, Nuha; Dalal, Saurabh; Yang, Tengfei
The work of the Linked Open Data for Global Disaster Risk Research (LODGD) task group of CODATA is an increasingly important activity linking four global milestones – the Sendai Framework for Disaster Risk Reduction (SFDRR), the Sustainable Development Goals (SDGs), the Paris Agreement on Climate Change and the New Urban Agenda (NUA)-Habitat III. The Sendai Framework recognises this need in its guiding principles: ‘Disaster risk reduction requires a multi-hazard approach and inclusive risk-informed decision-making based on the open exchange and dissemination of disaggregated data, including by sex, age and disability, as well as on easily accessible, up-to-date, comprehensible, science-based, non-sensitive risk information….’ (Sendai Framework 2015, paragraph 19g). However, assessment processes are challenging, as they require collaboration and participation across multiple sectors, data integration and interpretation, as well as the establishment of a mechanism to share data within and across UN member states, the UN system and other stakeholders. The LODGD has produced a series of white papers to provide policy guidance; technical understanding of data and disaster science that informs readers concisely about complex issues; gap analysis on data interconnectivity; and data infrastructure and data-driven policies for disaster risk reduction.
In this regard, the first version of draft white paper 2, "Next generation disaster data infrastructure", has been structured and drafted for comments and review. This white paper proposes the next generation disaster data infrastructure, comprising the novel and most essential information systems and services that a country or a region can depend on in practice to successfully gather, process and display disaster data, and to reduce the impact of natural disasters. Fundamental requirements of the disaster data infrastructure include (1) effective multi-source big disaster data collection, (2) efficient big disaster data fusion, exchange and query, (3) strict big disaster data quality control and standards development, (4) real-time big data analysis and decision making, and (5) user-friendly big data visualization. The rest of the paper is organized as follows: first, several future scenarios of disaster management are developed based on existing disaster management systems and communication technology. Second, fundamental requirements of the next generation disaster data infrastructure inspired by the proposed scenarios are discussed. Following that, research questions and issues are highlighted. Finally, suggestions and conclusions are given at the end of the paper.
Draft white paper 2 is released for comments to coincide with the upcoming Global Platform in Geneva, 13-17 May 2019. The draft white paper can be accessed here as Word or PDF, or from Google Drive at https://docs.google.com/document/d/1qG4EyjuA6i2rEIjnVX2f7b33YJ5x3FYrdZJtG_pK2cI/edit?usp=sharing. We invite comments, constructive criticism and suggestions for improvement by 30 June 2019. Please add comments and suggestions to the Google doc or send them to Edward Chu <email@example.com> and Bapon Fakhruddin <BFakhruddin@tonkintaylor.co.nz>. The report will be refined and improved thanks to your inputs and suggestions.