GRF IDRC 2012

Conference Agenda

Overview and details of the sessions of this conference.

Session Overview
MON4.4: Global exposure monitoring for multi-hazards risk assessments
Time: Monday, 27/Aug/2012: 1:00pm - 2:30pm
Session Chair: Daniele EHRLICH, Joint Research Centre, European Commission
Location: Seehorn

Session organized by the Joint Research Centre, European Commission


Global exposure monitoring for multi-hazards risk assessments

Mauro DOLCE1, Daniele EHRLICH2

1Bureau for Seismic and Volcanic Risk, Italian Civil Protection Department, Italy; 2Joint Research Centre, European Commission, Italy

Disaster risk analysis is used to estimate potential future disasters and losses, and as a guiding mechanism for implementing disaster risk reduction measures. Through international advocacy initiatives such as the ISDR-promoted Hyogo Framework for Action, an increasing number of countries are now including disaster risk analysis in their policies. However, the implementation of these policies requires knowledge that is not always available. Datasets, models and tools used in disaster risk assessment are often unavailable, especially in low-income countries where the information is needed most. A number of initiatives with a global scope are addressing this lack of data and models, while other initiatives are already developing datasets and tools. In addition, the research community is providing innovations that open new opportunities for generating the required knowledge.

The session aims to provide an overview of the need for information on exposure, hazard and vulnerability, of the current international initiatives that address those needs, and of the challenges ahead.

The session will have four panellists. The first will address the need for global multi-risk modelling and thus the need to derive the required exposure, vulnerability and hazard information. Panellists two and three will present the state of the art in exposure mapping and disaster risk modelling. The last panellist will offer insight into future technological developments that may be used to rapidly generate the missing datasets.

World Bank/GFDRR contributions to exposure modeling for global risk modeling initiatives and OpenDRI initiative

Keiko SAITO, Daniel KULL, Robert SODEN, Abigail BACA

World Bank, United States of America

Risk assessments for natural hazards are the starting points for disaster mitigation activities. The results of the assessments allow stakeholders to understand the underlying risk present in the location in question, enabling appropriate interventions to be planned. Risk is often defined as the product of the hazard, the exposure (e.g. physical assets) and its vulnerability at given hazard intensities. The quality of the exposure data fed into the risk models has a significant impact on the models' output. Exposure data have traditionally been derived from official census data; in these cases, aggregate-level statistics often require disaggregation to match the scale of the geographical unit used for the risk modeling. In recent years, bottom-up methodologies have emerged to model the assets exposed to potential natural hazards, some involving tools such as hand-held devices, remotely sensed data and sampling schemes.
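The two steps described above can be sketched in a few lines: disaggregating an aggregate census value across modeling units in proportion to some weight (e.g. built-up area), then combining hazard, exposure and vulnerability multiplicatively. This is an illustrative sketch only, not the World Bank/GFDRR methodology; all numbers and names are hypothetical.

```python
# Illustrative sketch (hypothetical data, not a GFDRR method):
# census disaggregation plus risk = hazard x exposure x vulnerability.

def disaggregate(total, weights):
    """Split an aggregate census value across units in proportion to weights."""
    s = sum(weights)
    return [total * w / s for w in weights]

def expected_loss(hazard, exposure, vulnerability):
    """Per-unit expected loss: event probability x exposed value x damage ratio."""
    return hazard * exposure * vulnerability

# A hypothetical district: $9M of assets spread over three grid cells
# with built-up-area weights 1:2:6.
exposure = disaggregate(9_000_000, weights=[1, 2, 6])
hazards = [0.02, 0.05, 0.10]   # annual exceedance probability per cell
vulns = [0.30, 0.30, 0.30]     # mean damage ratio at that intensity

losses = [expected_loss(h, e, v) for h, e, v in zip(hazards, exposure, vulns)]
```

The sketch shows why exposure data quality matters: any error in the disaggregation weights propagates linearly into every per-cell loss estimate.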

GFDRR is contributing to several global risk modeling initiatives, namely the Global Assessment Report (GAR) led by UNISDR, the Global Earthquake Model, and the CAPRA (Central American Probabilistic Risk Assessment) initiative. Applying a standardized methodology to the collection of exposure data allows the risks from disasters to be compared across geographical boundaries. An overview of GFDRR's contribution will be presented. At the same time, data collected through these risk assessment activities should ideally be made open so that the potential of the data can be maximized. The Open Data for Resilience Initiative (OpenDRI), led by GFDRR Labs, is promoting the use of open source data platforms to store and visualize these datasets. Some example projects that have successfully collected open exposure data, hosted by national disaster management agencies, will be introduced.

Building a global exposure database

Paolo GAMBA1, Helen CROWLEY2, Nicole KELLER2

1Dipartimento di Ingegneria Industriale e dell'Informazione, University of Pavia, Italy; 2GEM Foundation, Italy

The Global Earthquake Model (GEM) is a global collaborative effort to provide organisations and people with open tools and resources for transparent assessment of earthquake risk anywhere in the world. Leading science is leveraged for the benefit of society; hundreds of individuals and organisations are working together through global projects and regional programmes to develop open-source tools, global datasets and best practices that follow the state-of-the-art in science on seismic hazard and risk. All contributions are integrated into a comprehensive platform (OpenQuake) that will become available in 2014.

High-resolution and standardized exposure data is key to quantifying risk. One of the global projects currently being carried out focuses on the development of a global exposure database (GED): an open, unified database that contains the data needed to estimate damage to buildings, (critical) infrastructure and human casualties, and to support cost-benefit and other analyses that inform decisions on risk mitigation, such as building retrofitting. To ensure that data can be used for risk assessment in the same manner around the globe, the GED4GEM consortium is developing best practices for the creation of exposure datasets. To account for regional variability, it does so in interaction with experts worldwide. By harmonizing the best public exposure datasets, the consortium is putting together a first comprehensive global dataset of building stock and population.

Organisations and individuals worldwide will be able to view and explore the GED in a GIS environment, use subsets of it for their own analyses, and submit data to collaboratively enhance it. Using the best practices proposed by GED4GEM, another consortium is developing tools to capture new data on individual buildings and from remote sensing. New datasets at various scales can furthermore be integrated into the GED, following clear guidelines. Through these processes of crowdsourcing and continuous updating, users worldwide will be able to carry out risk analyses with increasing accuracy.
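The kind of per-cell building-stock and population record described above can be sketched as follows. The field names, taxonomy strings and figures are illustrative assumptions for this sketch, not the actual GED schema.

```python
# Hypothetical sketch of the kind of record a global exposure database
# might hold per grid cell; fields are illustrative, not the GED schema.
from dataclasses import dataclass

@dataclass
class ExposureRecord:
    cell_id: str      # grid-cell or administrative-unit identifier
    taxonomy: str     # building typology (e.g. a GEM-taxonomy-style string)
    buildings: int    # number of buildings of this typology in the cell
    avg_cost: float   # mean replacement cost per building (USD)
    occupants: int    # resident population in these buildings

    def exposed_value(self) -> float:
        return self.buildings * self.avg_cost

records = [
    ExposureRecord("CELL-001", "CR/LFM", 120, 85_000.0, 540),
    ExposureRecord("CELL-001", "MUR+ADO", 300, 15_000.0, 1_100),
]

# Aggregate exposed value and population per cell, as a risk model would.
total_value = sum(r.exposed_value() for r in records)
total_pop = sum(r.occupants for r in records)
```

Keeping typologies as separate records per cell is what lets a risk model attach a different vulnerability function to each building class before aggregating.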

Processing satellite imagery for mapping physical exposure globally

Daniele EHRLICH, Stamatia HALKIA, Thomas KEMPER, Martino PESARESI, Pierre SOILLE

Joint Research Centre, European Commission, Italy

Disaster risk models require exposure and hazard information. Physical exposure – information on villages, towns, cities and metropolitan areas – is still not available in a standardized form for local-to-national assessment, for properly quantifying disaster hotspots globally, or for risk comparisons between countries. The Global Exposure Database for the Global Earthquake Model is the first initiative that addresses the systematic collection of such information, and that database still needs to be populated.

A potential source of up-to-date exposure information is the large volume of satellite imagery held in continuously updated image archives. Medium-scale and very fine-scale satellite imagery is collected by space agencies and, increasingly, by private satellite operators. These data now await processing into exposure information that can be used within disaster risk models. That conversion – from imagery of the Earth's surface into human settlement layers, and then into exposure parameters for risk models – is underway. However, image processing infrastructure is not typically designed to handle massive volumes of data covering countries and continents. New initiatives like the Global Human Settlement Layer analysis system developed at the Joint Research Centre and presented herein aim to analyze human settlements globally. The system can process the gigantic data volumes required to cover continents, or even the entirety of the Earth's land masses.

The presentation will illustrate examples of human settlement layers to be used as proxy variables for exposure. In particular, it will show examples of continent-wide human settlement mapping, of systematic comparison of the largest metropolitan areas, and of changes in the extent of human settlements over time, and will briefly illustrate the complexity of human settlements as seen in very detailed satellite imagery.

Conference: GRF IDRC 2012