{"title":"Rapid Damage Estimation with Iterative Improvements for Relief Resource Planning Post-Disasters","authors":"R. Garg, Yuxin Zhang, Linda Golden, P. Brockett","doi":"10.2139/ssrn.3254748","DOIUrl":null,"url":null,"abstract":"Natural disasters can disrupt both the short-term and long-term living conditions of individuals in selected areas, and for any natural disaster selecting a large geographic area, there is always a shortage of funds and resources for relief and recovery. Moreover, the allocation of relief assistance is challenging due to difficulties in collecting accurate loss information during and immediately post-crisis. Thus, in this study, we present a data-driven decision-making framework for disaster management and provide a model for the rapid estimation of disaster losses using an iterative learning method. As disaster loss data are largely initially unavailable and only become available gradually, slowly, and sparsely over time, an iterative process is necessary for loss estimation and on-the-ground decision-making. We first train models to predict losses using single environmental factors (e.g., peak wind speed) for impacted locations and then use geospatial interpolation to estimate losses in those areas where actual loss data are missing. As real, verified loss data become available, we iteratively update all model parameters and estimates. To illustrate this technique, we use data from Hurricane Harvey, which hit the gulf coast area of the USA in 2017. The results demonstrate that iterative learning leads to quick convergence of loss estimation, with small magnitudes of estimation error. 
Additional tests demonstrate that the results are arguably robust, and we conclude with implications for future research.","PeriodicalId":265524,"journal":{"name":"Urban & Regional Resilience eJournal","volume":"86 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2018-08-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Urban & Regional Resilience eJournal","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.2139/ssrn.3254748","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0
Abstract
Natural disasters can disrupt both the short-term and long-term living conditions of individuals in affected areas, and for any natural disaster striking a large geographic area, there is always a shortage of funds and resources for relief and recovery. Moreover, allocating relief assistance is challenging because accurate loss information is difficult to collect during and immediately after a crisis. In this study, we therefore present a data-driven decision-making framework for disaster management and provide a model for the rapid estimation of disaster losses using an iterative learning method. Because disaster loss data are largely unavailable at first and only become available gradually and sparsely over time, an iterative process is necessary for loss estimation and on-the-ground decision-making. We first train models to predict losses from single environmental factors (e.g., peak wind speed) at impacted locations, and then use geospatial interpolation to estimate losses in areas where actual loss data are missing. As real, verified loss data become available, we iteratively update all model parameters and estimates. To illustrate this technique, we use data from Hurricane Harvey, which hit the Gulf Coast of the USA in 2017. The results demonstrate that iterative learning leads to quick convergence of loss estimates with small estimation errors. Additional tests demonstrate that the results are arguably robust, and we conclude with implications for future research.
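The iterative procedure the abstract describes can be sketched in simplified form: in each round, a new batch of verified losses arrives, a single-factor model (loss as a function of peak wind speed) is refit, and verified losses are spatially interpolated to fill gaps. The sketch below is a minimal illustration under several assumptions not taken from the paper: synthetic toy data, a linear loss-wind model, inverse-distance-weighted (IDW) interpolation as the geospatial method, and an equal-weight blend of the two estimates. It is not the authors' actual model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy data: locations, peak wind speeds, and "true" losses
# (true losses would be unknown in practice and only verified gradually).
n = 200
xy = rng.uniform(0, 100, size=(n, 2))           # location coordinates
wind = rng.uniform(60, 140, size=n)             # peak wind speed
true_loss = 0.5 * wind + rng.normal(0, 5, n)    # simulated ground truth

def idw(known_xy, known_vals, query_xy, power=2.0):
    """Inverse-distance-weighted interpolation of known losses."""
    d = np.linalg.norm(query_xy[:, None, :] - known_xy[None, :, :], axis=2)
    w = 1.0 / np.maximum(d, 1e-9) ** power      # guard against zero distance
    return (w * known_vals).sum(axis=1) / w.sum(axis=1)

verified = np.zeros(n, dtype=bool)
estimates = np.zeros(n)
for round_ in range(5):
    # A new batch of verified losses "arrives" each round.
    new = rng.choice(np.flatnonzero(~verified), size=20, replace=False)
    verified[new] = True

    # Refit the single-factor model (loss ~ peak wind speed) on verified data.
    slope, intercept = np.polyfit(wind[verified], true_loss[verified], 1)
    model_pred = slope * wind + intercept

    # Spatially interpolate verified losses, then blend with the model.
    interp = idw(xy[verified], true_loss[verified], xy)
    estimates = 0.5 * model_pred + 0.5 * interp
    estimates[verified] = true_loss[verified]   # use actuals where known

    mae = np.abs(estimates[~verified] - true_loss[~verified]).mean()
    print(f"round {round_}: mean abs error on unverified sites = {mae:.2f}")
```

As more verified losses accumulate each round, both the regression fit and the interpolation surface are re-estimated, which is the sense in which the estimation "converges" as data trickle in.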