Results in Journal Natural Hazards and Earth System Sciences: 3,430

, Libo Han, Feng Long, Guijuan Lai, Fengling Yin, Jinmeng Bi, Zhengya Si
Natural Hazards and Earth System Sciences, Volume 21, pp 2233-2244; doi:10.5194/nhess-21-2233-2021

Abstract:
The spatiotemporal heterogeneity of b values has great potential for helping in understanding the seismogenic process and assessing seismic hazard. However, there is still much controversy about whether it exists or not, and an important reason is that the choice of subjective parameters has eroded the foundations of much research. To overcome this problem, we used a recently developed non-parametric method based on a data-driven concept to calculate b values. The major steps of this method are (1) performing a large number of Voronoi tessellations, calculating Bayesian information criterion (BIC) values, and selecting the optimal models for the study area, and (2) using the ensemble median (Q2) and median absolute deviation (MAD) value to represent the final b value and its uncertainty. We investigated spatiotemporal variations in b values before and after the 2019 Changning MS=6.0 earthquake in the Sichuan Basin, China. The results reveal a spatial volume with low pre-mainshock b values near the mainshock source region, and its size corresponds roughly with the rupture area of the mainshock. The anomalously high pre-mainshock b values distributed in the NW direction of the epicenter were interpreted to be related to fluid invasion. The decreases in b values during the aftershock sequence along with the occurrences of several strong aftershocks imply that b values could be an indicator of the stress state. In addition, we found that although the distribution characteristics of b values obtained from different methods of investigation are qualitatively consistent, they differ significantly in terms of their specific values, suggesting that the best way to study the heterogeneous pattern of b values is in the joint dimension of space-time rather than separately in time and space. Overall, our study emphasizes the importance of b-value studies in assessing earthquake hazards.
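The b-value estimate underlying such studies is conventionally computed with Aki's maximum-likelihood formula, and an ensemble of estimates can be summarised by its median (Q2) and MAD as described above. A minimal sketch in Python; note the bootstrap resampling here is only a stand-in assumption for the paper's Voronoi/BIC model ensemble:

```python
import numpy as np

def b_value_mle(mags, mc, dm=0.1):
    """Aki's maximum-likelihood b-value estimator for magnitudes >= mc.
    dm is the magnitude binning width (half-bin correction)."""
    m = np.asarray(mags, float)
    m = m[m >= mc]
    return np.log10(np.e) / (m.mean() - (mc - dm / 2.0))

def b_ensemble_summary(mags, mc, dm=0.1, n_boot=1000, seed=0):
    """Summarise an ensemble of b-value estimates by the median (Q2)
    and the median absolute deviation (MAD). Bootstrap resampling is
    used here as a stand-in for the paper's Voronoi/BIC ensemble."""
    rng = np.random.default_rng(seed)
    m = np.asarray(mags, float)
    m = m[m >= mc]
    bs = np.array([b_value_mle(rng.choice(m, size=m.size, replace=True), mc, dm)
                   for _ in range(n_boot)])
    q2 = np.median(bs)
    return q2, np.median(np.abs(bs - q2))
```

For a Gutenberg–Richter-distributed sample with true b = 1 above the completeness magnitude, the estimator recovers b ≈ 1 and the MAD shrinks with sample size.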
, Clemens Strehl, Fabian Vollmer, Eduard Interwies, Anasha Petersen, Stefan Görlitz, , Montse Martinez Puentes, , , et al.
Natural Hazards and Earth System Sciences, Volume 21, pp 2145-2161; doi:10.5194/nhess-21-2145-2021

Abstract:
As Europe is faced with increasing droughts and extreme precipitation, countries are taking measures to adapt to these changes. It is challenging, however, to navigate through the wide range of possible measures, taking into account the efficacy, economic impact and social justice aspects of these measures, as well as the governance requirements for implementing them. This article presents the approach of selecting and analysing adaptation measures to increasing extreme weather events caused by ongoing climate change that was developed and applied in the H2020 project BINGO (Bringing Innovation to Ongoing Water Management). The purpose of this project is (a) to develop an integrated participatory approach for selecting and evaluating adaptation measures, (b) to apply and evaluate the approach across six case-study river basins across Europe, and (c) to support decision-making towards adaptation capturing the diversity, the different circumstances and challenges river basins face across Europe. It combines three analyses: governance, socio-economic and social justice. The governance analysis focuses on the requirements associated with the measures and the extent to which these requirements are met at the research sites. The socio-economic analysis focuses on the efficacy of the measures in reducing the risks and the broad range of tools available to compare the measures on their societal impact. Finally, a tentative social justice analysis focuses on the distributive impacts of the adaptation measures. In the summary of results, we give an overview of the outcome of the different analyses. In the conclusion, we briefly assess the main pros and cons of the different analyses that were conducted. The main conclusion is that although the research sites were very different in both the challenges and the institutional context, the approach presented here yielded decision-relevant outcomes.
Natural Hazards and Earth System Sciences, Volume 21, pp 2125-2144; doi:10.5194/nhess-21-2125-2021

Abstract:
Landslides are a major natural hazard in Kyrgyzstan and Tajikistan. Knowledge about atmospheric triggering conditions and climatic disposition of landslides in Kyrgyzstan and Tajikistan is limited even though this topic has already been investigated thoroughly in other parts of the world. In this study, the newly developed, high-resolution High Asia Refined analysis version 2 (HAR v2) data set generated by dynamical downscaling was combined with historical landslide inventories to analyze the atmospheric conditions that initialized landslides in Kyrgyzstan and Tajikistan. The results indicate the crucial role of snowmelt in landslide-triggering processes since it contributes to the initialization of 40 % of landslide events. Objective thresholds for rainfall, snowmelt, and the sum of rainfall and snowmelt (rainfall + snowmelt) were defined. Thresholds defined by rainfall + snowmelt have the best predictive performance. Mean intensity, peak intensity, and the accumulated amount of rainfall + snowmelt events show similar predictive performance. Using the entire period of rainfall + snowmelt events results in better predictive performance than just considering the period up to landslide occurrence. Mean annual exceedance maps were derived from defined regional thresholds for rainfall + snowmelt. These maps depict climatic disposition and have added value in landslide susceptibility mapping. The results reported in this study highlight the potential of dynamical downscaling products generated by regional climate models in landslide prediction.
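Objective trigger thresholds of this kind usually take a power-law intensity–duration form, and a mean annual exceedance value is then simply the count of events above the threshold divided by the record length. A hedged sketch (the `alpha` and `beta` parameters below are illustrative placeholders, not the study's fitted values):

```python
def exceeds_threshold(intensity, duration, alpha, beta):
    """Power-law intensity-duration threshold I = alpha * D**beta:
    True if an event of mean intensity `intensity` (e.g. mm/h of
    rainfall + snowmelt) over `duration` hours exceeds the threshold."""
    return intensity > alpha * duration ** beta

def mean_annual_exceedance(events, n_years, alpha, beta):
    """Mean number of threshold exceedances per year, the quantity
    mapped in the study to express climatic disposition.
    events: iterable of (mean_intensity, duration_hours) tuples."""
    count = sum(exceeds_threshold(i, d, alpha, beta) for i, d in events)
    return count / n_years
```

With a decaying threshold (negative `beta`), longer events trigger at lower mean intensities, which is the usual behaviour of such curves.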
Nan Wang, , , , Liang Guo, Junnan Xiong
Natural Hazards and Earth System Sciences, Volume 21, pp 2109-2124; doi:10.5194/nhess-21-2109-2021

Abstract:
The persistence over space and time of flash flood disasters – flash floods that have caused either economic losses or loss of life or both – is a diagnostic measure of areas subjected to hydrological risk. The concept of persistence can be assessed via clustering analyses, performed here to analyze the national inventory of flash flood disasters in China that occurred in the period 1950–2015. Specifically, we investigated the spatiotemporal distribution pattern of the flash flood disasters and their clustering behavior by using both global and local methods: the first based on Ripley's K function, and the second on scan statistics. As a result, we could visualize patterns of aggregated events, estimate the cluster duration and make assumptions about their evolution over time, also with respect to the precipitation trend. Due to the large spatial (the whole Chinese territory) and temporal (66 years) scale of the dataset, we were able to capture whether certain clusters gather in specific locations and times but also whether their magnitude tends to increase or decrease. Overall, the eastern regions in China are much more subjected to flash flood disasters compared to the rest of the country. Detected clusters revealed that these phenomena predominantly occur between July and October, a period coinciding with the wet season in China. The number of detected clusters increases with time, but the associated duration drastically decreases in the recent period. This may indicate a change towards triggering mechanisms which are typical of short-duration extreme rainfall events. Finally, since flash flood disasters are directly linked to precipitation and its extreme realization, we indirectly assessed whether the magnitude of the trigger itself has also varied through space and time, enabling considerations in the context of climatic change.
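Global clustering via Ripley's K function compares the observed density of event pairs within distance r to the expectation under complete spatial randomness (pi r^2 in 2-D): values above that benchmark indicate clustering at scale r. A minimal, edge-uncorrected sketch (the study's estimator and the scan statistics are more involved):

```python
import numpy as np

def ripley_k(points, radii, area):
    """Naive Ripley's K estimator (no edge correction) for 2-D points.
    K(r) noticeably above pi * r**2 suggests clustering at scale r."""
    pts = np.asarray(points, dtype=float)
    n = len(pts)
    lam = n / area                                   # point intensity
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)                      # exclude self-pairs
    return np.array([(d < r).sum() / (n * lam) for r in radii])
```

For uniform points on the unit square, K(0.05) sits near pi * 0.05**2; for strongly clustered points it is far larger, which is the signature the global analysis looks for.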
Natural Hazards and Earth System Sciences, Volume 21, pp 2093-2108; doi:10.5194/nhess-21-2093-2021

Abstract:
Tsunamis rarely occur in a specific area, and their occurrence is highly uncertain. Suddenly generated from their sources in deep water, they occasionally undergo tremendous amplification in shallow water to devastate low-lying coastal areas. Despite the advancement of computational power and simulation algorithms, there is a need for novel and rigorous approaches to efficiently predict coastal amplification of tsunamis during different disaster management phases, such as tsunami risk assessment and real-time forecast. This study presents convolution kernels that can instantly predict onshore waveforms of water surface elevation and flow velocity from observed/simulated wave data away from the shore. Kernel convolution involves isolating an incident-wave component from the offshore wave data and transforming it into the onshore waveform. Moreover, unlike previously derived ones, the present kernels are based on shallow-water equations with a damping term and can account for tsunami attenuation on its path to the shore with a damping parameter. Kernel convolution can be implemented at a low computational cost compared to conventional numerical models that discretise the spatial domain. The prediction capability of the kernel method was demonstrated through application to real-world tsunami cases.
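The prediction step itself reduces to a single discrete convolution of the offshore incident-wave record with a precomputed kernel, which is why it is so much cheaper than solvers that discretise the spatial domain. A schematic sketch only: the kernel shape and scaling below are assumptions, whereas the paper derives its kernels from shallow-water equations with a damping term:

```python
import numpy as np

def predict_onshore(offshore, kernel, dt):
    """Convolve an offshore incident-wave time series (sampled at
    interval dt) with a precomputed kernel to obtain the onshore
    waveform, truncated to the input length."""
    return np.convolve(offshore, kernel, mode="full")[: len(offshore)] * dt
```

As a sanity check, a discrete delta kernel (value 1/dt at lag zero) reproduces the input unchanged; a physically derived kernel would instead encode shoaling amplification and damping along the propagation path.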
Elias de Korte, , Eric Tellier
Natural Hazards and Earth System Sciences, Volume 21, pp 2075-2091; doi:10.5194/nhess-21-2075-2021

Abstract:
A Bayesian network (BN) approach is used to model and predict shore-break-related injuries and rip-current drowning incidents based on detailed environmental conditions (wave, tide, weather, beach morphology) on the high-energy Gironde coast, southwest France. Six years (2011–2017) of boreal summer (15 June–15 September) surf zone injuries (SZIs) were analysed, comprising 442 (fatal and non-fatal) drownings caused by rip currents and 715 injuries caused by shore-break waves. Environmental conditions at the time of the SZIs were used to train two separate Bayesian networks (BNs), one for rip-current drownings and the other one for shore-break wave injuries. Each BN included two so-called “hidden” exposure and hazard variables, which are not observed yet interact with several of the observed (environmental) variables, which in turn limit the number of BN edges. Both BNs were tested for varying complexity using K-fold cross-validation based on multiple performance metrics. Results show a poor to fair predictive ability of the models according to the different metrics. Shore-break-related injuries appear more predictable than rip-current drowning incidents using the selected predictors within a BN, as the shore-break BN systematically performed better than the rip-current BN. Sensitivity and scenario analyses were performed to address the influence of environmental data variables and their interactions on exposure, hazard and resulting life risk. Most of our findings are in line with earlier SZI and physical hazard-based work; that is, more SZIs are observed for warm sunny days with light winds; long-period waves, with specifically more shore-break-related injuries at high tide and for steep beach profiles; and more rip-current drownings near low tide with near-shore-normal wave incidence and strongly alongshore non-uniform surf zone morphology. 
The BNs also provided fresh insight, showing that rip-current drowning risk is approximately equally distributed between exposure (variance reduction Vr=14.4 %) and hazard (Vr=17.4 %), while exposure of water users to shore-break waves is much more important (Vr=23.5 %) than the hazard (Vr=10.9 %). Large surf is found to decrease beachgoer exposure to shore-break hazard, while this is not observed for rip currents. Rapid change in tide elevation during days with large tidal range was also found to result in more drowning incidents. We advocate that such BNs, providing a better understanding of hazard, exposure and life risk, can be developed to improve public safety awareness campaigns, in parallel with the development of more skilful risk predictors to anticipate high-life-risk days.
Natural Hazards and Earth System Sciences, Volume 21, pp 2059-2073; doi:10.5194/nhess-21-2059-2021

Abstract:
A new homogenized earthquake catalogue for Turkey is compiled for the period 1900–2018. The earthquake parameters are obtained from the Bulletin of the International Seismological Centre that was fully updated in 2020. New conversion equations between moment magnitude and the other scales (md, ML, mb, Ms, and M) are determined using the general orthogonal regression method to build up a homogeneous catalogue, which is the essential database for seismic hazard studies. The 95 % confidence intervals are estimated using the bootstrap method with 1000 samples. The equivalent moment magnitudes (Mw*) for the entire catalogue are calculated using the magnitude relations to homogenize the catalogue. The magnitude of completeness is 2.7 Mw*. The final catalogue is not declustered or truncated using a threshold magnitude in order to be a widely usable catalogue. It contains not only Mw* but also the average and median of the observed magnitudes for each event. In contrast to the limited earthquake parameters in the previous catalogues for Turkey, the 45 parameters of ∼378 000 events are presented in this study.
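Orthogonal regression differs from ordinary least squares in that it minimises distances perpendicular to the fitted line, which is appropriate when both magnitude scales carry measurement error. A minimal total-least-squares sketch with a bootstrap confidence interval on the slope; this is the simplest case with equal error variances on both axes, whereas the general orthogonal regression used in the study weights the two error variances:

```python
import numpy as np

def orthogonal_regression(x, y):
    """Total least squares fit y ~ a + b*x, minimising perpendicular
    distances (equal error variances assumed on both axes)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    X = np.column_stack([x - x.mean(), y - y.mean()])
    # the smallest right singular vector is normal to the best-fit line
    _, _, vt = np.linalg.svd(X, full_matrices=False)
    nx, ny = vt[-1]
    b = -nx / ny
    a = y.mean() - b * x.mean()
    return a, b

def bootstrap_ci(x, y, n_boot=1000, alpha=0.05, seed=0):
    """Percentile bootstrap confidence interval on the TLS slope."""
    rng = np.random.default_rng(seed)
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = len(x)
    slopes = [orthogonal_regression(x[i], y[i])[1]
              for i in (rng.integers(0, n, n) for _ in range(n_boot))]
    lo, hi = np.quantile(slopes, [alpha / 2, 1 - alpha / 2])
    return lo, hi
```

On synthetic data with noise on both coordinates, the TLS slope is recovered without the attenuation bias that ordinary least squares would introduce.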
Andrea Abbate, Monica Papini,
Natural Hazards and Earth System Sciences, Volume 21, pp 2041-2058; doi:10.5194/nhess-21-2041-2021

Abstract:
This paper presents an extended reanalysis of the rainfall-induced geo-hydrological events that have occurred in the last 70 years in the alpine area of the Lombardy region, Italy. The work is focused on the description of the major meteorological triggering factors that have caused diffuse episodes of shallow landslides and debris flow. The aim of this reanalysis was to try to evaluate their magnitude quantitatively. The triggering factors were studied following two approaches. The first one started from the conventional analysis of the rainfall intensity (I) and duration (D) considering local rain gauge data and applying the I–D threshold methodology integrated with an estimation of the events' return period. We then extended this analysis and proposed a new index for the magnitude assessment (magnitude index, MI) based on frequency–magnitude theory. The MI was defined considering both the return period and the spatial extent of each rainfall episode. The second approach is based on a regional-scale analysis of meteorological triggers. In particular, the strength of the extratropical cyclone (EC) structure associated with the precipitation events was assessed through the sea level pressure tendency (SLPT) meteorological index. The latter has been estimated from the Norwegian cyclone model (NCM) theory. Both indexes have shown an agreement in ranking the event's magnitude (R2=0.88), giving a similar interpretation of the severity that was also found to be in accordance with the information reported in historical databases. This back analysis of 70 years in Valtellina identifies the MI and the SLPT as good magnitude indicators of the event, confirming that a strong cause–effect relationship exists among the EC intensity and the local rainfall recorded on the ground. 
Compared with the conventional I–D threshold methodology, which is limited to a binary estimate of the likelihood of landslide occurrence, the evaluation of the MI and the SLPT indexes makes it possible to quantify the magnitude of a rainfall episode capable of generating severe geo-hydrological hazards.
, Ivan D. Haigh, Ahmed A. Nasr, , , Robert J. Nicholls
Natural Hazards and Earth System Sciences, Volume 21, pp 2021-2040; doi:10.5194/nhess-21-2021-2021

Abstract:
In coastal regions, floods can arise through a combination of multiple drivers, including direct surface run-off, river discharge, storm surge, and waves. In this study, we analyse compound flood potential in Europe and environs caused by these four main flooding sources using state-of-the-art databases with coherent forcing (i.e. ERA5). First, we analyse the sensitivity of the compound flooding potential to several factors: (1) sampling method, (2) time window to select the concurrent event of the conditioned driver, (3) dependence metrics, and (4) wave-driven sea level definition. We observe higher correlation coefficients using annual maxima than peaks over threshold. Regarding the other factors, our results show similar spatial distributions of the compound flooding potential. Second, the dependence between the pairs of drivers using the Kendall rank correlation coefficient and the joint occurrence are synthesized for coherent patterns of compound flooding potential using a clustering technique. This quantitative multi-driver assessment not only distinguishes where overall compound flooding potential is the highest, but also discriminates which driver combinations are more likely to contribute to compound flooding. We identify that hotspots of compound flooding potential are located along the southern coast of the North Atlantic Ocean and the northern coast of the Mediterranean Sea.
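The dependence metric at the heart of this assessment, the Kendall rank correlation, and the annual-maxima sampling it is applied to can both be sketched compactly. This is the tau-a form without tie correction, an assumption; tied values in real driver data would call for the tau-b variant:

```python
import numpy as np
from itertools import combinations

def kendall_tau(x, y):
    """Kendall rank correlation (tau-a, no tie correction): the
    normalised excess of concordant over discordant pairs."""
    n = len(x)
    s = sum(np.sign((x[i] - x[j]) * (y[i] - y[j]))
            for i, j in combinations(range(n), 2))
    return 2.0 * s / (n * (n - 1))

def annual_maxima(values, years):
    """Block maxima per year, the sampling found in the study to give
    higher correlation than peaks over threshold."""
    out = {}
    for v, yr in zip(values, years):
        out[yr] = max(v, out.get(yr, -np.inf))
    return np.array([out[k] for k in sorted(out)])
```

Applied to pairs of driver series (e.g. annual-maximum surge vs. concurrent discharge), tau near zero indicates little compound-flooding potential from that pair, while large positive tau flags a hotspot candidate.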
, Konstantinos Lagouvardos, Vassiliki Kotroni, , Christos Giannaros
Natural Hazards and Earth System Sciences, Volume 21, pp 1983-2000; doi:10.5194/nhess-21-1983-2021

Abstract:
An integrated modeling approach for forecasting flood events is presented in the current study. An advanced flood forecasting model, which is based on the coupling of hydrological and atmospheric components, was used for a twofold objective: first to investigate the potential of a coupled hydrometeorological model to be used for flood forecasting at two medium-size drainage basins in the area of Attica (Greece) and second to investigate the influence of the use of the coupled hydrometeorological model on the precipitation forecast skill. For this reason, we used precipitation and hydrometric in situ data for six flood events at two selected drainage regions of Attica. The simulations were carried out with the Weather Research and Forecasting (WRF) model (WRF-only) and the WRF-Hydro system in a fully coupled mode, under which surface, subsurface, and channel hydrological processes were parameterized at a fine-resolution grid of 95 m approximately. Results showed that the coupled WRF-Hydro system was capable of producing the observed discharge during the flood episodes, after adequate calibration at the studied basins. This outcome provides confidence that the model configuration under the two-way atmospheric–hydrological coupling is robust and, thus, can be used for operational flood forecasting purposes in the area of Attica. In addition, the WRF-Hydro model showed a tendency to slightly improve the simulated precipitation in comparison to the precipitation produced by the atmospheric-only version of the model (WRF), demonstrating the capability of the coupled WRF-Hydro model to enhance the precipitation forecast skill for operational flood predictions.
Natural Hazards and Earth System Sciences, Volume 21, pp 2001-2020; doi:10.5194/nhess-21-2001-2021

Abstract:
Windstorms are a major natural hazard in many countries. The objective of this study is to identify and characterize intense windstorms during the last 4 decades in the US Northeast and determine both the sources of cyclones responsible for these events and the manner in which those cyclones differ from the cyclone climatology. The windstorm detection is based on the spatial extent of locally extreme wind speeds at 100 m height from the ERA5 reanalysis database. During the top 10 windstorms, wind speeds exceed their local 99.9th percentile over at least one-third of land-based ERA5 grid cells in this high-population-density region of the USA. Maximum sustained wind speeds at 100 m during these windstorms range from 26 to over 43 m s−1, with wind speed return periods exceeding 6.5 to 106 years (considering the top 5 % of grid cells during each storm). Property damage associated with these storms, with inflation adjusted to January 2020, ranges from USD 24 million to over USD 29 billion. Two of these windstorms are linked to decaying tropical cyclones, three are Alberta clippers, and the remaining storms are Colorado lows. Two of the 10 windstorms re-intensified off the east coast, leading to development of nor'easters. These windstorms followed frequently observed cyclone tracks but exhibit maximum intensities as measured using 700 hPa relative vorticity and mean sea level pressure that are 5–10 times the mean values for cyclones that followed similar tracks over this 40-year period. The time evolution of wind speeds and concurrent precipitation for those windstorms that occurred after the year 2000 exhibits good agreement with in situ ground-based and remote sensing observations, plus storm damage reports, indicating that the ERA5 reanalysis data have a high degree of fidelity for large, damaging windstorms such as these.
The larger pool of the top 50 windstorms exhibits evidence of only weak serial clustering, in contrast to the relatively strong serial clustering of windstorms in Europe.
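The detection criterion described above, i.e., the fraction of grid cells where wind speed exceeds its local 99.9th percentile at each time step, can be sketched directly; details of the ERA5 gridding and land masking are omitted here:

```python
import numpy as np

def storm_footprint(wind, q=0.999):
    """wind: array of shape (time, cells) of 100 m wind speeds.
    For each time step, return the fraction of cells exceeding their
    own local q-quantile, the spatial-extent measure used to rank
    windstorms."""
    thresh = np.quantile(wind, q, axis=0)  # local threshold per cell
    return (wind > thresh).mean(axis=1)    # areal fraction per time step
```

Ranking time steps by this footprint and keeping those above one-third of cells reproduces, in spirit, the top-windstorm selection the abstract describes.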
, Bryan A. Black, Yong Wei,
Natural Hazards and Earth System Sciences, Volume 21, pp 1971-1982; doi:10.5194/nhess-21-1971-2021

Abstract:
We present an investigation of the disturbance history of an old-growth Douglas-fir (Pseudotsuga menziesii) stand in South Beach, Oregon, for possible growth changes due to tsunami inundation caused by the 1700 CE Cascadia Subduction Zone (CSZ) earthquake. A high-resolution model of the 1700 tsunami run-up heights at South Beach, assuming an “L”-sized earthquake, is also presented to better estimate the inundation levels several kilometers inland at the old-growth site. This tsunami model indicates the South Beach fir stand would have been subjected to local inundation depths from 0 to 10 m. Growth chronologies collected from the Douglas-fir stand show that trees experienced significant growth reductions in the year 1700 relative to nearby Douglas-fir stands, consistent with the tsunami inundation estimates. The ±1–3-year timing of the South Beach disturbances is also consistent with disturbances previously observed at a Washington state coastal forest ∼220 km to the north. Moreover, the 1700 South Beach growth reductions were not the largest over the >321-year tree chronology at this location, with other disturbances likely caused by climate drivers (e.g., drought or windstorms). Our study represents a first step in using tree growth history to ground truth tsunami inundation models by providing site-specific physical evidence.
Natural Hazards and Earth System Sciences, Volume 21, pp 1955-1969; doi:10.5194/nhess-21-1955-2021

Abstract:
Floods are one of the most frequent and damaging natural threats worldwide. Whereas the assessment of direct impacts is well advanced, the evaluation of indirect impacts is less frequently achieved. Indirect impacts are not due to the physical contact with flood water but result, for example, from the reduced performance of infrastructures. Linear critical infrastructures (such as roads and pipes) have an interconnected nature that may lead to failure propagation, so that impacts extend far beyond the inundated areas and/or period. This work presents the risk analysis of two linear infrastructure systems, i.e. the water supply system (WSS) and the road network system. The evaluation of indirect flood impacts on the two networks is carried out for four flooding scenarios, obtained by a coupled 1D–quasi-2D hydraulic model. Two methods are used for assessing the impacts on the WSS and on the road network: a pressure-driven demand network model and a transport network disruption model respectively. The analysis is focused on the identification of (i) common impact metrics, (ii) vulnerable elements exposed to the flood, (iii) similarities and differences of the methodological aspects for the two networks, and (iv) risks due to systemic interdependency. The study presents an application to the metropolitan area of Florence (Italy). When interdependencies are accounted for, results showed that the risk to the WSS in terms of population equivalent (PE/year) can be reduced by 71.5 % and 41.8 % if timely repairs to the WSS stations are accomplished within 60 and 120 min respectively; the risk to the WSS in terms of pipe length (km yr−1) reduces by 53.1 % and 15.6 %. The study highlights that resilience is enhanced by systemic risk-informed planning, which ensures timely interventions on critical infrastructures; however, for indirect impacts and cascade effects, temporal and spatial scales are difficult to define.
Perspective research could further improve this work by applying a system-risk analysis to multiple urban infrastructures.
Natural Hazards and Earth System Sciences, Volume 21, pp 1935-1954; doi:10.5194/nhess-21-1935-2021

Abstract:
Risk assessment constitutes the first part within the risk management framework and involves evaluating the importance of a risk, either quantitatively or qualitatively. Risk assessment consists of three steps, namely risk identification, risk estimation and risk evaluation. Nevertheless, the risk management framework also includes a fourth step, i.e., the need for feedback on all the risk assessment undertakings. However, there is a lack of such feedback, which constitutes a serious deficiency in the reduction of environmental hazards at the present time. Risk identification of local or regional hazards involves hazard quantification, event monitoring including early warning systems and statistical inference. Risk identification also involves the development of a database where historical hazard information and hazard effects are included. Similarly, risk estimation involves magnitude–frequency relationships and hazard economic costs. Furthermore, risk evaluation consists of the social consequences of the derived risk and involves cost-benefit analysis and community policy. The objective of this review paper is twofold. On the one hand, it is to address meteorological hazards and extremes within the risk management framework. Analysis results and case studies over Mediterranean ecosystems with emphasis on the wider area of Greece, in the eastern Mediterranean, are presented for each of the three steps of risk assessment for several environmental hazards. The results indicate that the risk management framework constitutes an integrated approach for environmental planning and decision-making. On the other hand, it sheds light on advances and current trends in the considered meteorological and environmental hazards and extreme events, such as tornadoes, waterspouts, hailstorms, heat waves, droughts, floods, heavy convective precipitation, landslides and wildfires, using recorded datasets, model simulations and innovative methodologies.
, , Stephane Pedeboy, Dustin Hill, Marcelo Saba, Hugh Hunt, Lukas Schwalt, Christian Vergeiner, Carlos T. Mata, Carina Schumann, et al.
Natural Hazards and Earth System Sciences, Volume 21, pp 1909-1919; doi:10.5194/nhess-21-1909-2021

Abstract:
Information about lightning properties is important in order to advance the current understanding of lightning, whereby the characteristics of ground strike points (GSPs) are particularly helpful in improving the risk estimation for lightning protection. Lightning properties of a total of 1174 negative downward lightning flashes are analyzed. The high-speed video recordings are taken in different regions, including Austria, Brazil, South Africa and the USA, and are analyzed in terms of flash multiplicity, duration, interstroke intervals and ground strike point properties. To our knowledge, this is the first simultaneous analysis of GSP properties in different regions of the world applying a common methodology. Although the results vary among the data sets, the analysis reveals that a third of the flashes are single-stroke events, while the overall mean number of strokes per flash equals 3.67. From the video imagery an average of 1.56 GSPs per flash is derived, with about 60 % of the multiple-stroke flashes striking the ground in more than one place. It follows that a ground contact point is struck 2.35 times on average. Multiple-stroke flashes last on average 371 ms, whereas the geometric mean (GM) interstroke interval value preceding strokes producing a new GSP is about 18 % greater than the GM value preceding subsequent strokes following a pre-existing lightning channel. In addition, a positive correlation between the duration and multiplicity of the flash is presented. The characteristics of the subset of flashes exhibiting multiple GSPs are further examined. It follows that strokes with a stroke order of 2 create a new GSP in 60 % of the cases, while this percentage quickly drops for higher-order strokes. Further, the possibility of forming a new lightning channel to ground in terms of the number of strokes that conditioned the previous lightning channel shows that approximately 88 % developed after the occurrence of only one stroke.
Investigating the time intervals in the other 12 % of the cases, when two or more strokes re-used the previous lightning channel, showed that the average interstroke interval preceding a new lightning channel is more than twice the time difference between strokes that follow the previous lightning channel.
, , Stephane Pedeboy, Leandro Z. S. Campos, Michihiro Matsui, Dustin Hill, Marcelo Saba, Hugh Hunt
Natural Hazards and Earth System Sciences, Volume 21, pp 1921-1933; doi:10.5194/nhess-21-1921-2021

Abstract:
At present the lightning flash density is a key input parameter for assessing the risk of occurrence of a lightning strike in a particular region of interest. Since it is known that flashes tend to have more than one ground termination point on average, the use of ground strike point densities as opposed to flash densities is more appropriate. Lightning location systems (LLSs) do not directly provide ground strike point densities. However, ingesting their observations into an algorithm that groups strokes into respective ground strike points results in the sought-after density value. The aim of this study is to assess the ability of three distinct ground strike point algorithms to correctly determine the observed ground-truth strike points. The output of the algorithms is tested against a large set of ground-truth observations taken from different regions around the world, including Austria, Brazil, France, Spain, South Africa and the United States of America. These observations are linked to the observations made by a local LLS in order to retrieve the necessary parameters of each lightning discharge, which serve as input for the algorithms. Median values of the separation distance between the first stroke in the flash and subsequent ground strike points are found to vary between 1.3 and 2.75 km. It follows that all three of the algorithms perform well, with success rates of up to about 90 % to retrieve the correct type of the strokes in the flash, i.e., whether the stroke creates a new termination point or follows a pre-existing channel. The most important factor that influences the algorithms' performance is the accuracy by which the strokes are located by the LLS. Additionally, it is shown that the strokes' peak current plays an important role, whereby strokes with a larger absolute peak current have a higher probability of being correctly classified compared to the weaker strokes.
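A ground-strike-point algorithm essentially assigns each located stroke either to a nearby existing strike point or to a new one. A greedy distance-based sketch of that idea follows; the fixed distance criterion is an assumption, and the three algorithms evaluated in the study use more refined criteria tuned to LLS location accuracy:

```python
import numpy as np

def group_strike_points(stroke_locs, d_max=0.5):
    """Greedy grouping of a flash's located strokes (km coordinates)
    into ground strike points: a stroke joins the nearest existing
    strike point within d_max km, otherwise it opens a new one.
    Returns per-stroke labels and the number of strike points."""
    gsps = []     # representative location per strike point
    labels = []
    for loc in np.asarray(stroke_locs, float):
        if gsps:
            d = np.linalg.norm(np.asarray(gsps) - loc, axis=1)
            i = int(np.argmin(d))
            if d[i] <= d_max:
                labels.append(i)   # follows a pre-existing channel
                continue
        gsps.append(loc)           # new ground termination point
        labels.append(len(gsps) - 1)
    return labels, len(gsps)
```

Dividing the number of strike points by the number of flashes then yields the GSP-per-flash ratio that converts flash densities into ground strike point densities.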
Natural Hazards and Earth System Sciences, Volume 21, pp 1867-1885; doi:10.5194/nhess-21-1867-2021

Abstract:
Climate models' outputs are affected by biases that need to be detected and adjusted to model climate impacts. Many climate hazards and climate-related impacts are associated with the interaction between multiple drivers, i.e. by compound events. So far, climate model biases are typically assessed based on the hazard of interest, and it is unclear how much a potential bias in the dependence of the hazard drivers contributes to the overall bias and how the biases in the drivers interact. Here, based on copula theory, we develop a multivariate bias-assessment framework, which allows for disentangling the biases in hazard indicators in terms of the underlying univariate drivers and their statistical dependence. Based on this framework, we dissect biases in fire and heat stress hazards in a suite of global climate models by considering two simplified hazard indicators: the wet-bulb globe temperature (WBGT) and the Chandler burning index (CBI). Both indices solely rely on temperature and relative humidity. The spatial pattern of the hazard indicators is well represented by climate models. However, substantial biases exist in the representation of extreme conditions, especially in the CBI (spatial average of absolute bias: 21 °C) due to the biases driven by relative humidity (20 °C). Biases in WBGT (1.1 °C) are small compared to the biases driven by temperature (1.9 °C) and relative humidity (1.4 °C), as the two biases compensate for each other. In many regions, biases related to the statistical dependence (0.85 °C) are also important for WBGT, which indicates that well-designed physically based multivariate bias adjustment procedures should be considered for hazards and impacts that depend on multiple drivers. The proposed compound-event-oriented evaluation of climate model biases is easily applicable to other hazard types.
Furthermore, it can contribute to improved present and future risk assessments by increasing our understanding of the sources of biases in the simulation of climate impacts.
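For reference, both simplified hazard indicators can be computed from temperature and relative humidity alone. The sketch below uses the standard Chandler burning index formula and a commonly used simplified WBGT approximation based on vapour pressure; the paper's exact formulations may differ.

```python
import math

def chandler_burning_index(t_c, rh):
    # Standard Chandler burning index (t_c in deg C, rh in %).
    return (((110 - 1.373 * rh) - 0.54 * (10.20 - t_c))
            * (124 * 10 ** (-0.0142 * rh))) / 60

def simplified_wbgt(t_c, rh):
    # Simplified WBGT approximation using air temperature and water
    # vapour pressure only (no radiation or wind terms).
    e_hpa = rh / 100 * 6.105 * math.exp(17.27 * t_c / (237.7 + t_c))
    return 0.567 * t_c + 0.393 * e_hpa + 3.94
```

Hot, dry conditions drive the CBI up (high fire danger), while WBGT rises with both temperature and humidity, which is why biases in the two drivers can either compound or compensate.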
, , Anawat Suppasri, , Kwanchai Pakoksung, David Lallemant, Susanna F. Jenkins, Ingrid Charvet, Terence Chua, Amanda Cheong, et al.
Natural Hazards and Earth System Sciences, Volume 21, pp 1887-1908; doi:10.5194/nhess-21-1887-2021

Abstract:
Modern tsunami events have highlighted the vulnerability of port structures to these high-impact but infrequent occurrences. However, port planning rarely includes adaptation measures to address tsunami hazards. The 2011 Tohoku tsunami presented us with an opportunity to characterise the vulnerability of port industries to tsunami impacts. Here, we provide a spatial assessment and photographic interpretation of freely available data sources. Approximately 5000 port structures were assessed for damage and stored in a database. Using the newly developed damage database, tsunami damage to port industries is quantified statistically for the first time, through the development of damage fragility functions for eight common port industries. Compared with tsunami damage fragility functions produced for buildings from an existing damage database, our fragility functions showed higher prediction accuracies (up to 75 %). Pre-tsunami earthquake damage was also assessed in this study and was found to influence the overall damage assessment. The damage database and fragility functions for port industries can inform structural improvements and mitigation plans for ports against future events.
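An empirical fragility function of the kind developed here is often parameterised as a lognormal curve in the hazard intensity measure (e.g., tsunami flow depth). A minimal sketch, with hypothetical parameter values rather than the paper's fitted per-industry coefficients:

```python
import math
from statistics import NormalDist

def fragility(depth_m, median_m=2.0, beta=0.5):
    """Probability of reaching or exceeding a damage state at a given
    flow depth, modelled as a lognormal fragility curve. median_m and
    beta are hypothetical placeholders, not the paper's fitted
    per-industry parameters."""
    return NormalDist().cdf(math.log(depth_m / median_m) / beta)
```

By construction, the curve passes through probability 0.5 at the median depth and rises monotonically with depth.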
Natural Hazards and Earth System Sciences, Volume 21, pp 1825-1845; doi:10.5194/nhess-21-1825-2021

Abstract:
Messages on social media can be an important source of information during crisis situations. They can frequently provide details about developments much faster than traditional sources (e.g., official news) and can offer personal perspectives on events, such as opinions or specific needs. In the future, these messages can also serve to assess disaster risks. One challenge for utilizing social media in crisis situations is the reliable detection of relevant messages in a flood of data. Researchers have started to look into this problem in recent years, beginning with crowdsourced methods. Lately, approaches have shifted towards an automatic analysis of messages. A major stumbling block here is the question of exactly what messages are considered relevant or informative, as this is dependent on the specific usage scenario and the role of the user in this scenario. In this review article, we present methods for the automatic detection of crisis-related messages (tweets) on Twitter. We start by showing the varying definitions of importance and relevance relating to disasters, leading into the concept of use-case-dependent actionability that has recently become more popular and is the focal point of this review. This is followed by an overview of existing crisis-related social media data sets for evaluation and training purposes. We then compare approaches for solving the detection problem based on (1) filtering by characteristics such as keywords and location, (2) crowdsourcing, and (3) machine learning techniques. We analyze the suitability and limitations of these approaches with regard to actionability. We then point out particular challenges, such as the linguistic issues concerning social media data. Finally, we suggest future avenues of research and show connections to related tasks, such as the subsequent semantic classification of tweets.
Isidro Cantarino, Miguel Angel Carrion, Jose Sergio Palencia-Jimenez,
Natural Hazards and Earth System Sciences, Volume 21, pp 1847-1866; doi:10.5194/nhess-21-1847-2021

Abstract:
Urban expansion is a phenomenon that has been observed since the mid-20th century in more developed regions. One aspect of it is the urban development of holiday resorts with second homes, which generally appeared following world political stabilisation. This residential expansion has often happened with little control, especially in its early stages, allowing the occupation of areas that are unsuitable in environmental, cultural and landscape terms, not to mention exposed to geological risks such as flooding, earthquakes and landslides. Indeed, the risk of landslides for buildings occupying land in zones at such risk is not solely attributable to the geomorphological characteristics of the land itself, nor is it simply a question of chance; it is also due to how such land has been managed, generally because of a lack of specific regulations. This study aims to lay down objective criteria for assessing how suitable a specific local entity's risk management is by looking at the evolution of its urban development procedures. It also aims to determine what causes the incidence of landslide risk (geomorphology, chance, land management, etc.) and, finally, to suggest control tools for the public bodies tasked with monitoring such matters.
, Hassan Ahmadul, Jonathan Patz,
Natural Hazards and Earth System Sciences, Volume 21, pp 1807-1823; doi:10.5194/nhess-21-1807-2021

Abstract:
Floods are the most common and damaging natural disaster in Bangladesh, and the effects of floods on public health have increased significantly in recent decades, particularly among lower socioeconomic populations. Assessments of social vulnerability on flood-induced health outcomes typically focus on local to regional scales; a notable gap remains in comprehensive, large-scale assessments that may foster disaster management practices. In this study, socioeconomic, health, and coping capacity vulnerability and composite social-health vulnerability are assessed using both equal-weight and principal-component approaches using 26 indicators across Bangladesh. Results indicate that vulnerable zones exist in the northwest riverine areas, northeast floodplains, and southwest region, potentially affecting 42 million people (26 % of the total population). Subsequently, the vulnerability measures are linked to flood forecast and satellite inundation information to evaluate their potential for predicting actual flood impact indices (distress, damage, disruption, and health) based on the immense August 2017 flood event. Overall, the forecast-based equally weighted vulnerability measures perform best. Specifically, socioeconomic and coping capacity vulnerability measures strongly align with the distress, disruption, and health impact records observed. Additionally, the forecast-based composite social-health vulnerability index also correlates well with the impact indices, illustrating its utility in identifying predominantly vulnerable regions. These findings suggest the benefits and practicality of this approach to assess both thematic and comprehensive spatial vulnerabilities, with the potential to support targeted and coordinated public disaster management and health practices.
Enrique Guillermo Cordaro, , David Laroze
Natural Hazards and Earth System Sciences, Volume 21, pp 1785-1806; doi:10.5194/nhess-21-1785-2021

Abstract:
Several magnetic measurements and theoretical developments from different research groups have shown certain relationships with worldwide geological processes. Secular variation in geomagnetic cutoff rigidity, magnetic frequencies, or magnetic anomalies has been linked with spatial properties of active convergent tectonic margins or earthquake occurrences in recent years. These include the rise in similar fundamental frequencies in the range of microhertz before the Maule 2010, Tōhoku 2011, and Sumatra–Andaman 2004 earthquakes and the dramatic rise in the cumulative number of magnetic anomalous peaks before several earthquakes such as Nepal 2015 and Mexico (Puebla) 2017. Currently, all of these measurements have been physically explained by microcrack generation due to uniaxial stress changes in rock experiments. The basic physics of these experiments has been used to describe the lithospheric behavior in the context of the seismo-electromagnetic theory. Given the dramatic increase in experimental evidence, physical mechanisms, and theoretical framework, this paper analyzes vertical magnetic behavior close to the three latest main earthquakes in Chile: Maule 2010 (Mw 8.8), Iquique 2014 (Mw 8.2), and Illapel 2015 (Mw 8.3). The fast Fourier transform (FFT), wavelet transform, and daily cumulative number of anomalies methods were applied to quiet space weather periods within 1 year before and after each earthquake in order to filter out space weather influence. The FFT method confirms the rise in the power spectral density in the millihertz range 1 month before each earthquake, which decreases to lower values some months after earthquake occurrence. The cumulative anomaly method exhibited an increase prior to each Chilean earthquake (50–90 d before each event) similar to those found for Nepal 2015 and Mexico 2017. The wavelet analyses also show properties similar to the FFT analysis.
However, the lack of physics-based constraints in the wavelet analysis does not allow conclusions that are as strong as those made by the FFT and cumulative methods. Based on these results and previous research, these magnetic features could provide seismic information about impending events. Additionally, these results could be related to the lithosphere–atmosphere–ionosphere coupling (LAIC) effect and the growth of microcracks and electrification in rocks described by the seismo-electromagnetic theory.
, Kelley DePolt, Jamie Kruse, Anuradha Mukherji, , Ausmita Ghosh, Philip Van Wagoner
Natural Hazards and Earth System Sciences, Volume 21, pp 1759-1767; doi:10.5194/nhess-21-1759-2021

Abstract:
The simultaneous rise of tropical-cyclone-induced flood waters across a large hazard management domain can stretch rescue and recovery efforts. Here we present a means to quantify the connectedness of maximum surge during a storm with geospatial statistics. Tide gauges deployed and operating throughout the extensive estuaries and barrier islands of North Carolina during hurricanes Matthew (n=82) and Florence (n=123) are used to compare the spatial compounding of surge for these two disasters. Moran's I showed that the occurrence of maximum storm tide was more clustered for Matthew than for Florence, and a semivariogram analysis produced a spatial range of similarly timed storm tide that was 4 times as large for Matthew as for Florence. A more limited data set of fluvial flooding and precipitation in eastern North Carolina showed a consistent result – multivariate flood sources associated with Matthew were more concentrated in time than those associated with Florence. Although Matthew and Florence were equally intense, they had very different tracks and speeds, which influenced the timing of surge along the coast.
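Global Moran's I, used above to compare the clustering of maximum storm tide between the two hurricanes, can be computed directly from gauge values and coordinates. A minimal pure-Python version with inverse-distance weights (the weighting scheme is an illustrative assumption; the study's choice may differ):

```python
from math import hypot

def morans_i(values, coords):
    # Global Moran's I with inverse-distance weights (w_ii = 0);
    # assumes distinct gauge locations.
    n = len(values)
    mean = sum(values) / n
    dev = [v - mean for v in values]
    num = 0.0
    w_sum = 0.0
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            w = 1.0 / hypot(coords[i][0] - coords[j][0],
                            coords[i][1] - coords[j][1])
            num += w * dev[i] * dev[j]
            w_sum += w
    return (n / w_sum) * (num / sum(d * d for d in dev))
```

Similar values at neighbouring gauges (e.g., similarly timed surge maxima) push I towards +1, while alternating values push it negative; values near 0 indicate spatial randomness.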
Natural Hazards and Earth System Sciences, Volume 21, pp 1769-1784; doi:10.5194/nhess-21-1769-2021

Abstract:
A rainfall threshold is a function of rainfall quantities that provides the conditions beyond which the probability of debris-flow occurrence is considered significant. Many uncertainties may affect the threshold calibration and, consequently, its robustness. This study aims to assess the uncertainty in the estimate of a rainfall threshold for stony debris flow based on the backward dynamical approach, an innovative method to compute the rainfall duration and averaged intensity strictly related to a measured debris flow. The uncertainty analysis is performed with two cascaded Monte Carlo simulations: (i) to assess the variability in the event characteristics estimate due to the uncertainty in the backward dynamical approach parameters and data, and (ii) to quantify the impact of this variability on the threshold calibration. The application of this procedure to a case study highlights that the variability in the event characteristics can range from low to high, depending on the event. By contrast, the threshold coefficients show low dispersion, indicating good robustness of the threshold estimate. Moreover, the results suggest that some event features are correlated with the variability of the rainfall event duration and intensity. The proposed method is suitable for analysing the uncertainty of other threshold calibration approaches.
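The Monte Carlo cascade idea can be sketched as follows: perturb each event's duration and mean intensity, refit a power-law threshold I = a·D^b, and inspect the dispersion of the fitted coefficients. The multiplicative noise model and ordinary least-squares fit below are illustrative assumptions, not the backward dynamical approach itself.

```python
import math
import random

def fit_power_law(durations, intensities):
    # OLS fit of log I = log a + b log D.
    n = len(durations)
    xs = [math.log(d) for d in durations]
    ys = [math.log(i) for i in intensities]
    xm, ym = sum(xs) / n, sum(ys) / n
    b = (sum((x - xm) * (y - ym) for x, y in zip(xs, ys))
         / sum((x - xm) ** 2 for x in xs))
    a = math.exp(ym - b * xm)
    return a, b

def threshold_uncertainty(durations, intensities,
                          rel_err=0.2, n_sim=2000, seed=0):
    # Perturb each event's duration and intensity with multiplicative
    # noise, refit the threshold, and collect coefficient samples.
    rng = random.Random(seed)
    samples = []
    for _ in range(n_sim):
        d = [x * (1 + rng.uniform(-rel_err, rel_err)) for x in durations]
        i = [x * (1 + rng.uniform(-rel_err, rel_err)) for x in intensities]
        samples.append(fit_power_law(d, i))
    return samples
```

The spread of the sampled (a, b) pairs is then a direct measure of the robustness of the calibrated threshold.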
Tigstu T. Dullo, George K. Darkwah, , , M. Bulbul Sharif, , , Sheikh Ghafoor,
Natural Hazards and Earth System Sciences, Volume 21, pp 1739-1757; doi:10.5194/nhess-21-1739-2021

Abstract:
This study evaluates the impact of potential future climate change on flood regimes, floodplain protection, and electricity infrastructure across the Conasauga River watershed in the southeastern United States through ensemble hydrodynamic inundation modeling. The ensemble streamflow scenarios were simulated by the Distributed Hydrology Soil Vegetation Model (DHSVM) driven by (1) 1981–2012 Daymet meteorological observations and (2) 11 sets of downscaled global climate models (GCMs) during the 1966–2005 historical and 2011–2050 future periods. Surface inundation was simulated using the GPU-accelerated Two-dimensional Runoff Inundation Toolkit for Operational Needs (TRITON) hydrodynamic model. A total of 9 out of the 11 GCMs exhibit an increase in the mean ensemble flood inundation area. Moreover, at the 1 % annual exceedance probability level, the flood inundation frequency curves indicate a ∼ 16 km2 increase in floodplain area. The assessment also shows that even after flood-proofing, four of the substations could still be affected in the projected future period. The increase in floodplain area and substation vulnerability highlights the need to account for climate change in floodplain management. Overall, this study provides a proof-of-concept demonstration of how computationally intensive hydrodynamic inundation modeling can be used to enhance flood frequency maps and vulnerability assessments under changing climatic conditions.
Natural Hazards and Earth System Sciences, Volume 21, pp 1721-1738; doi:10.5194/nhess-21-1721-2021

Abstract:
Impacts upon vulnerable areas such as mountain ranges may become greater under a future scenario of adverse climatic conditions. In this sense, the concurrence of long dry spells and extremely hot temperatures can induce environmental risks such as wildfires and crop yield losses, the consequences of which could be much more serious than if these events were to occur separately in time (e.g. only long dry spells). The present study addresses recent and future changes in the following dimensions of compound dry–hot events in the Pyrenees: duration (D), magnitude (M) and extreme magnitude (EM). The analysis focuses upon changes in the extremely long dry spells and the extremely high temperatures that occur within these dry periods in order to estimate whether the internal structure of the compound event underwent a change in the observed period (1981–2015) and whether it will change in the future (2006–2100) under intermediate (RCP4.5, where RCP is representative concentration pathway) and high (RCP8.5) emission scenarios. To this end, we quantified the changes in the temporal trends of such events, as well as changes in the bivariate probability density functions for the main Pyrenean regions. The results showed that, to date, the risk of the compound event has increased in only one dimension – magnitude (including extreme magnitude) – during the last few decades. For the future, the increase in risk was found to be associated with an increase in both the magnitude and the duration (extremely long dry spells) of the compound event throughout the Pyrenees during spring under RCP8.5 and in the northernmost part of this mountain range during summer under the same scenario.
, , , Saeed Moghimi, Edward Myers, Shachak Pe'Eri, Hao-Cheng Yu
Natural Hazards and Earth System Sciences, Volume 21, pp 1703-1719; doi:10.5194/nhess-21-1703-2021

Abstract:
We study the compound flooding processes that occurred in Hurricane Florence (2018), which was accompanied by heavy precipitation, using a 3D creek-to-ocean hydrodynamic model. We examine the important role played by barrier islands in the observed compound surges in the coastal watershed. Locally very high resolution is used in some watershed areas in order to resolve small features that turn out to be critical for capturing the observed high water marks locally. The wave effects are found to be significant near barrier islands and have contributed to some observed over-toppings and breaches. Results from sensitivity tests applying each of the three major forcing factors (oceanic, fluvial, and pluvial) separately are succinctly summarized in a “dominance map” that highlights significant compound effects in most of the affected coastal watersheds, estuaries, and back bays behind the barrier islands. Operational forecasts based on the current model are being set up at NOAA to help coastal resource and emergency managers with disaster planning and mitigation efforts.
, , Shigehiro Fujino
Natural Hazards and Earth System Sciences, Volume 21, pp 1667-1683; doi:10.5194/nhess-21-1667-2021

Abstract:
The 2004 Indian Ocean tsunami caused significant economic losses and a large number of fatalities in the coastal areas. The estimation of tsunami flow conditions using inverse models has become a fundamental aspect of disaster mitigation and management. Here, a case study involving Phra Thong island in Thailand, which was affected by the 2004 Indian Ocean tsunami, was conducted using inverse modeling that incorporates a deep neural network (DNN). The DNN inverse analysis reconstructed flow conditions such as maximum inundation distance, flow velocity and maximum flow depth, as well as the sediment concentration of five grain-size classes, using the thickness and grain-size distribution of the tsunami deposit from the post-tsunami survey around Phra Thong island. The quantification of uncertainty was also reported using the jackknife method. The predicted flow conditions were compared with the observed values and with the simulated results of previous models applied to areas in and around Phra Thong island. The estimated depositional characteristics, such as volume per unit area and grain-size distribution, were in line with the measured values from the field survey. These qualitative and quantitative comparisons demonstrated that the DNN inverse model is a potential tool for estimating the physical characteristics of modern tsunamis.
Natural Hazards and Earth System Sciences, Volume 21, pp 1685-1701; doi:10.5194/nhess-21-1685-2021

Abstract:
In this study we analyze drought features at the European level over the period 1901–2019 using three drought indices: the standardized precipitation index (SPI), the standardized precipitation evapotranspiration index (SPEI), and the self-calibrated Palmer drought severity index (scPDSI). The results based on the SPEI and scPDSI indicate that Central Europe (CEU) and the Mediterranean region (MED) are becoming drier due to an increase in the potential evapotranspiration and mean air temperature, while North Europe (NEU) is becoming wetter. By contrast, the SPI does not reveal these changes in drought variability, mainly because precipitation does not exhibit a significant change, especially over CEU. The SPEI12 indicates a significant increase in both the drought frequency and the drought area over the last three decades for MED and CEU, while SPI12 does not capture these features. Thus, the performance of the SPI may be insufficient for drought analysis over regions where there is a strong warming signal. By analyzing the frequency of compound events (e.g., high temperatures and droughts), we show that the potential evapotranspiration and the mean air temperature are becoming essential components of drought occurrence over CEU and MED. This, together with the projected increase in the potential evapotranspiration under a warming climate, has significant implications concerning the future occurrence of drought events, especially for the MED and CEU regions.
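As a rough illustration of how an SPI value is obtained: precipitation totals are fitted with a gamma distribution, and each fitted cumulative probability is mapped to a standard normal deviate. The sketch below uses a method-of-moments fit; operational SPI implementations use maximum likelihood and handle zero-precipitation months with a mixed distribution.

```python
import math
from statistics import NormalDist

def gamma_cdf(x, k, theta):
    # Regularized lower incomplete gamma function via its power series.
    if x <= 0:
        return 0.0
    z = x / theta
    term = 1.0 / k
    total = term
    for n in range(1, 200):
        term *= z / (k + n)
        total += term
    return total * math.exp(-z + k * math.log(z) - math.lgamma(k))

def spi(precip):
    # Fit a gamma distribution by the method of moments and map the
    # fitted cumulative probabilities to standard normal deviates.
    n = len(precip)
    mean = sum(precip) / n
    var = sum((p - mean) ** 2 for p in precip) / n
    k = mean ** 2 / var       # shape
    theta = var / mean        # scale
    nd = NormalDist()
    return [nd.inv_cdf(min(max(gamma_cdf(p, k, theta), 1e-6), 1 - 1e-6))
            for p in precip]
```

Negative SPI values flag drier-than-normal totals; the SPEI applies the same transformation to precipitation minus potential evapotranspiration, which is why it responds to warming while the SPI does not.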
, , , Eduardo Gutiérrez, Demetrio Escobar, Melida Schliz, Alessandro Aiuppa,
Natural Hazards and Earth System Sciences, Volume 21, pp 1639-1665; doi:10.5194/nhess-21-1639-2021

Abstract:
The San Salvador volcanic complex (El Salvador) and Nejapa-Chiltepe volcanic complex (Nicaragua) have been characterized by significant variability in eruption style and vent location. Densely inhabited cities have been built on these complexes and in their surroundings, including the metropolitan areas of San Salvador (∼2.4 million people) and Managua (∼1.4 million people), respectively. In this study we present novel vent opening probability maps for these volcanic complexes, based on a multi-model approach that relies on kernel density estimators. In particular, we present thematic vent opening maps, i.e., we consider different hazardous phenomena separately, including lava emission, small-scale pyroclastic density currents, ejection of ballistic projectiles, and low-intensity pyroclastic fallout. Our volcanological dataset includes (1) the location of past vents, (2) the mapping of the main fault structures, and (3) the eruption styles of past events, obtained from critical analysis of the literature and/or inferred from volcanic deposits and morphological features observed remotely and in the field. To illustrate the effects of considering the expected eruption style in the construction of vent opening maps, we focus on the analysis of small-scale pyroclastic density currents derived from phreatomagmatic activity or from low-intensity magmatic volcanism. For the numerical simulation of these phenomena we adopted the recently developed branching energy cone model, using the program ECMapProb. Our results show that the implementation of thematic vent opening maps can produce significantly different hazard levels from those estimated with traditional, non-thematic maps.
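A kernel density estimator of the kind underlying such vent opening maps evaluates, at each grid point, a sum of kernels centred on past vent locations. A minimal 2D Gaussian-kernel sketch (the bandwidth and kernel choice here are illustrative; the study's multi-model approach selects these per hazardous phenomenon):

```python
import math

def vent_density(x, y, past_vents, bandwidth_km=1.5):
    """2D Gaussian kernel density estimate at (x, y) from past vent
    locations (coordinates in km). The bandwidth is a hypothetical
    choice, not the one selected by the study."""
    h2 = bandwidth_km ** 2
    norm = 1.0 / (2 * math.pi * h2 * len(past_vents))
    return norm * sum(
        math.exp(-((x - vx) ** 2 + (y - vy) ** 2) / (2 * h2))
        for vx, vy in past_vents)
```

Evaluating this density on a grid and normalising gives a vent opening probability map; thematic maps restrict `past_vents` to vents associated with one eruption style.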
Natural Hazards and Earth System Sciences, Volume 21, pp 1615-1637; doi:10.5194/nhess-21-1615-2021

Abstract:
Controls on landsliding have long been studied, but the potential for landslide-induced dam and lake formation has received less attention. Here, we model possible landslides and the formation of landslide dams and lakes in the Austrian Alps. We combine a slope criterion with a probabilistic approach to determine landslide release areas and volumes. We then simulate the progression and deposition of the landslides with a fluid dynamic model. We characterize the resulting landslide deposits with commonly used metrics, investigate their relation to glacial landforms and tectonic units, and discuss the roles of the drainage system and valley shape. We discover that modeled landslide dams and lakes cover a wide volume range. In line with real-world inventories, we further find that lake volume increases linearly with landslide volume in the case of efficient damming – when an exceptionally large lake is dammed by a relatively small landslide deposit. The distribution and size of potential landslide dams and lakes depend strongly on local topographic relief. For a given landslide volume, lake size depends on drainage area and valley geometry. The largest lakes form in glacial troughs, while the most efficient damming occurs where landslides block a gorge downstream of a wide valley, a situation preferentially encountered at the transition between two different tectonic units. Our results also contain inefficient damming events, a damming type that exhibits different scaling of landslide and lake metrics than efficient damming and is hardly reported in inventories. We assume that such events also occur in the real world and emphasize that their documentation is needed to better understand the effects of landsliding on the drainage system.
Natural Hazards and Earth System Sciences, Volume 21, pp 1599-1614; doi:10.5194/nhess-21-1599-2021

Abstract:
Models for predicting monetary losses from floods mainly blend data deemed to represent a single flood type and region. Moreover, these approaches largely ignore indicators of preparedness and how predictors may vary between regions and events, challenging the transferability of flood loss models. We use a flood loss database of 1812 German flood-affected households to explore how Bayesian multilevel models can estimate normalised flood damage stratified by event, region, or flood process type. Multilevel models acknowledge natural groups in the data and allow each group to learn from the others. We obtain posterior estimates that differ between flood types, with credibly varying influences of water depth, contamination, duration, implementation of property-level precautionary measures, insurance, and previous flood experience; these influences overlap across most events or regions, however. We infer that the underlying damaging processes of distinct flood types deserve further attention. Each reported flood loss and affected region involved mixed flood types, likely explaining the uncertainty in the coefficients. Our results emphasise the need to consider flood types as an important step towards applying flood loss models elsewhere. We argue that failing to do so may unduly generalise the model and systematically bias loss estimations from empirical data.
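The partial pooling at the heart of a multilevel model can be illustrated with a toy random-intercept estimator: each flood-type group mean is shrunk toward the grand mean, with sparsely observed groups shrunk the most. The known-variance setup below is a deliberate simplification of the paper's fully Bayesian models; the group names and variance values are hypothetical.

```python
def partial_pooling(groups, sigma2=1.0, tau2=0.5):
    """Random-intercept shrinkage estimator.

    groups: dict mapping flood type -> list of loss observations.
    sigma2: within-group variance; tau2: between-group variance
    (both assumed known here, unlike in a full Bayesian fit).
    """
    all_obs = [y for ys in groups.values() for y in ys]
    mu = sum(all_obs) / len(all_obs)        # grand mean
    estimates = {}
    for g, ys in groups.items():
        n = len(ys)
        ybar = sum(ys) / n
        # Precision-weighted compromise between group and grand mean.
        w = (n / sigma2) / (n / sigma2 + 1 / tau2)
        estimates[g] = w * ybar + (1 - w) * mu
    return estimates
```

A group with many observations keeps an estimate near its own mean, while a group with a single observation is pulled strongly toward the pooled value, which is exactly how multilevel models let flood types "learn from the others".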
Feifei Shen, Aiqing Shu, Hong Li, , Jinzhong Min
Natural Hazards and Earth System Sciences, Volume 21, pp 1569-1582; doi:10.5194/nhess-21-1569-2021

Abstract:
Himawari-8 is a next-generation geostationary meteorological satellite launched by the Japan Meteorological Agency. It carries the Advanced Himawari Imager (AHI) on board, which can continuously monitor high-impact weather events with high frequency in space and time. The assimilation of AHI radiance data was implemented with the three-dimensional variational data assimilation system (3DVAR) of the Weather Research and Forecasting Model for the analysis and prediction of Typhoon Soudelor (2015) in the Pacific typhoon season. Assimilating AHI radiance data proved effective in improving the forecast of the tropical cyclone during its rapid intensification. The results show that, after assimilating the AHI radiance data under clear-sky conditions, the typhoon position in the background field of the model was effectively corrected compared with the control experiment without AHI radiance data assimilation. It is found that the assimilation of AHI radiance data is able to improve the analyses of the water vapor and wind in the typhoon's inner-core region. The analyses and forecasts of the minimum sea level pressure, the maximum surface wind, and the track of the typhoon are further improved.
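The analysis step of variational assimilation blends a background state with observations according to their error covariances. A toy single-observation version of the optimal update x_a = x_b + K(y − H x_b), with K = B Hᵀ (H B Hᵀ + R)⁻¹, illustrates how observing one variable (e.g., a radiance-sensitive temperature) can also correct another (e.g., moisture) through the background error covariance; all numbers below are hypothetical.

```python
def three_dvar_update(xb, B, y, H, R):
    """Optimal analysis for one scalar observation:
    x_a = x_b + K * (y - H x_b), with K = B H^T / (H B H^T + R).

    xb: background state (list of n floats)
    B:  n x n background error covariance (list of lists)
    y:  observed value; H: observation operator row (list of n)
    R:  observation error variance
    """
    n = len(xb)
    BHt = [sum(B[i][j] * H[j] for j in range(n)) for i in range(n)]
    HBHt = sum(H[i] * BHt[i] for i in range(n))
    innovation = y - sum(H[i] * xb[i] for i in range(n))
    K = [bh / (HBHt + R) for bh in BHt]
    return [xb[i] + K[i] * innovation for i in range(n)]
```

With a background [280.0, 50.0] (temperature, humidity), B = [[1.0, 0.5], [0.5, 2.0]], and a temperature-only observation y = 282.0 (H = [1.0, 0.0], R = 1.0), the analysis moves both variables: the humidity is corrected purely through the background covariance, mirroring how radiance assimilation improves the wind and moisture analyses.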
, , Baruch Ziv, Pavel Khain
Natural Hazards and Earth System Sciences, Volume 21, pp 1583-1597; doi:10.5194/nhess-21-1583-2021

Abstract:
The study deals with an intense rainstorm that hit the Middle East between 24 and 27 April 2018 and took the lives of 13 people, 10 of them on 26 April during the deadliest flash flood in Tzafit Basin (31.0° N, 35.3° E), the Negev Desert. The rainfall observed in the southern Negev was comparable to the long-term annual rainfall there, with intensities exceeding a 75-year return period. The timing of the storm, at the end of the rainy season when rain is relatively rare and spotty, raises the question of what the atmospheric conditions were that made this rainstorm one of the most severe late-spring storms. The synoptic background was an upper-level cut-off low that formed south of a blocking high which developed over eastern Europe. The cut-off low entered the Levant near 30° N latitude and slowed its movement from ∼10 m s−1, while instability built up (1500 J kg−1, LI = 4 K) and the precipitable water reached 30 mm. The latter is explained by lower-level moisture advection from the Mediterranean and an additional contribution of mid-level moist air transport entering the region from the east. Three major rain centres were active over Israel during 26 April; only one of them was orographic, and the other two were triggered by instability and mesoscale cyclonic centres. The build-up of the instability is explained by a negative upper-level temperature anomaly over the region, caused by a northerly flow east of a blocking high that dominated eastern Europe, and by ground warming during several hours under clear skies. The intensity of this storm is attributed to an amplification of a mid-latitude disturbance which produced a cut-off low with its implied high relative vorticity, low upper-level temperatures and slow progression. All these, combined with the contribution of moisture supply, led to intense moist convection that prevailed over the region for 3 successive days.
, So Kazama, Daisuke Komori
Natural Hazards and Earth System Sciences, Volume 21, pp 1551-1567; doi:10.5194/nhess-21-1551-2021

Abstract:
In the past few decades, various natural hazards have occurred in Laos. To lower the consequences and losses caused by hazardous events, it is important to understand the magnitude of each hazard and the potential impact area. The main objective of this study was to propose a new approach to integrating hazard maps to detect hazardous areas on a national scale, where only limited data are available. The integrated hazard maps were based on a merging of five hazard maps: floods, land use changes, landslides, climate change impacts on floods, and climate change impacts on landslides. The integrated hazard map consists of six maps under three representative concentration pathway (RCP) scenarios and two time periods (near future and far future). The analytical hierarchy process (AHP) was used as a tool to combine the different hazard maps into an integrated hazard map. Comparing the far-future integrated hazard maps under the RCP2.6 and RCP4.5 scenarios, Khammouan Province shows the highest increase in the very high hazard area (16.45 %). Additionally, the very high hazard area in Khammouan Province increased by approximately 12.47 % between the far-future integrated hazard maps under the RCP4.5 and RCP8.5 scenarios. The integrated hazard maps can pinpoint dangerous areas throughout the country, and they can serve as primary data for selecting future development areas. One limitation of the AHP methodology is that it assumes linear independence of alternatives and criteria.
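In the AHP, the weights used to combine criteria (here, the five hazard maps) are commonly taken as the principal eigenvector of a reciprocal pairwise comparison matrix. A minimal power-iteration sketch, with hypothetical pairwise judgments:

```python
def ahp_weights(pairwise, iters=100):
    """Principal-eigenvector weights of a reciprocal pairwise
    comparison matrix (entry [i][j] = importance of criterion i
    relative to criterion j), computed by power iteration."""
    n = len(pairwise)
    w = [1.0 / n] * n
    for _ in range(iters):
        w_new = [sum(pairwise[i][j] * w[j] for j in range(n))
                 for i in range(n)]
        s = sum(w_new)
        w = [x / s for x in w_new]   # renormalise to sum to 1
    return w
```

For a perfectly consistent matrix built from the weights (0.5, 0.3, 0.2), the procedure recovers exactly those weights; real expert judgments are only approximately consistent, which is why AHP practice also checks a consistency ratio.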
, , , Mark Anthony M. Matera, Fibor J. Tan
Natural Hazards and Earth System Sciences, Volume 21, pp 1531-1550; doi:10.5194/nhess-21-1531-2021

Abstract:
In 2018 Typhoon Mangkhut (locally known as Typhoon Ompong) triggered thousands of landslides in the Itogon region of the Philippines. A landslide inventory of the affected region is compiled for the first time, comprising 1101 landslides over a 570 km2 area. The inventory is used to study the geomorphological characteristics and land cover more prone to landsliding, as well as the hydrometeorological conditions that led to widespread failure. The results showed that landslides mostly occurred on grassland and wooded slopes of clay superficial geology, predominantly facing east-southeast. Rainfall (Integrated Multi-satellitE Retrievals for Global Precipitation Measurement, GPM IMERG) associated with Typhoon Mangkhut is compared with 33 high-intensity rainfall events that did not trigger regional landslide events in 2018. Results show that landslides occurred during high-intensity rainfall that coincided with the highest soil moisture values (the estimated clay saturation point), according to Soil Moisture Active Passive level 4 (SMAP-L4) data. Our results demonstrate the potential of SMAP-L4 and GPM IMERG data for landslide hazard assessment and early warning where ground-based data are scarce. However, other rainfall events in the months leading up to Typhoon Mangkhut had similar or higher rainfall intensities and also occurred when soils were saturated yet did not trigger widespread landsliding, highlighting the need for further research into the conditions that trigger landslides in typhoons.
Luana Lavagnoli Moreira, , Masato Kobiyama
Natural Hazards and Earth System Sciences, Volume 21, pp 1513-1530; doi:10.5194/nhess-21-1513-2021

Abstract:
Despite the increasing body of research on flood vulnerability, a review of the methods used in the construction of vulnerability indices is still missing. Here, we address this gap by providing a state-of-the-art account of flood vulnerability indices, highlighting worldwide trends and future research directions. A total of 95 peer-reviewed articles published between 2002 and 2019 were systematically analyzed. An exponential rise in research effort is demonstrated, with 80 % of the articles being published since 2015. The majority of these studies (62.1 %) focused on the neighborhood scale, followed by the city scale (14.7 %). Min–max normalization (30.5 %), equal weighting (24.2 %), and linear aggregation (80.0 %) were the most common methods. With regard to the indicators used, a focus was given to socioeconomic aspects (e.g., population density, illiteracy rate, and gender), whilst components associated with citizens' coping and adaptive capacity were only sparsely covered. Gaps in current research include a lack of sensitivity and uncertainty analyses (present in only 9.5 % and 3.2 % of papers, respectively), inadequate or nonexistent validation of the results (present in 13.7 % of the studies), a lack of transparency regarding the rationale for weighting and indicator selection, and the use of static approaches that disregard temporal dynamics. We discuss the challenges associated with these findings for the assessment of flood vulnerability and provide a research agenda for addressing these gaps. Overall, we argue that future research should be more theoretically grounded while, at the same time, considering validation and the dynamic aspects of vulnerability.
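The three most common methods reported above (min–max normalization, equal weighting, linear aggregation) combine into a composite vulnerability index in a few lines. The district values and indicator choices below are invented for illustration and are not taken from any reviewed study.

```python
import numpy as np

# Illustrative indicator values for four districts (rows) and three
# indicators (columns): population density, illiteracy rate, share of
# elderly residents. All numbers are made up for the sketch.
X = np.array([
    [1200., 0.12, 0.08],
    [ 300., 0.25, 0.15],
    [ 800., 0.05, 0.10],
    [2000., 0.30, 0.20],
])

# Min-max normalization rescales each indicator column to [0, 1].
Xn = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))

# Equal weighting plus linear (weighted-sum) aggregation.
w = np.full(X.shape[1], 1.0 / X.shape[1])
vulnerability = Xn @ w

print(vulnerability)  # one composite score per district, in [0, 1]
```

The review's point about sensitivity analysis can be probed directly in such a sketch, e.g. by perturbing `w` and observing how district rankings change.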
, Karl W. Wegmann
Natural Hazards and Earth System Sciences, Volume 21, pp 1495-1511; doi:10.5194/nhess-21-1495-2021

Abstract:
Modern satellite networks with rapid image acquisition cycles allow for near-real-time imaging of areas impacted by natural hazards such as mass wasting, flooding, and volcanic eruptions. Publicly accessible multi-spectral datasets (e.g., Landsat, Sentinel-2) are particularly helpful in analyzing the spatial extent of disturbances; however, the datasets are large and require intensive processing on high-powered computers by trained analysts. HazMapper is an open-access hazard mapping application developed in Google Earth Engine that allows users to derive map and GIS-based products from Sentinel or Landsat datasets without the time- and cost-intensive resources required for traditional analysis. The first iteration of HazMapper relies on a vegetation-based metric, the relative difference in the normalized difference vegetation index (rdNDVI), to identify areas on the landscape where vegetation was removed following a natural disaster. Because of the vegetation-based metric, the tool is typically not suitable for use in desert or polar regions. HazMapper is not a semi-automated routine, but it makes rapid and repeatable analysis and visualization feasible for both recent and historical natural disasters. Case studies are included for the identification of landslides and debris flows, wildfires, pyroclastic flows, and lava flow inundation. HazMapper is intended for use by both scientists and non-scientists, such as emergency managers and public safety decision-makers.
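The rdNDVI metric can be sketched outside Earth Engine as well. The formula below (change in NDVI scaled by the square root of the pre-event value, expressed as a percentage) follows our reading of the rdNDVI definition in Scheip and Wegmann (2021); the tiny input rasters and the −40 flagging threshold are illustrative assumptions.

```python
import numpy as np

# Synthetic pre- and post-event NDVI composites; real inputs would be
# cloud-masked Sentinel-2 or Landsat mosaics.
ndvi_pre  = np.array([[0.80, 0.75], [0.60, 0.20]])
ndvi_post = np.array([[0.20, 0.70], [0.55, 0.18]])

# rdNDVI = (post - pre) / sqrt(|pre|) * 100; strongly negative values
# flag vegetation loss (e.g., landslide scars, burned areas).
rdndvi = (ndvi_post - ndvi_pre) / np.sqrt(np.abs(ndvi_pre)) * 100.0

# An assumed threshold to flag disturbed pixels for mapping.
disturbed = rdndvi < -40.0
print(disturbed)
```

In this toy example only the upper-left pixel, which lost most of its vegetation signal, is flagged.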
Kai Wan Yuen, Tang Thi Hanh, Vu Duong Quynh, , Paul Teng,
Natural Hazards and Earth System Sciences, Volume 21, pp 1473-1493; doi:10.5194/nhess-21-1473-2021

Abstract:
Vietnam is a major rice producer, and much of the rice grown is concentrated in the Red River Delta (RRD) and the Mekong River Delta (MRD). While the two deltas are highly productive regions, they are vulnerable to natural hazards and the effects of human-induced environmental change. To show that the processes and issues affecting food security are reinforcing, interdependent and operating at multiple scales, we used a systems-thinking approach to represent the major linkages between anthropogenic land-use and natural hazards and elaborate on how the drivers and environmental processes interact and influence rice growing area, rice yield and rice quality in the two deltas. On a local scale, demand for aquaculture and alternative crops, urban expansion, dike development, sand mining and groundwater extraction decrease rice production in the two deltas. Regionally, upstream dam construction impacts rice production in the two deltas despite being distally situated. Separately, the localized natural hazards that have adversely affected rice production include droughts, floods and typhoons. Outbreaks of pests and diseases are also common. Climate-change-induced sea level rise is a global phenomenon that will affect agricultural productivity. Notably, anthropogenic developments meant to improve agricultural productivity or increase economic growth can create many unwanted environmental consequences such as an increase in flooding, saltwater intrusion and land subsidence, which in turn decreases rice production and quality. In addition, natural hazards may amplify the problems created by human activities. Our meta-analysis highlights the ways in which a systems-thinking approach can yield more nuanced perspectives to tackle “wicked” and interrelated environmental challenges. 
Given that deltas worldwide are globally significant for food production and are highly stressed and degraded, a systems-thinking approach can be applied to provide a holistic and contextualized overview of the threats faced in each location.
Natural Hazards and Earth System Sciences, Volume 21, pp 1467-1471; doi:10.5194/nhess-21-1467-2021

, Ivan D. Haigh, Sylvie Parey,
Natural Hazards and Earth System Sciences, Volume 21, pp 1461-1465; doi:10.5194/nhess-21-1461-2021

Ali Rodríguez-Castellanos, , , Miguel A. Orellana, Alfredo Reyes-Salazar
Natural Hazards and Earth System Sciences, Volume 21, pp 1445-1460; doi:10.5194/nhess-21-1445-2021

Abstract:
For earthquake-resistant design, structural degradation is considered using traditional strength modification factors, which are obtained via the ratio of the nonlinear seismic responses of degrading and non-degrading structural single-degree-of-freedom (SDOF) systems. In this paper, with the aim of avoiding nonlinear seismic response analyses when computing strength modification factors, a methodology based on probabilistic seismic hazard analysis (PSHA) is proposed in order to obtain strength modification factors for design spectra which consider structural degradation through the spectral-shape intensity measure INp. PSHAs are performed using INp, which accounts for structural degradation, and Sa(T1), the spectral acceleration associated with the fundamental period, which does not consider such degradation. The ratio of the uniform hazard spectra in terms of INp and Sa(T1), which represent the responses of degrading and non-degrading systems, provides new strength modification factors without the need to perform nonlinear time history analyses. A mathematical expression is fitted to the ratios corresponding to systems located on different soil types. The expression is validated by comparing the results with those derived from nonlinear time history analyses of structural systems.
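As a sketch of the intensity measure involved: INp is commonly defined (following Bojórquez and Iervolino, 2011, to our understanding) as Sa(T1) scaled by a spectral-shape factor Np = Sa_avg(T1..TN)/Sa(T1) raised to a power α, with α often taken near 0.4. The spectrum, period range, and α below are illustrative assumptions, not values from the paper.

```python
import numpy as np

def i_np(periods, sa, t1, tn, alpha=0.4):
    """Spectral-shape intensity measure I_Np = Sa(T1) * Np**alpha,
    with Np the geometric-mean spectral ordinate over [T1, TN]
    normalized by Sa(T1)."""
    mask = (periods >= t1) & (periods <= tn)
    sa_t1 = np.interp(t1, periods, sa)
    sa_avg = np.exp(np.mean(np.log(sa[mask])))  # geometric mean
    return sa_t1 * (sa_avg / sa_t1) ** alpha

# Made-up smooth descending spectrum (units of g) for the sketch.
periods = np.linspace(0.1, 3.0, 30)
sa = 1.0 / (1.0 + periods)
print(i_np(periods, sa, t1=1.0, tn=2.0))
```

For a descending spectrum, Np < 1 and INp falls below Sa(T1), reflecting the spectral-shape information that plain Sa(T1) misses.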
, , Hansi Senaratne, Martin Potthast, , Benno Stein
Natural Hazards and Earth System Sciences, Volume 21, pp 1431-1444; doi:10.5194/nhess-21-1431-2021

Abstract:
Compiling and disseminating information about incidents and disasters are key to disaster management and relief. But due to inherent limitations of the acquisition process, the required information is often incomplete or missing altogether. To fill these gaps, citizen observations spread through social media are widely considered to be a promising source of relevant information, and many studies propose new methods to tap this resource. Yet, the overarching question of whether and under which circumstances social media can supply relevant information (both qualitatively and quantitatively) still remains unanswered. To shed some light on this question, we review 37 disaster and incident databases covering 27 incident types, compile a unified overview of the contained data and their collection processes, and identify the missing or incomplete information. The resulting data collection reveals six major use cases for social media analysis in incident data collection: (1) impact assessment and verification of model predictions, (2) narrative generation, (3) recruiting citizen volunteers, (4) supporting weakly institutionalized areas, (5) narrowing surveillance areas, and (6) reporting triggers for periodical surveillance. Furthermore, we discuss the benefits and shortcomings of using social media data for closing information gaps related to incidents and disasters.
Natural Hazards and Earth System Sciences, Volume 21, pp 1409-1429; doi:10.5194/nhess-21-1409-2021

Abstract:
Glacier detachments are a rare but hazardous phenomenon of glacier instability, of which only a handful have been documented to date. Common to all known cases is that many million cubic meters of ice detached from the bed of relatively low-angle valley glaciers and turned into long-runout mass flows. Recently, two such detachments were observed in the Petra Pervogo range in Tajikistan. Using a variety of satellite imagery, including Landsat 1–8, Sentinel-2, ASTER, TanDEM-X, WorldView, and Keyhole, we characterized these events and identified in total 17 mass flows involving glacier ice (detachments, ice, and rock-ice avalanches; rock avalanches falling on glaciers) that clustered in four different catchments between 1973 and 2019. The runout distances range from 2 to 19 km, and the largest detached glacier volume was 8.8×106 m3. A total of 11 out of 13 detachments, ice, or rock-ice avalanches occurred between July and September in years with mean annual air temperatures above the trend of the past 46 years. The relatively large number of locally clustered events indicates that the Petra Pervogo range has particularly favorable conditions for glacier instabilities. The images and geology of the region suggest that easily erodible lithologies are widespread. These soft lithologies may also be one reason for the high density of surging glaciers in the Petra Pervogo range and the wider Pamir region. We conclude that high temperatures, combined with soft, fine-grained sediments, may increase the likelihood of mass wasting events and appear to be critical factors facilitating the detachment of entire valley glaciers, whereas such events appear to be relatively robust against earthquakes in our study area. The observed recurrence of mass wasting events makes the Petra Pervogo range a promising candidate for witnessing glacier detachments in field studies.
, , , , Ladislava Řezníčková, Pavel Zahradníček,
Natural Hazards and Earth System Sciences, Volume 21, pp 1355-1382; doi:10.5194/nhess-21-1355-2021

Abstract:
This paper presents an analysis of fatalities attributable to weather conditions in the Czech Republic during the 2000–2019 period. The database of fatalities deployed contains information extracted from Právo, a leading daily newspaper, and Novinky.cz, its internet equivalent, supplemented by a number of other documentary sources. The analysis is performed for floods, windstorms, convective storms, rain, snow, glaze ice, frost, heat, and fog. For each of them, the associated fatalities are investigated in terms of annual frequencies, trends, annual variation, spatial distribution, cause, type, place, and time as well as the sex, age, and behaviour of casualties. There were 1164 weather-related fatalities during the 2000–2019 study period, exhibiting a statistically significant falling trend. Those attributable to frost (31 %) predominated, followed by glaze ice, rain, and snow. Fatalities were at their maximum in January and December and at their minimum in April and September. Fatalities arising out of vehicle accidents (48 %) predominated in terms of structure, followed by freezing or hypothermia (30 %). Most deaths occurred during the night. Adults (65 %) and males (72 %) accounted for the majority of fatalities, while indirect fatalities were more frequent than direct ones (55 % to 45 %). Hazardous behaviour accounted for 76 %. According to the database of the Czech Statistical Office, deaths caused by exposure to excessive natural cold are markedly predominant among five selected groups of weather-related fatalities, and their numbers exhibit a statistically significant rise during 2000–2019. Police yearbooks of the fatalities arising out of vehicle accidents indicate significantly decreasing trends in the frequency of inclement weather patterns associated with fatal accidents as well as a decrease in their percentage in annual numbers of fatalities. 
The discussion of results includes the problems of data uncertainty, comparison of different data sources, and the broader context.
, , , Norman Kerle, Jan Maarten Schraagen, Joanne Vinke-De Kruijf, Karst Geurs, Andreas Hartmann, ,
Natural Hazards and Earth System Sciences, Volume 21, pp 1383-1407; doi:10.5194/nhess-21-1383-2021

Abstract:
Infrastructure systems are inextricably tied to society by providing a variety of vital services. These systems play a fundamental role in reducing the vulnerability of communities and increasing their resilience to natural and human-induced hazards. While various definitions of resilience for infrastructure systems exist, analyzing the resilience of these systems within cross-sectoral and interdisciplinary perspectives remains limited and fragmented in research and practice. With the aim to assist researchers and practitioners in advancing understanding of resilience in designing infrastructure systems, this systematic literature review synthesizes and complements existing knowledge on designing resilient vital infrastructures by identifying (1) key conceptual tensions and challenges, (2) engineering and non-engineering measures, and (3) directions for future research. Here, a conceptual framework is developed in which infrastructures are defined as a conglomeration of interdependent social–ecological–technical systems. In addition, we define resilient infrastructures as systems with the ability to (i) anticipate and absorb disturbances, (ii) adapt/transform in response to changes, (iii) recover, and (iv) learn from prior unforeseen events. Our results indicate that conceptual and practical challenges in designing resilient infrastructures continue to exist. Hence, these systems are still being built without explicitly taking resilience into account. Our review of measures and recent applications shows that the available measures have not been widely applied in designing resilient infrastructure systems. Key concerns to address are identified as (i) the integration of social, ecological, and technical resilience of infrastructure systems with explicit attention paid to cascading effects and dependencies across these complex systems and (ii) the development of new technologies to identify factors that create different recovery characteristics.
Natural Hazards and Earth System Sciences, Volume 21, pp 1337-1354; doi:10.5194/nhess-21-1337-2021

Abstract:
Drought is understood as both a lack of water (i.e., a deficit compared to demand) and a temporal anomaly in one or more components of the hydrological cycle. Most drought indices, however, only consider the anomaly aspect, i.e., how unusual the condition is. In this paper, we present two drought hazard indices that reflect both the deficit and anomaly aspects. The soil moisture deficit anomaly index, SMDAI, is based on the drought severity index, DSI (Cammalleri et al., 2016), but is computed in a more straightforward way that does not require the definition of a mapping function. We propose a new indicator of drought hazard for water supply from rivers, the streamflow deficit anomaly index, QDAI, which takes into account the surface water demand of humans and freshwater biota. Both indices are computed and analyzed at the global scale, with a spatial resolution of roughly 50 km, for the period 1981–2010, using monthly time series of variables computed by the global water resources and use model WaterGAP 2.2d. We found that the SMDAI and QDAI values are broadly similar to values of purely anomaly-based indices. However, the deficit anomaly indices provide more differentiated spatial and temporal patterns that help to distinguish the degree and nature of the actual drought hazard to vegetation health or the water supply. QDAI can be made relevant for stakeholders with different perceptions about the importance of ecosystem protection by adapting the approach for computing the amount of water that is required to remain in the river for the well-being of the river ecosystem. Both deficit anomaly indices are well suited for inclusion in local or global drought risk studies.
Zheng Liang, , Kai Feng
Natural Hazards and Earth System Sciences, Volume 21, pp 1323-1335; doi:10.5194/nhess-21-1323-2021

Abstract:
Monitoring drought and understanding the laws of drought propagation are the basis for regional drought prevention and resistance. Multivariate drought indicators considering meteorological, agricultural and hydrological information may fully describe drought conditions. However, hydrological time series in cold and arid regions are often too short or missing, which makes drought monitoring difficult. This paper proposed a method combining the Soil and Water Assessment Tool (SWAT) and the empirical Kendall distribution function (KC′) for drought monitoring. The SWAT model, based on the principle of runoff formation, was used to simulate the hydrological variables of the drought evolution process. Three univariate drought indexes, namely meteorological drought (standardized precipitation evapotranspiration index; SPEI), agricultural drought (standardized soil moisture index; SSI) and hydrological drought (standardized streamflow drought index; SDI), were constructed using a parametric or non-parametric method to analyze the propagation time from meteorological drought to agricultural and hydrological drought. The KC′ was used to build a multivariable comprehensive meteorology–agriculture–hydrology drought index (MAHDI) that integrated meteorological, agricultural and hydrological drought to analyze the characteristics of comprehensive drought evolution. The Jinta River in the inland basin of northwestern China was used as the study area. The results showed that agricultural and hydrological drought had a seasonal lag time behind meteorological drought. The degree of drought in this basin was high in the northern and low in the southern regions. MAHDI proved to be acceptable in that it was consistent with historical drought records, could catch drought conditions characterized by the univariate drought indexes, and could capture the occurrence and end of droughts. Nevertheless, its ability to characterize mild and moderate droughts was stronger than for severe droughts. In addition, the comprehensive drought conditions showed insignificant aggravating trends in spring and summer and insignificant alleviating trends in autumn and winter and at annual scales. The results provide theoretical support for drought monitoring in the Jinta River basin, and the method makes drought monitoring possible in other watersheds lacking measured data.
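The non-parametric construction of a standardized index such as SSI or SDI can be sketched as follows: rank the sample, convert ranks to empirical probabilities via a plotting position, and map those probabilities to z-scores through the inverse standard normal CDF. The Gringorten plotting position and the sample values below are illustrative assumptions, not taken from the paper.

```python
from statistics import NormalDist

# Illustrative monthly soil moisture values (distinct values assumed,
# so the rank lookup below is unambiguous).
sm = [12.0, 15.0, 9.0, 20.0, 11.0, 18.0, 7.0, 14.0, 16.0, 10.0]

n = len(sm)
ranks = {v: r + 1 for r, v in enumerate(sorted(sm))}
norm = NormalDist()

index = []
for v in sm:
    p = (ranks[v] - 0.44) / (n + 0.12)  # Gringorten plotting position
    index.append(norm.inv_cdf(p))       # empirical probability -> z-score

# Negative values indicate drier-than-normal conditions; by common
# convention, values below about -1.0 would be classed as moderate drought.
print([round(z, 2) for z in index])
```

The same recipe applies to simulated streamflow from SWAT, which is what makes the approach usable in basins without long observed records.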
, Theodoros Economou
Natural Hazards and Earth System Sciences, Volume 21, pp 1313-1322; doi:10.5194/nhess-21-1313-2021

Abstract:
We use high-resolution (4.4 km) numerical simulations of tropical cyclones to produce exceedance probability estimates for extreme wind (gust) speeds over Bangladesh. For the first time, we estimate equivalent return periods up to and including a 1-in-200 year event, in a spatially coherent manner over all of Bangladesh, by using generalised additive models. We show that some northern provinces, up to 200 km inland, may experience conditions equal to or exceeding a very severe cyclonic storm event (maximum wind speeds ≥ 64 kn) with a likelihood equal to that of coastal regions less than 50 km inland. For the most severe super cyclonic storm events (≥ 120 kn), exceedance probabilities for 1-in-100 to 1-in-200 year events remain limited to the coastlines of southern provinces only. We demonstrate how the Bayesian interpretation of the generalised additive model can facilitate a transparent decision-making framework for tropical cyclone warnings.
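The "1-in-T year" phrasing maps directly to an annual exceedance probability of 1/T, and from there to the probability of at least one exceedance over a planning horizon. A minimal helper (ours, not from the paper) makes the arithmetic concrete:

```python
def annual_exceedance_probability(return_period_years: float) -> float:
    """Annual exceedance probability p for a 1-in-T year event."""
    return 1.0 / return_period_years

def prob_at_least_one(return_period_years: float, horizon_years: int) -> float:
    """Probability of at least one exceedance in `horizon_years` years,
    assuming independent years: 1 - (1 - p)**n."""
    p = annual_exceedance_probability(return_period_years)
    return 1.0 - (1.0 - p) ** horizon_years

# A 1-in-200 year gust event still has roughly a 22 % chance of occurring
# at least once over a 50-year horizon.
print(round(prob_at_least_one(200, 50), 3))  # -> 0.222
```

This is why even the rarest events mapped in the study remain decision-relevant over the lifetime of coastal infrastructure.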
, , Raghavendra Ashrit, Barbara Casati, Jing Chen, Caio A. S. Coelho, , , Thomas Haiden, Stephanie Landman, et al.
Natural Hazards and Earth System Sciences, Volume 21, pp 1297-1312; doi:10.5194/nhess-21-1297-2021

Abstract:
Verification of forecasts and warnings of high-impact weather is needed by meteorological centres, but how to perform it still presents many open questions, starting with which data are suitable as a reference. This paper reviews new observations which can be considered for the verification of high-impact weather and provides advice on their usage in objective verification. Two high-impact weather phenomena are considered: thunderstorms and fog. First, a framework for the verification of high-impact weather is proposed, including the definition of forecasts and observations in this context and the creation of a verification set. Then, new observations showing potential for the detection and quantification of high-impact weather are reviewed, including remote sensing datasets, products developed for nowcasting, datasets derived from telecommunication systems, data collected from citizens, reports of impacts, and claim/damage reports from insurance companies. The observation characteristics which are relevant for their usage in forecast verification are also discussed. Examples of forecast evaluation and verification are then presented, highlighting the methods which can be adopted to address the issues posed by the usage of these non-conventional observations and to objectively quantify the skill of a high-impact weather forecast.
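Once forecasts and (possibly non-conventional) observations are matched into a verification set, categorical skill is conventionally summarized from a 2×2 contingency table. The scores below are the standard probability of detection, false alarm ratio, and critical success index; the counts are invented for illustration.

```python
def verification_scores(hits: int, misses: int, false_alarms: int):
    """Standard categorical scores from a 2x2 contingency table
    (correct negatives are not needed for these three)."""
    pod = hits / (hits + misses)                 # probability of detection
    far = false_alarms / (hits + false_alarms)   # false alarm ratio
    csi = hits / (hits + misses + false_alarms)  # critical success index
    return pod, far, csi

# Illustrative tally for, e.g., fog events matched against forecasts.
pod, far, csi = verification_scores(hits=42, misses=18, false_alarms=10)
print(round(pod, 2), round(far, 2), round(csi, 2))  # -> 0.7 0.19 0.6
```

The review's central difficulty then becomes visible: with crowd-sourced or insurance-derived observations, the event counts themselves carry reporting biases that propagate into these scores.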