Hydrology and Earth System Sciences

Journal Information
ISSN / EISSN: 1027-5606 / 1607-7938
Published by: Copernicus GmbH (DOI prefix 10.5194)
Total articles: ≈ 4,959

Latest articles in this journal

Linqi Zhang, Ye Zhu, Linyong Wei, Linyan Zhang, Shanhu Jiang, Xiaoli Yang, Xiuqin Fang, et al.
Hydrology and Earth System Sciences, Volume 26, pp 3241-3261; https://doi.org/10.5194/hess-26-3241-2022

The term “flash drought” describes a type of drought with rapid onset and strong intensity, co-affected by both water-limited and energy-limited conditions. It has attracted widespread attention in related research communities due to its devastating impacts on agricultural production and natural systems. Based on a global reanalysis dataset, we identify flash droughts across China during 1979–2016 by focusing on the depletion rate of the weekly soil moisture percentile. The relationship between the rate of intensification (RI) and nine related climate variables is constructed using three machine learning (ML) techniques, namely multiple linear regression (MLR), long short-term memory (LSTM), and random forest (RF) models. On this basis, the capabilities of these algorithms in estimating RI and detecting droughts (flash droughts and traditional slowly evolving droughts) are analyzed. Results show that the RF model achieved the highest skill in both RI estimation and flash drought identification among the three approaches. Spatially, the RF-based RI performed best in southeastern China, with an average CC of 0.90 and an average RMSE of 2.6 percentiles per week, while poor performance was found in the Xinjiang region. For drought detection, all three ML techniques performed better in monitoring flash droughts than conventional slowly evolving droughts. In particular, the probability of detection (POD), false alarm ratio (FAR), and critical success index (CSI) of flash droughts derived from RF were 0.93, 0.15, and 0.80, respectively, indicating that RF is preferable for estimating the RI and monitoring flash droughts when multiple meteorological variable anomalies in the weeks adjacent to drought onset are considered.
In terms of the meteorological driving mechanisms of flash drought, negative precipitation (P) anomalies and positive potential evapotranspiration (PET) anomalies exhibited a stronger synergistic effect on flash droughts than on slowly developing droughts, along with asymmetrical compound influences in different regions of China. In the Xinjiang region, the P deficit played a dominant role in triggering the onset of flash droughts, while in southwestern China, the lack of precipitation and the enhanced evaporative demand contributed almost equally to the occurrence of flash droughts. This study enhances the understanding of flash droughts and highlights the potential of ML techniques for flash drought monitoring.
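The detection scores reported above (POD, FAR, CSI) are standard contingency-table metrics. A minimal sketch of their computation from paired binary event series; the weekly flags below are hypothetical, not data from the study:

```python
def contingency_scores(observed, predicted):
    """Compute POD, FAR and CSI from paired binary event series."""
    hits = sum(1 for o, p in zip(observed, predicted) if o and p)
    misses = sum(1 for o, p in zip(observed, predicted) if o and not p)
    false_alarms = sum(1 for o, p in zip(observed, predicted) if not o and p)
    pod = hits / (hits + misses) if hits + misses else float("nan")
    far = false_alarms / (hits + false_alarms) if hits + false_alarms else 0.0
    csi = hits / (hits + misses + false_alarms) if hits + misses + false_alarms else float("nan")
    return pod, far, csi

# Hypothetical weekly flash-drought flags (1 = flash drought in that week)
obs = [0, 1, 1, 0, 1, 0, 0, 1]
pred = [0, 1, 0, 0, 1, 1, 0, 1]
pod, far, csi = contingency_scores(obs, pred)  # 0.75, 0.25, 0.6
```

A perfect detector has POD = 1, FAR = 0 and CSI = 1; the RF values quoted in the abstract (0.93, 0.15, 0.80) are close to that ideal.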
Thi Thanh Luong, Thomas Grünwald, Christian Bernhofer
Hydrology and Earth System Sciences, Volume 26, pp 3177-3239; https://doi.org/10.5194/hess-26-3177-2022

Evaporation plays an important role in the water balance at different spatial scales. However, direct and indirect measurements of evaporation are globally scarce, and accurate estimation is a challenging task; correct process approximation in modelling terrestrial evaporation is therefore crucial. A physically based 1D lumped soil–plant–atmosphere model (BROOK90) is applied to study the role of parameter selection and meteorological input for modelled evaporation at the point scale. Building on the integration of the model into global, regional and local frameworks, we cross-combined their parameterization and forcing schemes to analyse their respective roles in the estimation of evaporation. Five sites with different land uses (grassland, cropland, deciduous broadleaf forest, and two evergreen needleleaf forests) located in Saxony, Germany, were selected for the study. All tested combinations showed good agreement with FLUXNET measurements (Kling–Gupta efficiency, KGE, values of 0.35–0.80 at the daily scale). For most of the sites, the best results were found for the calibrated model with in situ meteorological input data, while the worst were observed for the global setup. The setups' performance in the vegetation period was much higher than in the winter period. Among the tested setups, the model parameterization showed a higher spread in performance than the meteorological forcings for the field and evergreen forest sites, while the opposite was observed for the deciduous forest. Analysis of the evaporation components revealed that transpiration dominates (up to 65 %–75 %) in the vegetation period, while interception (in forests) and soil/snow evaporation (in fields) prevail in the winter months. Finally, it was found that different parameter sets affect model performance and the redistribution of evaporation components throughout the whole year, while the influence of the meteorological forcing was evident only in the summer months.
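The KGE values quoted above combine correlation, variability and bias into one score. A minimal sketch of the Kling–Gupta efficiency in its original form, KGE = 1 − √((r−1)² + (α−1)² + (β−1)²), with α the ratio of standard deviations and β the ratio of means:

```python
import math

def kge(sim, obs):
    """Kling–Gupta efficiency of simulated vs. observed series (1 is perfect)."""
    n = len(obs)
    mo = sum(obs) / n
    ms = sum(sim) / n
    so = math.sqrt(sum((o - mo) ** 2 for o in obs) / n)
    ss = math.sqrt(sum((s - ms) ** 2 for s in sim) / n)
    r = sum((s - ms) * (o - mo) for s, o in zip(sim, obs)) / (n * ss * so)
    alpha = ss / so   # variability ratio
    beta = ms / mo    # bias ratio
    return 1 - math.sqrt((r - 1) ** 2 + (alpha - 1) ** 2 + (beta - 1) ** 2)
```

A constant positive bias, for example, leaves r and α at 1 but pushes β above 1, lowering the score, which is the kind of systematic error that separates the setups compared in the study.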
Olivier Eiff, Ulrike Scherer, Jan Wienhöfer, Erwin Zehe
Hydrology and Earth System Sciences, Volume 26, pp 3125-3150; https://doi.org/10.5194/hess-26-3125-2022

Recent research explored an alternative energy-centred perspective on hydrological processes, extending beyond the classical analysis of the catchment's water balance. Particularly, streamflow and the structure of river networks have been analysed in an energy-centred framework, which allows for the incorporation of two additional physical laws: (1) energy is conserved and (2) entropy of an isolated system cannot decrease (first and second law of thermodynamics). This is helpful for understanding the self-organized geometry of river networks and open-catchment systems in general. Here we expand this perspective, by exploring how hillslope topography and the presence of rill networks control the free-energy balance of surface runoff at the hillslope scale. Special emphasis is on the transitions between laminar-, mixed- and turbulent-flow conditions of surface runoff, as they are associated with kinetic energy dissipation as well as with energy transfer to eroded sediments. Starting with a general thermodynamic framework, in a first step we analyse how typical topographic shapes of hillslopes, representing different morphological stages, control the spatial patterns of potential and kinetic energy of surface runoff and energy dissipation along the flow path during steady states. Interestingly, we find that a distinct maximum in potential energy of surface runoff emerges along the flow path, which separates upslope areas of downslope potential energy growth from downslope areas where potential energy declines. A comparison with associated erosion processes indicates that the location of this maximum depends on the relative influence of diffusive and advective flow and erosion processes. In a next step, we use this framework to analyse the energy balance of surface runoff observed during hillslope-scale rainfall simulation experiments, which provide separate measurements of flow velocities for rill and for sheet flow. 
To this end, we calibrate the physically based hydrological model Catflow, which distributes total surface runoff between a rill and a sheet flow domain, to these experiments and analyse the spatial patterns of potential energy, kinetic energy and dissipation. This reveals again the existence of a maximum of potential energy in surface runoff as well as a connection to the relative contribution of advective and diffusive processes. In the case of a strong rill flow component, the potential energy maximum is located close to the transition zone, where turbulence or at least mixed flow may emerge. Furthermore, the simulations indicate an almost equal partitioning of kinetic energy into the sheet and the rill flow component. When drawing the analogy to an electric circuit, this distribution of power and erosive forces to erode and transport sediment corresponds to a maximum power configuration.
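The emergence of an interior potential-energy maximum can be illustrated with a toy calculation: the potential-energy flux of surface runoff scales with the product of accumulated discharge (growing downslope) and elevation (declining downslope), so their product can peak partway down the hillslope. All shapes and numbers below are illustrative, not data from the experiments:

```python
# Potential-energy flux of runoff along an idealized hillslope flow path:
# P(x) = rho * g * q(x) * z(x), with specific discharge q accumulating
# downslope and elevation z declining toward the toe.
rho, g = 1000.0, 9.81            # water density (kg m^-3), gravity (m s^-2)
L = 100.0                        # hillslope length (m)
n = 101
xs = [i * L / (n - 1) for i in range(n)]
z = [10.0 * (1 - (x / L) ** 2) for x in xs]   # elevation profile (m)
q = [1e-5 * x for x in xs]                    # runoff accumulation (m^2 s^-1)
P = [rho * g * qi * zi for qi, zi in zip(q, z)]
i_max = max(range(n), key=lambda i: P[i])     # interior maximum, not at either end
```

For this profile P ∝ x(1 − x²/L²), so the maximum sits at x = L/√3 ≈ 58 m: upslope of it the potential energy of runoff grows along the flow path, downslope of it the potential energy declines, mirroring the separation described in the abstract.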
Simon J. Dadson, Douglas B. Clark, Garry D. Hayman, Dai Yamazaki, Olivia R. E. Becher, Catherine Prigent, Carlos Jiménez
Hydrology and Earth System Sciences, Volume 26, pp 3151-3175; https://doi.org/10.5194/hess-26-3151-2022

Wetlands play a key role in hydrological and biogeochemical cycles and provide multiple ecosystem services to society. However, reliable data on the extent of global inundated areas and the magnitude of their contribution to local hydrological dynamics remain surprisingly uncertain. Global hydrological models and land surface models (LSMs) include only the major inundation sources and mechanisms; quantifying the uncertainties in available data sources therefore remains a challenge. We address these problems by taking a leading global data product on inundation extents (Global Inundation Extent from Multi-Satellites, GIEMS) and matching it against predictions from a global hydrodynamic model (Catchment-based Macro-scale Floodplain model, CaMa-Flood) driven by runoff generated by a land surface model (Joint UK Land Environment Simulator, JULES). The ability of the model to reproduce the patterns and dynamics shown by the observational product is assessed in a number of case studies across the tropics, which show that it performs well in large wetland regions, with a good match between the corresponding seasonal cycles. At a finer spatial scale, we found that in our model predictions water inputs (e.g. groundwater inflow to a wetland) were underestimated relative to water outputs (e.g. infiltration and evaporation from a wetland) in some wetlands (e.g. the Sudd, Tonlé Sap), while the opposite occurred in others (e.g. the Okavango). We also found evidence for an underestimation of low levels of inundation in our satellite-based inundation data (approx. 10 % of total inundation may not be recorded). Additionally, some wetlands display a clear spatial displacement between observed and simulated inundation as a result of overestimation or underestimation of overbank flooding upstream.
This study provides timely information on inherent biases in inundation prediction and observation, which can improve our ability to make critical predictions of inundation events at both regional and global levels.
Hydrology and Earth System Sciences, Volume 26, pp 3037-3054; https://doi.org/10.5194/hess-26-3037-2022

Mountain seasonal snow cover is undergoing major changes due to global climate change. Assessments of future snow cover usually rely on physically based models, often driven by post-processed meteorology. Here, we instead propose a direct statistical adjustment of the snow cover fraction from regional climate models using long-term remote-sensing observations. We compared different bias-adjustment routines (delta change, quantile mapping, and quantile delta mapping) and explored a downscaling approach based on historical observations for the Greater Alpine Region in Europe. All bias-adjustment methods account for systematic biases, for example due to topographic smoothing, and reduce the model spread in future projections. The trend-preserving methods, delta change and quantile delta mapping, were found to be more suitable for snow cover fraction than quantile mapping. Averaged over the study region and the whole year, the snow cover fraction decreases from 12.5 % in 2001–2020 to 10.4 % (8.9 %, 11.5 %; model spread) in 2071–2100 under RCP2.6 (representative concentration pathway) and to 6.4 % (4.1 %, 7.8 %) under RCP8.5 (bias-adjusted estimates from quantile delta mapping). In addition, the changes depend strongly on season and elevation. The comparison of the statistical downscaling with a high-resolution physically based model yields similar results for the elevation range covered by the climate models, but different elevation gradients of change above and below it. The downscaling showed overall potential but requires further research. Since climate model output and remote-sensing observations are available globally, the proposed methods are potentially widely applicable, although limited to the snow cover fraction.
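Of the bias-adjustment routines compared, quantile delta mapping is the trend-preserving one the study favours. A minimal additive sketch (illustrative only; the paper applies the method to snow cover fraction fields, and implementation details such as the quantile grid are assumptions here): each future model value is mapped to its quantile in the future model distribution, and the observation-minus-model difference at that quantile of the historical distributions is added, so the model's projected change signal is preserved.

```python
import numpy as np

def quantile_delta_mapping(obs_hist, mod_hist, mod_fut, n_q=100):
    """Additive quantile delta mapping (trend-preserving bias adjustment)."""
    quantiles = np.linspace(0.01, 0.99, n_q)
    q_obs = np.quantile(obs_hist, quantiles)
    q_modh = np.quantile(mod_hist, quantiles)
    q_modf = np.quantile(mod_fut, quantiles)
    # non-exceedance probability of each future value in the future model CDF
    tau = np.interp(mod_fut, q_modf, quantiles)
    # obs-minus-model correction at that quantile of the historical period
    delta = np.interp(tau, quantiles, q_obs) - np.interp(tau, quantiles, q_modh)
    return mod_fut + delta
```

For a model with a constant bias and a simulated trend, the adjustment removes the bias while leaving the trend intact, which is exactly the property that makes it preferable to plain quantile mapping for projections.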
Steven Reece, Jens De Bruijn, Simon J. Dadson
Hydrology and Earth System Sciences, Volume 26, pp 3079-3101; https://doi.org/10.5194/hess-26-3079-2022

Neural networks have been shown to be extremely effective rainfall–runoff models, in which river discharge is predicted from meteorological inputs. However, the question remains: what have these models learned? Is it possible to extract information about the learned relationships that map inputs to outputs, and do these mappings represent known hydrological concepts? Small-scale experiments have demonstrated that the internal states of long short-term memory networks (LSTMs), a neural network architecture well suited to hydrological modelling, can be interpreted. By extracting the tensors which represent the learned translation from inputs (precipitation, temperature, and potential evapotranspiration) to outputs (discharge), this research seeks to understand what information the LSTM captures about the hydrological system. We assess the hypothesis that the LSTM replicates real-world processes and that we can extract information about these processes from its internal states. We examine the cell-state vector, which represents the memory of the LSTM, and explore the ways in which the LSTM learns to reproduce stores of water, such as soil moisture and snow cover. We use a simple regression approach to map the LSTM state vector to our target stores (soil moisture and snow). Good correlations (R2 > 0.8) between the probe outputs and the target variables of interest provide evidence that the LSTM contains information that reflects known hydrological processes, comparable with the concept of variable-capacity soil moisture stores. The implications of this study are threefold: (1) LSTMs reproduce known hydrological processes. (2) While conceptual models have theoretical assumptions embedded in the model a priori, the LSTM derives these from the data, and the learned representations are interpretable by scientists. (3) LSTMs can be used to gain an estimate of intermediate stores of water such as soil moisture.
While machine learning interpretability is still a nascent field and our approach reflects a simple technique for exploring what the model has learned, the results are robust to different initial conditions and to a variety of benchmarking experiments. We therefore argue that deep learning approaches can be used to advance our scientific goals as well as our predictive goals.
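The probing idea, regressing a target store onto the cell-state vector and checking out-of-sample R², can be sketched as follows. Random features with a hidden linear relationship to the target stand in for real LSTM cell states here, since training an LSTM is out of scope; all sizes and names are illustrative:

```python
import numpy as np

rng = np.random.default_rng(42)
T, H = 500, 64                        # timesteps, cell-state dimension
states = rng.normal(size=(T, H))      # stand-in for LSTM cell states c_t
w_hidden = rng.normal(size=H)         # hidden relationship (unknown to the probe)
soil_moisture = states @ w_hidden + 0.1 * rng.normal(size=T)

def linear_probe_r2(X, y, l2=1e-3):
    """Fit a ridge-regression probe on 80 % of the data; return holdout R^2."""
    n_train = int(0.8 * len(y))
    Xtr, Xte, ytr, yte = X[:n_train], X[n_train:], y[:n_train], y[n_train:]
    w = np.linalg.solve(Xtr.T @ Xtr + l2 * np.eye(X.shape[1]), Xtr.T @ ytr)
    resid = yte - Xte @ w
    return 1 - resid.var() / yte.var()

r2 = linear_probe_r2(states, soil_moisture)
```

A high holdout R², as in the R2 > 0.8 reported above, indicates that the state vector linearly encodes the store; a probe this simple cannot inject the relationship itself, which is what makes the result evidence about the LSTM rather than about the probe.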
François Clayer, Sigrid Haande, S. Jannicke Moe
Hydrology and Earth System Sciences, Volume 26, pp 3103-3124; https://doi.org/10.5194/hess-26-3103-2022

Freshwater management is challenging, and advance warning, a season ahead, that poor water quality is likely would allow preventative measures to be put in place. To this end, we developed a Bayesian network (BN) for seasonal lake water quality prediction. BNs have become popular in recent years, but the vast majority are discrete. Here, we developed a Gaussian Bayesian network (GBN), a simple class of continuous BN. The aim was to forecast, in spring, the mean total phosphorus (TP) and chlorophyll a (chl a) concentrations, mean water colour, and maximum cyanobacteria biovolume for the upcoming growing season (May–October) in Vansjø, a shallow nutrient-rich lake in southeastern Norway. To develop the model, we first identified controls on the interannual variability in seasonally aggregated water quality. These variables were then included in a GBN, and conditional probability densities were fitted using observations (≤39 years). GBN predictions had R2 values of 0.37 (chl a) to 0.75 (colour) and classification errors of 32 % (TP) to 17 % (cyanobacteria). For all variables except lake colour, including weather variables did not improve the predictive performance (assessed through cross-validation). Overall, we found the GBN approach to be well suited to seasonal water quality forecasting. It was straightforward to produce probabilistic predictions, including the probability of exceeding management-relevant thresholds, and the GBN could be sensibly parameterised using only the observed data, despite the small dataset. Developing a comparable discrete BN was much more subjective and time-consuming.
Although low interannual variability and high temporal autocorrelation in the study lake meant the GBN performed only slightly better than a seasonal naïve forecast (where the forecasted value is simply the value observed the previous growing season), we believe that the forecasting approach presented here could be particularly useful in areas with higher sensitivity to catchment nutrient delivery and seasonal climate and for forecasting at shorter (daily or monthly) timescales. Despite the parametric constraints of GBNs, their simplicity, together with the relative accessibility of BN software with GBN handling, means they are a good first choice for BN development with continuous variables.
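In a GBN, each node's conditional density is linear-Gaussian in its parents, which is what makes fitting from small datasets and computing threshold-exceedance probabilities straightforward. A single-parent sketch (variable names and numbers are hypothetical, and the paper's network has multiple nodes):

```python
import math

def fit_linear_gaussian(x, y):
    """Fit y | x ~ N(a + b*x, sigma^2) by least squares, the conditional
    density form used for a continuous node of a Gaussian Bayesian network."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sum((xi - mx) ** 2 for xi in x)
    a = my - b * mx
    sigma = math.sqrt(sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y)) / n)
    return a, b, sigma

def prob_exceeds(a, b, sigma, x_new, threshold):
    """P(y > threshold | x_new) under the fitted Gaussian conditional."""
    z = (threshold - (a + b * x_new)) / sigma
    return 0.5 * math.erfc(z / math.sqrt(2))
```

`prob_exceeds` is the kind of management-relevant output the abstract mentions: not just a point forecast of, say, chl a, but the probability of crossing a regulatory threshold.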
Carmen Krammer, Kepeng Song, Yifan Li, Zhiqiang Zhang
Hydrology and Earth System Sciences, Volume 26, pp 3021-3036; https://doi.org/10.5194/hess-26-3021-2022

Climate change and agricultural intensification are expected to increase soil erosion and sediment production from arable land in many regions. However, to date, most studies have been based on short-term monitoring and/or modeling, making it difficult to assess the reliability of estimated long-term changes. We present the results of a unique data set consisting of measurements of sediment loads from a 60 ha catchment – the Hydrological Open Air Laboratory (HOAL) – in Petzenkirchen, Austria, observed periodically over a time period spanning 72 years. Specifically, we compare Period I (1946–1954) and Period II (2002–2017) by fitting sediment rating curves (SRCs) for the growing and dormant seasons of each period. The results suggest a significant increase in sediment loads from Period I to Period II, from an average of 5.8 ± 3.8 t yr−1 to 60.0 ± 140.0 t yr−1. The sediment flux changed mainly due to a shift in the SRCs, given that the mean daily discharge significantly decreased from 5.0 ± 14.5 L s−1 in Period I to 3.8 ± 6.6 L s−1 in Period II. The slopes of the SRCs for the growing season and the dormant season of Period I were 0.3 and 0.8, respectively, whereas they were 1.6 and 1.7 for Period II. Climate change, considered in terms of rainfall erosivity, was not responsible for this shift, because erosivity decreased by 30.4 % from the dormant season of Period I to that of Period II, and no significant difference was found between the growing seasons of periods I and II. The change in sediment flux can instead be explained by land use and land cover change (LUCC) and by a change in land structure (i.e., the organization of land parcels). Under low- and median-streamflow conditions, the land structure in Period II (i.e., the parcel effect) had no apparent influence on sediment yield.
With increasing streamflow, it became more important in controlling sediment yield, as a result of an enhanced sediment connectivity in the landscape, leading to a dominant role under high-flow conditions. The increase in crops that make the landscape prone to erosion and the change in land uses between periods I and II led to an increase in sediment flux, although its relevance was surpassed by the effect of parcel structure change under high-flow conditions. We conclude that LUCC and land structure change should be accounted for when assessing sediment flux changes. Especially under high-flow conditions, land structure change substantially altered sediment fluxes, which is most relevant for long-term sediment loads and land degradation. Therefore, increased attention to improving land structure is needed in climate adaptation and agricultural catchment management.
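The sediment rating curves compared above take the usual power-law form L = aQ^b, fitted as a linear regression in log-log space, so the SRC "slope" quoted for each season is the exponent b. A minimal sketch with synthetic data (not the HOAL measurements):

```python
import math

def fit_rating_curve(discharge, sediment_load):
    """Fit a sediment rating curve L = a * Q^b by linear regression in
    log-log space; returns (a, b)."""
    lx = [math.log(q) for q in discharge]
    ly = [math.log(s) for s in sediment_load]
    n = len(lx)
    mx, my = sum(lx) / n, sum(ly) / n
    b = sum((x - mx) * (y - my) for x, y in zip(lx, ly)) / sum((x - mx) ** 2 for x in lx)
    a = math.exp(my - b * mx)
    return a, b
```

A steeper exponent, like the shift from 0.3–0.8 in Period I to 1.6–1.7 in Period II, means sediment load grows much faster with discharge, which is why loads rose even though mean discharge fell.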
Hydrology and Earth System Sciences, Volume 26, pp 3055-3077; https://doi.org/10.5194/hess-26-3055-2022

Given the importance of snow for different land and atmospheric processes, an accurate representation of seasonal snow evolution, including its distribution and melt volume, is imperative for water resources planning. Reliable snowmelt estimation in mountainous regions is, however, further hampered by data scarcity. This study develops relatively simple extended degree-day snow models driven by freely available snow-cover images. The approach offers relative simplicity and a plausible alternative to data-intensive models and in situ measurements, has a wide range of applicability, and allows for immediate verification with point measurements. The methodology employs readily available MODIS composite images to calibrate the snowmelt models on the spatial snow distribution, in contrast to the traditional snow-water-equivalent-based calibration. The spatial distribution of snow cover is simulated using different extended degree-day models with parameters calibrated against individual MODIS snow-cover images for cloud-free days, or against a set of images representing a period within the snow season. The study was carried out in Baden-Württemberg (Germany) and in Switzerland. The simulated snow-cover data show very good agreement with the MODIS snow-cover distribution, and the calibrated parameters exhibit relative stability across the time domain. Furthermore, different thresholds demarcating snow and no-snow pixels in both the observed and simulated snow cover were analyzed to evaluate their influence on model performance and to identify suitable values for the study regions. The melt output from these calibrated snow models was then used as standalone input to a modified Hydrologiska Byråns Vattenbalansavdelning (HBV) model without the snow component in all the study catchments, to assess its performance in comparison to a calibrated standard HBV model.
The results show an overall increase in Nash–Sutcliffe efficiency (NSE) and a reduction in uncertainty in model performance. This can be attributed to the smaller number of parameters available for calibration in the modified HBV and to the added reliability of the snow accumulation and melt processes inherent in the MODIS-calibrated snow model output. The paper highlights that calibration against readily available images allows for a flexible regional calibration of the snow-cover distribution in mountainous areas with reasonably accurate precipitation and temperature data and globally available inputs. Likewise, the study concludes that simple, targeted alterations to the processes contributing to snowmelt can help to reliably identify the snow distribution and improve hydrological simulations, owing to a better representation of snow processes in snow-dominated regimes.
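At the core of the extended degree-day models used above is the basic temperature-index routine: snow accumulates below a temperature threshold and melts in proportion to degrees above a melt threshold. A minimal sketch of that core (the study's *extended* variants add further terms, and the parameter values here are illustrative defaults, not calibrated ones):

```python
def degree_day_melt(temps, precip, ddf=3.0, t_melt=0.0, t_snow=1.0):
    """Basic degree-day snow routine: accumulate precipitation as snow below
    t_snow (deg C); melt at ddf (mm per deg C per day) above t_melt, limited
    by the available snow-water equivalent. Returns the daily melt series (mm)."""
    swe, melt = 0.0, []
    for t, p in zip(temps, precip):
        if t < t_snow:
            swe += p                              # snowfall adds to the pack
        m = min(swe, ddf * max(t - t_melt, 0.0))  # temperature-index melt
        swe -= m
        melt.append(m)
    return melt
```

Because the pack's presence or absence maps directly onto snow/no-snow pixels, a routine like this can be calibrated against MODIS snow-cover maps instead of snow-water-equivalent measurements, which is the key idea of the paper.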
Christopher Barnard, Toni Jurlina, Cinzia Mazzetti
Hydrology and Earth System Sciences, Volume 26, pp 2939-2968; https://doi.org/10.5194/hess-26-2939-2022

Streamflow forecasts provide vital information to aid emergency response preparedness and disaster risk reduction. Medium-range forecasts are created by forcing a hydrological model with output from numerical weather prediction systems. Uncertainties are unavoidably introduced throughout the system and can reduce the skill of the streamflow forecasts. Post-processing is a method used to quantify and reduce the overall uncertainties in order to improve the usefulness of the forecasts. The post-processing method that is used within the operational European Flood Awareness System is based on the model conditional processor and the ensemble model output statistics method. Using 2 years of reforecasts with daily timesteps, this method is evaluated for 522 stations across Europe. Post-processing was found to increase the skill of the forecasts at the majority of stations in terms of both the accuracy of the forecast median and the reliability of the forecast probability distribution. This improvement is seen at all lead times (up to 15 d) but is largest at short lead times. The greatest improvement was seen in low-lying, large catchments with long response times, whereas for catchments at high elevation and with very short response times the forecasts often failed to capture the magnitude of peak flows. Additionally, the quality and length of the observational time series used in the offline calibration of the method were found to be important. This evaluation of the post-processing method, and specifically the new information provided on characteristics that affect the performance of the method, will aid end users in making more informed decisions. It also highlights the potential issues that may be encountered when developing new post-processing methods.
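Skill of probabilistic forecasts like those evaluated above, in both median accuracy and distributional reliability, is commonly summarised with the continuous ranked probability score (CRPS); the abstract does not name its metrics, so this is a standard illustration rather than the study's exact procedure. For a Gaussian predictive distribution, as produced by ensemble model output statistics, the CRPS has a closed form:

```python
import math

def crps_gaussian(mu, sigma, y):
    """CRPS of a Gaussian predictive distribution N(mu, sigma^2) against an
    observation y (closed form; lower is better, 0 is a perfect point forecast)."""
    z = (y - mu) / sigma
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2 * math.pi)       # standard normal pdf
    cdf = 0.5 * (1 + math.erf(z / math.sqrt(2)))                # standard normal cdf
    return sigma * (z * (2 * cdf - 1) + 2 * pdf - 1 / math.sqrt(math.pi))
```

Averaging this score over stations and lead times before and after post-processing gives exactly the kind of lead-time-dependent improvement curve described in the abstract: a sharper, better-centred predictive distribution lowers the mean CRPS.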