Climate change impacts water supply dynamics in the Upper Rio Grande (URG) watersheds of the US Southwest, where declining snowpack and altered snowmelt patterns have been observed. While temperature and precipitation effects on streamflow often receive the primary focus, other hydroclimate variables may provide more specific insight into runoff processes, especially at regional scales and in mountainous terrain where snowpack is the dominant form of water storage. This study addresses that gap by examining streamflow generation mechanisms through multi-model inference, coupling the Bayesian Information Criterion (BIC) with Bayesian Model Averaging (BMA). We identified significant streamflow predictors and explored their relative influence over time and space across the URG watersheds. Additionally, the study compared the BIC-BMA regression model with Random Forest Regression (RFR), an ensemble machine learning (RFML) model, and validated both against unseen data. The study analyzed seasonal and long-term changes in streamflow generation mechanisms and identified emergent variables that influence streamflow, and monthly time series simulations assessed the overall prediction accuracy of the models. We evaluated the significance of the predictor variables in the proposed model and used Gini feature importance within RFML to better understand the factors driving these influences. Results revealed that the hydroclimate drivers of streamflow exhibited temporal and spatial variability with significant lag effects. The findings also highlighted the diminishing influence of snow parameters (i.e., snow cover, snow depth, snow albedo) on streamflow alongside an increasing influence of soil moisture, a pattern most evident moving from downstream areas toward upstream, higher-elevation watersheds. The evolving dynamics of snowmelt-runoff hydrology in this mountainous environment suggest a potential shift in streamflow generation pathways.
The study contributes to the broader effort to elucidate the complex interplay between hydroclimate variables and streamflow dynamics, aiding in informed water resource management decisions.
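The BIC-BMA coupling described above can be sketched as follows: fit a regression for every candidate subset of predictors, convert each model's BIC into an averaging weight via w_i ∝ exp(-0.5 ΔBIC_i), and sum weights to get each predictor's posterior inclusion probability. This is a minimal illustration on synthetic data; the predictor names and coefficients are placeholders, not the study's actual variables or results.

```python
# Hedged sketch of BIC-weighted Bayesian Model Averaging (BMA) on
# synthetic data. Predictor names are illustrative assumptions only.
import itertools
import numpy as np

rng = np.random.default_rng(0)
n = 200
names = ["snow_depth", "soil_moisture", "precip"]
X = rng.normal(size=(n, 3))
# Synthetic truth: soil moisture and precipitation drive "streamflow" y;
# snow depth is irrelevant by construction.
y = 0.8 * X[:, 1] + 0.3 * X[:, 2] + rng.normal(scale=0.5, size=n)

def ols_bic(Z, y):
    """BIC of an OLS fit (Gaussian likelihood, up to an additive constant)."""
    m, k = Z.shape
    beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
    rss = np.sum((y - Z @ beta) ** 2)
    return m * np.log(rss / m) + k * np.log(m)

# Fit every non-empty predictor subset (intercept always included).
subsets, bics = [], []
for k in range(1, len(names) + 1):
    for s in itertools.combinations(range(len(names)), k):
        Z = np.column_stack([np.ones(n), X[:, s]])
        subsets.append(s)
        bics.append(ols_bic(Z, y))

# BMA weight for model i: w_i proportional to exp(-0.5 * (BIC_i - BIC_min)).
bics = np.array(bics)
w = np.exp(-0.5 * (bics - bics.min()))
w /= w.sum()

# Posterior inclusion probability (PIP) of each predictor: total weight
# of all models that contain it.
pip = {name: float(sum(wi for wi, s in zip(w, subsets) if j in s))
       for j, name in enumerate(names)}
print(pip)
```

On this synthetic example, the two predictors that actually generate y receive inclusion probabilities near one, while the irrelevant one is penalized by the BIC's complexity term.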
The escalating global threat of forest fires, driven by global warming, requires the development of effective prediction systems to mitigate damage. This research focuses on the Madhya Pradesh (MP) and Chhattisgarh (CG) states in central India, where forest fire risk has become particularly pronounced. The primary objectives of the study are to quantify and map the spatial and temporal dynamics of forest fires over the period 2001 to 2020, and to predict future fire risk using satellite-derived datasets and machine learning techniques. Through a long-term analysis, the study revealed an alarming increase in the number of forest fire incidents in MP and CG: average annual incidents rose from 1200 (MP) and 1000 (CG) during 2001 to 2005 to 2800 and 2100, respectively, during 2016 to 2020. To predict forest fire risk, the Random Forest machine learning algorithm was adopted, utilizing satellite-derived climatic, topographical, and ecological parameters such as temperature, precipitation, solar radiation, NDVI, soil moisture, litter availability, evapotranspiration, and terrain parameters (at a monthly scale over 20 years). In forecasting fire probability for 2018-2020, the model achieved a high accuracy of 86.46% in MP and 93.78% in CG. The results highlight regions of significant forest fire likelihood in central MP and southern CG, identifying areas requiring enhanced fire management strategies. The study revealed that NDVI and rainfall play a positive role in restricting forest fires, and that their negative anomalies amplify fire risk. The study would help forest planners and administrators characterise vulnerable areas and prioritise their conservation provisions. (c) 2023 COSPAR. Published by Elsevier B.V. All rights reserved.
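The workflow described above — training a Random Forest on satellite-derived predictors and inspecting Gini (mean-decrease-in-impurity) importances — can be sketched as below. This is a minimal illustration on fabricated monthly grid-cell data, not the study's datasets or model: the feature names echo the abstract, but the fire-generating rule, sample sizes, and hyperparameters are assumptions.

```python
# Hedged sketch: Random Forest fire-occurrence classification with Gini
# feature importances, on synthetic data (all values fabricated).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 3000
temp = rng.normal(30.0, 5.0, n)        # deg C
rainfall = rng.gamma(2.0, 30.0, n)     # mm/month
ndvi = rng.uniform(0.1, 0.8, n)        # greenness index
soil_moist = rng.uniform(0.05, 0.4, n) # volumetric fraction (irrelevant here)

# Assumed synthetic rule: fire is more likely when hot, dry, and low-NDVI,
# matching the abstract's finding that NDVI and rainfall restrict fires.
logit = 0.15 * (temp - 30.0) - 0.02 * rainfall - 4.0 * (ndvi - 0.4)
fire = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

features = ["temp", "rainfall", "ndvi", "soil_moist"]
X = np.column_stack([temp, rainfall, ndvi, soil_moist])
X_tr, X_te, y_tr, y_te = train_test_split(
    X, fire, test_size=0.25, random_state=0)

rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
acc = accuracy_score(y_te, rf.predict(X_te))

# Gini importances: how much each feature reduces node impurity on average.
for name, imp in zip(features, rf.feature_importances_):
    print(f"{name}: {imp:.3f}")
print(f"held-out accuracy: {acc:.3f}")
```

In the study itself the predictors also include solar radiation, litter availability, evapotranspiration, and terrain parameters, and the model is validated on the held-out 2018-2020 period rather than a random split.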