Stephanie R. Clark, Guobin Fu, Sreekanth Janardhanan
{"title":"用于解释地下水时空预测的可解释人工智能","authors":"Stephanie R. Clark, Guobin Fu, Sreekanth Janardhanan","doi":"10.1029/2025wr041303","DOIUrl":null,"url":null,"abstract":"As machine learning models become more widely relied on for groundwater predictions, the ability to interpret and explain these predictions is increasingly important. Explainable AI (XAI) tools are addressing this challenge by enhancing model transparency. Importantly, XAI also offers an early indication of its potential in broadening the role of machine learning in groundwater research — shifting it from a predictive tool to one that deepens understanding of system dynamics. This study explores the capacity of XAI to provide comprehensive insights into groundwater system behavior over large geographic scales. Spatiotemporal variations in groundwater levels and trends across Australia's Murray-Darling Basin (MDB) are investigated. Predominant drivers of groundwater changes are identified, revealing differences across subregions and extended timeframes, including during periods of drought. Insights are revealed on a geographic scale that would be difficult to obtain using physics-based or conceptual models, though the approach is equally applicable to surrogates and emulators of these models. This framework advances the interpretability of spatiotemporal environmental predictions through the incorporation of machine learning with explainability and visualisations—demonstrating the potential for machine learning to add value in hydrological research beyond the production of accurate predictions. Although the application of explainability in hydrological machine learning models is still relatively new, it is poised to become a standard component of future analyses. Through the considered adaptation of XAI methods to hydrological settings, researchers will enhance the acceptance and applicability of machine learning models for sustainable water resource management.","PeriodicalId":23799,"journal":{"name":"Water Resources Research","volume":"26 1","pages":""},"PeriodicalIF":5.0000,"publicationDate":"2025-10-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Explainable AI for Interpreting Spatiotemporal Groundwater Predictions\",\"authors\":\"Stephanie R. Clark, Guobin Fu, Sreekanth Janardhanan\",\"doi\":\"10.1029/2025wr041303\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"As machine learning models become more widely relied on for groundwater predictions, the ability to interpret and explain these predictions is increasingly important. Explainable AI (XAI) tools are addressing this challenge by enhancing model transparency. Importantly, XAI also offers an early indication of its potential in broadening the role of machine learning in groundwater research — shifting it from a predictive tool to one that deepens understanding of system dynamics. This study explores the capacity of XAI to provide comprehensive insights into groundwater system behavior over large geographic scales. Spatiotemporal variations in groundwater levels and trends across Australia's Murray-Darling Basin (MDB) are investigated. Predominant drivers of groundwater changes are identified, revealing differences across subregions and extended timeframes, including during periods of drought. 
Insights are revealed on a geographic scale that would be difficult to obtain using physics-based or conceptual models, though the approach is equally applicable to surrogates and emulators of these models. This framework advances the interpretability of spatiotemporal environmental predictions through the incorporation of machine learning with explainability and visualisations—demonstrating the potential for machine learning to add value in hydrological research beyond the production of accurate predictions. Although the application of explainability in hydrological machine learning models is still relatively new, it is poised to become a standard component of future analyses. Through the considered adaptation of XAI methods to hydrological settings, researchers will enhance the acceptance and applicability of machine learning models for sustainable water resource management.\",\"PeriodicalId\":23799,\"journal\":{\"name\":\"Water Resources Research\",\"volume\":\"26 1\",\"pages\":\"\"},\"PeriodicalIF\":5.0000,\"publicationDate\":\"2025-10-13\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Water Resources Research\",\"FirstCategoryId\":\"89\",\"ListUrlMain\":\"https://doi.org/10.1029/2025wr041303\",\"RegionNum\":1,\"RegionCategory\":\"地球科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"ENVIRONMENTAL SCIENCES\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Water Resources Research","FirstCategoryId":"89","ListUrlMain":"https://doi.org/10.1029/2025wr041303","RegionNum":1,"RegionCategory":"地球科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"ENVIRONMENTAL SCIENCES","Score":null,"Total":0}
Explainable AI for Interpreting Spatiotemporal Groundwater Predictions
As machine learning models become more widely relied on for groundwater predictions, the ability to interpret and explain these predictions is increasingly important. Explainable AI (XAI) tools are addressing this challenge by enhancing model transparency. Importantly, XAI also offers an early indication of its potential in broadening the role of machine learning in groundwater research — shifting it from a predictive tool to one that deepens understanding of system dynamics. This study explores the capacity of XAI to provide comprehensive insights into groundwater system behavior over large geographic scales. Spatiotemporal variations in groundwater levels and trends across Australia's Murray-Darling Basin (MDB) are investigated. Predominant drivers of groundwater changes are identified, revealing differences across subregions and extended timeframes, including during periods of drought. Insights are revealed on a geographic scale that would be difficult to obtain using physics-based or conceptual models, though the approach is equally applicable to surrogates and emulators of these models. This framework advances the interpretability of spatiotemporal environmental predictions through the incorporation of machine learning with explainability and visualisations—demonstrating the potential for machine learning to add value in hydrological research beyond the production of accurate predictions. Although the application of explainability in hydrological machine learning models is still relatively new, it is poised to become a standard component of future analyses. Through the considered adaptation of XAI methods to hydrological settings, researchers will enhance the acceptance and applicability of machine learning models for sustainable water resource management.
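The abstract describes applying explainable AI to identify the predominant drivers of groundwater-level change from a trained machine learning model. As a rough illustration of what such an analysis can look like, the sketch below applies SHAP feature attributions to a gradient-boosted regressor on synthetic data. The feature names, model choice, and data are illustrative assumptions, not the authors' actual framework or the MDB dataset.

```python
# Hypothetical sketch: SHAP-style explainability applied to a groundwater-level
# regressor. Features, data, and model choice are assumptions for illustration,
# not the paper's implementation.
import numpy as np
import pandas as pd
import shap
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

# Synthetic stand-in for bore observations with candidate hydrological drivers.
rng = np.random.default_rng(0)
n = 2000
X = pd.DataFrame({
    "rainfall_mm": rng.gamma(2.0, 20.0, n),           # recent rainfall
    "evapotranspiration_mm": rng.normal(120, 25, n),  # potential ET
    "pumping_ML": rng.exponential(5.0, n),            # nearby extraction
    "river_stage_m": rng.normal(2.0, 0.5, n),         # surface-water connection
    "month": rng.integers(1, 13, n),                  # seasonality proxy
})
# Toy response: level rises with recharge, falls with pumping and ET.
y = (0.02 * X["rainfall_mm"] - 0.01 * X["evapotranspiration_mm"]
     - 0.3 * X["pumping_ML"] + 0.5 * X["river_stage_m"]
     + rng.normal(0, 0.2, n))

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = GradientBoostingRegressor(random_state=0).fit(X_train, y_train)

# Per-prediction SHAP attributions: how much each driver pushes a prediction.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X_test)

# Rank predominant drivers by mean absolute attribution (global importance).
importance = pd.Series(np.abs(shap_values).mean(axis=0), index=X.columns)
print(importance.sort_values(ascending=False))

# Visual summary of driver contributions across predictions.
shap.summary_plot(shap_values, X_test, show=False)
```

In a spatiotemporal setting such as the one described, attributions like these could be aggregated by subregion or time window to compare drivers across space and during drought periods, though the specific aggregation and visualisation choices would be the study's own.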
Journal introduction:
Water Resources Research (WRR) is an interdisciplinary journal that focuses on hydrology and water resources. It publishes original research in the natural and social sciences of water. It emphasizes the role of water in the Earth system, including physical, chemical, biological, and ecological processes in water resources research and management, as well as their social, policy, and public health implications. It encompasses observational, experimental, theoretical, analytical, numerical, and data-driven approaches that advance the science of water and its management. Submissions are evaluated for their novelty, accuracy, significance, and the broader implications of the findings.