Interpretable SHAP Model Combining Meta-learning and Vision Transformer for Lithology Classification Using Limited and Unbalanced Drilling Data in Well Logging

Impact Factor 4.8 · JCR Q1, Geosciences, Multidisciplinary · CAS Region 2 (Earth Science)
Youzhuang Sun, Shanchen Pang, Zhiyuan Zhao, Yongan Zhang
Journal: Natural Resources Research · DOI: 10.1007/s11053-024-10396-4 · Published: 2024-08-19 (Journal Article) · Citations: 0

Abstract

Recent advances in geological exploration and oil and gas development have highlighted the critical need for accurate classification and prediction of subterranean lithologies. To address this, we introduce a novel approach, the Meta-Vision Transformer (Meta-ViT). This technique synergistically combines the adaptability of meta-learning with the analytical power of the Vision Transformer (ViT). Meta-learning excels at identifying nuanced similarities across tasks, significantly enhancing learning efficiency. Simultaneously, the ViT leverages these meta-learning insights to navigate the complex landscape of geological exploration, improving lithology identification accuracy. The Meta-ViT model employs a support set to extract crucial insights through meta-learning, and a query set to test the generalizability of these insights. This dual-framework setup enables the ViT to detect various underground rock types with unprecedented precision. Additionally, by simulating diverse tasks and data scenarios, meta-learning broadens the model's scope of application. Integrating the SHAP (SHapley Additive exPlanations) model, rooted in Shapley values from cooperative game theory, greatly enhances the interpretability of rock type classifications. We also utilized the ADASYN (Adaptive Synthetic Sampling) technique to optimize sample representation, generating new samples based on existing densities to ensure uniform distribution. Our extensive testing across various training and testing set ratios showed that the Meta-ViT model dramatically outperforms traditional machine learning models. This approach not only refines learning processes but also adeptly addresses the inherent challenges of geological data analysis.
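The ADASYN step described above generates synthetic minority-class samples in proportion to local density: minority points surrounded by majority neighbors receive more synthetic copies. The paper does not publish its implementation, so the following is a minimal NumPy sketch of that idea (the function name, `k`, and the balancing rule are illustrative assumptions, not the authors' code; a production pipeline would typically use `imblearn.over_sampling.ADASYN` instead).

```python
import numpy as np

def adasyn_oversample(X, y, minority_label, k=5, seed=None):
    """Simplified ADASYN sketch: synthesize minority samples by
    interpolating toward minority neighbors, weighting each minority
    point by how many of its k nearest neighbors are majority-class."""
    rng = np.random.default_rng(seed)
    X_min = X[y == minority_label]
    n_needed = int((y != minority_label).sum() - len(X_min))  # balance classes
    if n_needed <= 0 or len(X_min) < 2:
        return X, y

    # Hardness of each minority point: fraction of majority-class
    # neighbors among its k nearest neighbors in the full data set
    # (column 0 of the sort is the point itself, so we skip it).
    d_full = np.linalg.norm(X_min[:, None, :] - X[None, :, :], axis=2)
    nn_full = np.argsort(d_full, axis=1)[:, 1:k + 1]
    hardness = (y[nn_full] != minority_label).mean(axis=1)

    if hardness.sum() == 0:                     # no class overlap: spread evenly
        weights = np.full(len(X_min), 1.0 / len(X_min))
    else:
        weights = hardness / hardness.sum()
    # Rounded allocation; the total may differ slightly from n_needed.
    counts = np.round(weights * n_needed).astype(int)

    # Neighbors within the minority class only, for interpolation.
    d_min = np.linalg.norm(X_min[:, None, :] - X_min[None, :, :], axis=2)
    nn_min = np.argsort(d_min, axis=1)[:, 1:k + 1]
    n_cols = nn_min.shape[1]

    synthetic = []
    for i, g in enumerate(counts):
        for _ in range(g):
            j = nn_min[i, rng.integers(n_cols)]
            lam = rng.random()                  # random point on the segment
            synthetic.append(X_min[i] + lam * (X_min[j] - X_min[i]))
    if synthetic:
        X = np.vstack([X, synthetic])
        y = np.concatenate([y, np.full(len(synthetic), minority_label)])
    return X, y
```

Because synthetic points lie on segments between real minority samples, the augmented minority class stays inside the region the original samples span, which is why the abstract describes the method as generating samples "based on existing densities."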


Source Journal

Natural Resources Research (Environmental Science – General Environmental Science)
CiteScore: 11.90
Self-citation rate: 11.10%
Annual publications: 151

Journal description: This journal publishes quantitative studies of natural (mainly but not limited to mineral) resources exploration, evaluation and exploitation, including environmental and risk-related aspects. Typical articles use geoscientific data or analyses to assess, test, or compare resource-related aspects. NRR covers a wide variety of resources including minerals, coal, hydrocarbon, geothermal, water, and vegetation. Case studies are welcome.