Geo-SegNet: A contrastive learning enhanced U-net for geomaterial segmentation

Qinyi Tian, Sara Goodhue, Hou Xiong, Laura E. Dalton
{"title":"Geo-SegNet: A contrastive learning enhanced U-net for geomaterial segmentation","authors":"Qinyi Tian ,&nbsp;Sara Goodhue ,&nbsp;Hou Xiong ,&nbsp;Laura E. Dalton","doi":"10.1016/j.tmater.2025.100049","DOIUrl":null,"url":null,"abstract":"<div><div>X-ray micro-computed tomography scanning and tomographic image processing is a robust method to quantify various features in geomaterials. The accuracy of the segmented results can be affected by factors including scan resolution, scanning artifacts, and human bias. To overcome these limitations, deep learning techniques are being explored to address these challenges. In the present study, a novel deep learning model called Geo-SegNet was developed to enhance segmentation accuracy over traditional U-Net models. Geo-SegNet employs contrastive learning for feature extraction by integrating this extractor as the encoder in a U-Net architecture. The model is tested using 10 feet of sandstone cores containing significant changes in porosity and pore geometries and the segmentation results are compared to common segmentation methods and U-Net. Compared to a U-Net-only model, Geo-SegNet demonstrates a 2.0 % increase in segmentation accuracy, indicating the potential of the model to improve the segmentation porosity which can also improve subsequent metrics such as permeability.</div></div>","PeriodicalId":101254,"journal":{"name":"Tomography of Materials and Structures","volume":"7 ","pages":"Article 100049"},"PeriodicalIF":0.0000,"publicationDate":"2025-01-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Tomography of Materials and Structures","FirstCategoryId":"1085","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S2949673X25000026","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

X-ray micro-computed tomography scanning and tomographic image processing are robust methods to quantify various features in geomaterials. The accuracy of the segmented results can be affected by factors including scan resolution, scanning artifacts, and human bias. Deep learning techniques are being explored to overcome these limitations. In the present study, a novel deep learning model called Geo-SegNet was developed to enhance segmentation accuracy over traditional U-Net models. Geo-SegNet employs contrastive learning for feature extraction, integrating the resulting extractor as the encoder in a U-Net architecture. The model is tested using 10 feet of sandstone cores containing significant changes in porosity and pore geometries, and the segmentation results are compared to common segmentation methods and U-Net. Compared to a U-Net-only model, Geo-SegNet demonstrates a 2.0% increase in segmentation accuracy, indicating the potential of the model to improve the segmentation of porosity, which can in turn improve derived metrics such as permeability.
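The abstract describes a two-stage design: an encoder is first trained with a contrastive objective on the CT data, then reused as the downsampling path of a U-Net for pore/grain segmentation. The paper's exact architecture, loss, and hyperparameters are not given here, so the following PyTorch sketch is only a minimal illustration of that idea; the channel widths, the SimCLR-style NT-Xent loss, the projection head, and all module names are assumptions, not the authors' implementation.

```python
# Minimal sketch of the Geo-SegNet idea: contrastive pretraining of an encoder,
# then reuse of that encoder inside a U-Net. Channel widths, the NT-Xent loss,
# and the projection head are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


def conv_block(in_ch, out_ch):
    """Two 3x3 conv + ReLU layers, the standard U-Net building block."""
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
    )


class Encoder(nn.Module):
    """Downsampling path, shared by contrastive pretraining and the U-Net."""
    def __init__(self, chs=(1, 32, 64, 128)):
        super().__init__()
        self.blocks = nn.ModuleList(conv_block(a, b) for a, b in zip(chs, chs[1:]))
        self.pool = nn.MaxPool2d(2)

    def forward(self, x):
        skips = []                      # feature maps kept for skip connections
        for block in self.blocks:
            x = block(x)
            skips.append(x)
            x = self.pool(x)
        return x, skips


def nt_xent(z1, z2, tau=0.5):
    """SimCLR-style NT-Xent contrastive loss over two views (assumed choice)."""
    z = F.normalize(torch.cat([z1, z2]), dim=1)
    n = z1.size(0)
    sim = (z @ z.t()) / tau
    mask = torch.eye(2 * n, dtype=torch.bool)
    sim = sim.masked_fill(mask, float("-inf"))  # a sample is not its own positive
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)])
    return F.cross_entropy(sim, targets)


class GeoSegNet(nn.Module):
    """U-Net whose encoder can be initialized from contrastive pretraining."""
    def __init__(self, encoder):
        super().__init__()
        self.encoder = encoder
        self.bottleneck = conv_block(128, 256)
        self.up3, self.dec3 = nn.ConvTranspose2d(256, 128, 2, 2), conv_block(256, 128)
        self.up2, self.dec2 = nn.ConvTranspose2d(128, 64, 2, 2), conv_block(128, 64)
        self.up1, self.dec1 = nn.ConvTranspose2d(64, 32, 2, 2), conv_block(64, 32)
        self.head = nn.Conv2d(32, 1, 1)  # binary pore/grain logits

    def forward(self, x):
        x, (s1, s2, s3) = self.encoder(x)
        x = self.bottleneck(x)
        x = self.dec3(torch.cat([self.up3(x), s3], dim=1))
        x = self.dec2(torch.cat([self.up2(x), s2], dim=1))
        x = self.dec1(torch.cat([self.up1(x), s1], dim=1))
        return self.head(x)


# Stage 1: contrastive pretraining on unlabeled CT slices (augmentations omitted;
# random tensors stand in for two augmented views of the same batch of slices).
encoder = Encoder()
proj = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 32))

def embed(x):
    feats, _ = encoder(x)
    return proj(feats.mean(dim=(2, 3)))  # global average pool, then project

view1, view2 = torch.rand(8, 1, 64, 64), torch.rand(8, 1, 64, 64)
nt_xent(embed(view1), embed(view2)).backward()

# Stage 2: the pretrained encoder becomes the U-Net encoder for segmentation.
model = GeoSegNet(encoder)
logits = model(torch.rand(2, 1, 64, 64))  # -> (2, 1, 64, 64)
```

In this sketch the same `Encoder` instance carries its pretrained weights into the U-Net; whether the paper freezes or fine-tunes the encoder during segmentation training is a detail the abstract does not specify.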