GeoMinLM: A Large Language Model in Geology and Mineral Survey in Yunnan Province

Yu Fu, Mingguo Wang, Chengbin Wang, Shuaixian Dong, Jianguo Chen, Jiyuan Wang, Hongping Yu, Jing Huang, Liheng Chang, Bo Wang

Journal: Ore Geology Reviews, Volume 182, Article 106638 (Q1, Geology)
DOI: 10.1016/j.oregeorev.2025.106638
Published: 2025-04-26
URL: https://www.sciencedirect.com/science/article/pii/S0169136825001982
Citations: 0
Abstract
In recent years, advances in artificial intelligence and big data technologies have produced tools and solutions for transforming the geological and mineral survey paradigm, which demands extensive geological knowledge under complex and arduous working conditions. Large language models (LLMs) have a significant advantage in generative question answering. However, general-purpose LLMs are limited when answering professional questions in a vertical domain such as geology. To overcome this challenge, we proposed and developed GeoMinLM, an LLM for geological and mineral exploration scenarios in Yunnan Province, and explored its applications in intelligent Q&A. Leveraging a proprietary dataset of 5.16 million words in geology and mineral exploration, we trained GeoMinLM on Baichuan-2, achieving superior performance through fine-tuning and hyperparameter optimization. By integrating expert knowledge via a knowledge graph, we significantly reduced hallucinations and improved professional accuracy. This study demonstrates that GeoMinLM supports accurate information retrieval and knowledge dissemination, thereby advancing intelligent methods in the geological and mineral fields.
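The abstract describes grounding the fine-tuned model's answers in a knowledge graph of expert knowledge to reduce hallucinations. The paper does not give implementation details, but the general pattern can be sketched as follows; the triples, entity names, and function names below are hypothetical illustrations, not data or code from the GeoMinLM pipeline.

```python
# Illustrative sketch of knowledge-graph-grounded Q&A (hedged: not the
# authors' actual implementation). Facts about entities mentioned in a
# question are retrieved from a toy triple store and prepended to the
# prompt, so the LLM is asked to answer from verified knowledge rather
# than from its parametric memory alone.

# Toy knowledge graph: subject -> list of (subject, predicate, object)
# triples. These example triples are invented for illustration.
KG = {
    "Gejiu deposit": [
        ("Gejiu deposit", "located_in", "Yunnan Province"),
        ("Gejiu deposit", "primary_commodity", "tin"),
    ],
    "Yunnan Province": [
        ("Yunnan Province", "part_of", "China"),
    ],
}

def retrieve_facts(question: str) -> list:
    """Return KG triples whose subject entity appears in the question."""
    facts = []
    for entity, triples in KG.items():
        if entity.lower() in question.lower():
            facts.extend(triples)
    return facts

def build_grounded_prompt(question: str) -> str:
    """Prepend retrieved facts so the model answers from them."""
    facts = retrieve_facts(question)
    context = "\n".join(f"- {s} {p} {o}" for s, p, o in facts)
    return (
        f"Known facts:\n{context}\n\n"
        f"Question: {question}\n"
        f"Answer using only the facts above."
    )

prompt = build_grounded_prompt("What is mined at the Gejiu deposit?")
```

In a real system, entity matching would use a proper NER or entity-linking step and the triple store would be a graph database; this sketch only shows why grounding constrains the model's output space and thereby curbs hallucination.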
Journal introduction:
Ore Geology Reviews aims to familiarize all earth scientists with recent advances in a number of interconnected disciplines related to the study of, and search for, ore deposits. The reviews range from brief to longer contributions, but the journal preferentially publishes manuscripts that fill the niche between the commonly shorter journal articles and comprehensive book-length coverage, and thus has a special appeal to many authors and readers.