MechGPT, a Language-Based Strategy for Mechanics and Materials Modeling That Connects Knowledge Across Scales, Disciplines and Modalities

Impact Factor 12.2 · JCR Q1 (Mechanics) · CAS Tier 1 (Engineering & Technology)
Markus J. Buehler
Journal: Applied Mechanics Reviews
DOI: 10.1115/1.4063843
Published: 2023-10-19 (Journal Article)
Citations: 1

Abstract

For centuries, researchers have sought ways to connect disparate areas of knowledge. While early scholars (Galileo, da Vinci, etc.) were experts across fields, specialization later took hold. With the advent of Artificial Intelligence, we can now explore relationships across areas (e.g., mechanics-biology) or disparate domains (e.g., failure mechanics-art). To achieve this, we use a fine-tuned Large Language Model (LLM), here for a subset of knowledge in multiscale materials failure. The approach includes using a general-purpose LLM to distill question-answer pairs from raw sources, followed by LLM fine-tuning. The resulting MechGPT LLM foundation model is used in a series of computational experiments to explore its capacity for knowledge retrieval, various language tasks, hypothesis generation, and connecting knowledge across disparate areas. While the model has some ability to recall knowledge from training, we find that LLMs are particularly useful for extracting structural insights through Ontological Knowledge Graphs. These interpretable graph structures provide explanatory insights, frameworks for new research questions, and visual representations of knowledge that can also be used in retrieval-augmented generation. Three versions of MechGPT are discussed, ranging in size from 13 billion to 70 billion parameters and reaching context lengths of more than 10,000 tokens. This provides ample capacity for sophisticated retrieval-augmented strategies; for agent-based modeling in which multiple LLMs interact collaboratively and/or adversarially; for incorporating new data from the literature or web searches; and for multimodality.
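The first stage of the approach described above — using a general-purpose LLM to distill question-answer pairs from raw source text, which then serve as fine-tuning data — can be sketched as a small pipeline. This is an illustrative sketch, not code from the paper: `query_llm` is a hypothetical stand-in for any chat-completion API, stubbed here with a fixed response so the pipeline runs offline, and the prompt wording and JSON schema are assumptions.

```python
# Sketch of Q-A pair distillation for fine-tuning data, assuming a generic
# chat-completion API. The stubbed `query_llm` and the prompt are illustrative.
import json
from typing import Callable

def query_llm(prompt: str) -> str:
    """Hypothetical LLM call; replace with a real chat-completion client."""
    # Stub: returns a fixed Q-A pair so the pipeline is runnable offline.
    return json.dumps([{"question": "What is fracture toughness?",
                        "answer": "A material's resistance to crack propagation."}])

def distill_qa_pairs(chunks: list[str], llm: Callable[[str], str]) -> list[dict]:
    """Ask the LLM to turn each source chunk into Q-A training pairs."""
    pairs = []
    for chunk in chunks:
        prompt = (
            "Read the following excerpt on materials failure and generate "
            "question-answer pairs as a JSON list of "
            '{"question": ..., "answer": ...} objects.\n\n' + chunk
        )
        pairs.extend(json.loads(llm(prompt)))
    return pairs

corpus = ["Griffith's criterion relates crack growth to the energy release rate."]
training_data = distill_qa_pairs(corpus, query_llm)
print(len(training_data))  # 1
```

In practice each distilled pair would be formatted into the fine-tuning template of the target model; the stub is only there to show the shape of the distillation loop.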
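The abstract's second key idea — interpretable Ontological Knowledge Graphs used for retrieval-augmented generation — can be illustrated with a minimal graph of (subject, relation, object) triples whose neighborhood around a query concept is serialized into LLM context. The triples and the two-hop traversal below are invented for illustration and do not come from the paper.

```python
# Minimal sketch: an ontological knowledge graph as (subject, relation, object)
# triples, with a breadth-limited walk that renders facts as context sentences
# for retrieval-augmented generation. Triples are illustrative examples.
from collections import defaultdict

triples = [
    ("spider silk", "exhibits", "hierarchical structure"),
    ("hierarchical structure", "improves", "fracture toughness"),
    ("fracture toughness", "limits", "crack propagation"),
]

graph = defaultdict(list)
for subj, rel, obj in triples:
    graph[subj].append((rel, obj))

def retrieve_context(concept: str, depth: int = 2) -> list[str]:
    """Walk outgoing edges up to `depth` hops and render them as sentences."""
    facts, frontier = [], [concept]
    for _ in range(depth):
        next_frontier = []
        for node in frontier:
            for rel, obj in graph[node]:
                facts.append(f"{node} {rel} {obj}.")
                next_frontier.append(obj)
        frontier = next_frontier
    return facts

context = retrieve_context("spider silk")
print(context)
```

The retrieved sentences would be prepended to the LLM prompt, grounding generation in the graph rather than relying solely on recall from fine-tuning.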
Source Journal Metrics
CiteScore: 28.20
Self-citation rate: 0.70%
Articles per year: 13
Review time: >12 weeks
About the journal: Applied Mechanics Reviews (AMR) is an international review journal that serves as a premier venue for dissemination of material across all subdisciplines of applied mechanics and engineering science, including fluid and solid mechanics, heat transfer, dynamics and vibration, and applications. AMR provides an archival repository for state-of-the-art and retrospective survey articles and reviews of research areas and curricular developments. The journal invites commentary on research and education policy in different countries. The journal also invites original tutorial and educational material in applied mechanics targeting non-specialist audiences, including undergraduate and K-12 students.