A 3DGS and LLM-based physical-to-virtual approach for human-robot interactive manufacturing

Impact Factor: 2.0 | JCR Q3 (Engineering, Manufacturing)
Wenhang Dong, Shufei Li, Pai Zheng, Liang Liu, Shuo Chen
{"title":"A 3DGS and LLM-based physical-to-virtual approach for human-robot interactive manufacturing","authors":"Wenhang Dong ,&nbsp;Shufei Li ,&nbsp;Pai Zheng ,&nbsp;Liang Liu ,&nbsp;Shuo Chen","doi":"10.1016/j.mfglet.2025.06.016","DOIUrl":null,"url":null,"abstract":"<div><div>With the exploration of digital transformation in the industry, the introduction of the industrial metaverse is bringing unprecedented opportunities and challenges to the manufacturing industry. In the industrial metaverse, humans can interact safely and naturally with robots in high-fidelity digital environments, enabling non-technical operators to quickly validate industrial scenarios and help optimize decision-making and production processes. However, the complexity of Three-Dimensional (3D) modeling poses a challenge to achieving this goal. Additionally, programming-based Human Robot Interaction (HRI) also presents obstacles, as operators need significant time to learn how to control robots. Therefore, this paper proposes a 3D Gaussian Splatting (3DGS) and Large Language Model (LLM)-based physical-to-virtual approach for human-robot interactive manufacturing, which further facilitates digital interaction for non-technical operators in manufacturing environments. Specifically, 3DGS is first used for rapid visualization and reconstruction of the overall scene, achieving new perspective rendering and providing a gaussian ellipsoid representation. Then mesh extraction algorithms based on gaussian representation are used to build a physical-to-virtual transfer framework. Finally, LLM is utilized for understanding natural language commands and generating virtual robot Python programming to complete robot assembly tasks. This framework is implemented in the Isaac Sim simulator, and the case study shows that the proposed framework can quickly and accurately complete physical-to-virtual transfer and accomplish robot assembly manufacturing tasks in the simulator with low code.</div></div>","PeriodicalId":38186,"journal":{"name":"Manufacturing Letters","volume":"44 ","pages":"Pages 121-128"},"PeriodicalIF":2.0000,"publicationDate":"2025-08-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Manufacturing Letters","FirstCategoryId":"1085","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S2213846325000422","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"ENGINEERING, MANUFACTURING","Score":null,"Total":0}
Citations: 0

Abstract

With the ongoing digital transformation of industry, the industrial metaverse is bringing unprecedented opportunities and challenges to manufacturing. In the industrial metaverse, humans can interact safely and naturally with robots in high-fidelity digital environments, enabling non-technical operators to quickly validate industrial scenarios and helping to optimize decision-making and production processes. However, the complexity of Three-Dimensional (3D) modeling poses a challenge to achieving this goal. Programming-based Human-Robot Interaction (HRI) presents a further obstacle, as operators need significant time to learn how to control robots. This paper therefore proposes a 3D Gaussian Splatting (3DGS) and Large Language Model (LLM)-based physical-to-virtual approach for human-robot interactive manufacturing, which facilitates digital interaction for non-technical operators in manufacturing environments. Specifically, 3DGS is first used for rapid visualization and reconstruction of the overall scene, enabling novel-view rendering and providing a Gaussian ellipsoid representation. Mesh extraction algorithms based on this Gaussian representation are then used to build a physical-to-virtual transfer framework. Finally, an LLM is used to interpret natural language commands and generate Python programs that drive the virtual robot through assembly tasks. The framework is implemented in the Isaac Sim simulator, and a case study shows that it can quickly and accurately complete physical-to-virtual transfer and accomplish robot assembly manufacturing tasks in the simulator with minimal code.
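As background for the two building blocks named in the abstract: in the standard 3DGS formulation, the scene is represented by anisotropic Gaussians G(x) = exp(-(1/2)(x - μ)^T Σ^{-1} (x - μ)) with covariance Σ = R S S^T R^T factored into a rotation R and a scaling S, which is the Gaussian ellipsoid representation from which meshes are later extracted. The sketch below illustrates, in a highly simplified and hypothetical form, the LLM-driven final step: a natural-language assembly command is turned into short Python code over a small set of robot primitives and executed. All names in the sketch (pick_object, place_object, query_llm, the prompt wording) are placeholders assumed for illustration; they are not the paper's implementation or the Isaac Sim API.

```python
# Minimal, hypothetical sketch (not the authors' code) of the LLM step described
# in the abstract: a natural-language assembly command is translated into short
# Python over a small whitelist of robot primitives, then executed. The
# primitives, the prompt, and the query_llm stub are assumptions for illustration.

def pick_object(name: str) -> None:
    """Placeholder: command the simulated robot to grasp the named part."""
    print(f"[sim] pick {name}")

def place_object(name: str, target: str) -> None:
    """Placeholder: command the simulated robot to place a part on a target."""
    print(f"[sim] place {name} on {target}")

ROBOT_API = {"pick_object": pick_object, "place_object": place_object}

PROMPT_TEMPLATE = (
    "You control a simulated assembly robot. Respond with Python only, calling "
    "pick_object(name) and place_object(name, target). Task: {command}"
)

def query_llm(prompt: str) -> str:
    """Stand-in for a real chat-completion request; returns a canned program
    so this sketch runs offline."""
    return "pick_object('gear')\nplace_object('gear', 'shaft')\n"

def run_command(command: str) -> None:
    """Translate one natural-language command into code and execute it,
    exposing only the whitelisted robot primitives to the generated program."""
    code = query_llm(PROMPT_TEMPLATE.format(command=command))
    exec(code, {"__builtins__": {}, **ROBOT_API})

if __name__ == "__main__":
    run_command("Pick up the gear and mount it on the shaft.")
```

In the paper's setting, query_llm would be an actual LLM call and the primitives would act on the reconstructed scene inside Isaac Sim; restricting the generated program to a small whitelist of primitives is one common way to keep the interaction "low code" and safe to execute.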
Source journal: Manufacturing Letters (Engineering - Industrial and Manufacturing Engineering)
CiteScore: 4.20
Self-citation rate: 5.10%
Articles published: 192
Review time: 60 days