Wenhang Dong, Shufei Li, Pai Zheng, Liang Liu, Shuo Chen
Manufacturing Letters, Volume 44, Pages 121-128. Published 2025-08-01. DOI: 10.1016/j.mfglet.2025.06.016. Available at: https://www.sciencedirect.com/science/article/pii/S2213846325000422
A 3DGS and LLM-based physical-to-virtual approach for human-robot interactive manufacturing
As digital transformation advances across industry, the industrial metaverse is bringing unprecedented opportunities and challenges to manufacturing. In the industrial metaverse, humans can interact safely and naturally with robots in high-fidelity digital environments, enabling non-technical operators to quickly validate industrial scenarios and helping to optimize decision-making and production processes. However, the complexity of Three-Dimensional (3D) modeling poses a challenge to achieving this goal. In addition, programming-based Human-Robot Interaction (HRI) presents its own obstacle, as operators need significant time to learn how to control robots. This paper therefore proposes a 3D Gaussian Splatting (3DGS) and Large Language Model (LLM)-based physical-to-virtual approach for human-robot interactive manufacturing, which facilitates digital interaction for non-technical operators in manufacturing environments. Specifically, 3DGS is first used for rapid visualization and reconstruction of the overall scene, enabling novel-view rendering and providing a Gaussian ellipsoid representation. Mesh extraction algorithms based on the Gaussian representation are then used to build a physical-to-virtual transfer framework. Finally, an LLM is used to interpret natural-language commands and generate Python programs for the virtual robot to complete assembly tasks. The framework is implemented in the Isaac Sim simulator, and the case study shows that it can quickly and accurately complete physical-to-virtual transfer and accomplish robot assembly tasks in the simulator with minimal code.
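The command-to-program step described in the abstract can be sketched as a small pipeline: a natural-language instruction is passed to an LLM, which returns a structured action plan that is then translated into robot calls. This is a minimal illustrative sketch, not the authors' implementation; the function names, the JSON plan schema, and the action vocabulary (`pick`/`place`) are all assumptions, and the LLM call is stubbed with a fixed response so the flow is runnable offline.

```python
import json

def llm_generate_plan(instruction: str) -> str:
    """Stand-in for an LLM call; returns a JSON action plan.

    A real system would prompt the model with a description of the
    reconstructed scene plus the operator's instruction, and receive
    robot program text or a structured plan in response.
    """
    # Fixed stub response; the schema here is a hypothetical example.
    return json.dumps([
        {"action": "pick", "target": "gear_A"},
        {"action": "place", "target": "shaft_1"},
    ])

def execute_plan(plan_json: str) -> list:
    """Translate the plan into (simulated) robot calls.

    In the paper's setting each step would map to a call against the
    virtual robot in Isaac Sim; here we only log the mapping.
    """
    log = []
    for step in json.loads(plan_json):
        log.append(f"{step['action']} -> {step['target']}")
    return log

if __name__ == "__main__":
    plan = llm_generate_plan("Assemble gear_A onto shaft_1")
    print(execute_plan(plan))
```

The design point this sketch mirrors is the separation of concerns: the LLM handles language understanding and plan generation, while a deterministic executor maps each plan step onto simulator APIs, which keeps the robot-facing code auditable even when the language model's output varies.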