A design framework for high-fidelity human-centric digital twin of collaborative work cell in Industry 5.0

Impact Factor 12.2 · JCR Q1 (Engineering, Industrial) · CAS Region 1 (Engineering & Technology)
Tianyu Wang , Zhihao Liu , Lihui Wang , Mian Li , Xi Vincent Wang
DOI: 10.1016/j.jmsy.2025.02.018
Journal: Journal of Manufacturing Systems, Volume 80, Pages 140–156
Published: 2025-03-06 (Journal Article)
Full text: https://www.sciencedirect.com/science/article/pii/S0278612525000561
Citations: 0

Abstract

Digital Twins (DTs) of manufacturing systems, mainly involving materials and machines, have been widely explored over the past decades to facilitate the mass customization of modern products. Recently, the new vision of Industry 5.0 has brought human operators back to the core of work cells. To this end, designing human-centric DT systems is vital for an ergonomic and symbiotic working environment. However, one major challenge is the construction and utilization of high-fidelity digital human models. In the literature, preset universal human avatar models such as skeletons are mostly employed to represent human operators, which overlooks individual differences in physical traits. Moreover, fundamental utilization features such as motion tracking and procedure recognition still do not adequately address practical issues such as occlusions and incomplete observations. To address this challenge, this paper proposes a systematic design framework to quickly and precisely build and utilize human-centric DT systems. Mesh-based customized human operator models with rendered appearances are first generated within one minute from a short motion video. Transformer-based deep learning networks are then developed to realize motion-related operator status synchronization under complex conditions. Extensive experiments on multiple real-world human–robot collaborative work cells show the superior performance of the proposed framework over the state of the art.
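The abstract does not detail the networks themselves, but the core mechanism behind transformer-style handling of occlusions and incomplete observations — letting every frame attend only to visible context when inferring missing poses — can be sketched as a toy. This is a minimal, illustrative NumPy example under our own assumptions, not the authors' architecture; the function name and data are hypothetical.

```python
import numpy as np

def masked_self_attention(x, visible):
    """Scaled dot-product self-attention over a pose sequence.

    x:       (T, d) array, one d-dimensional pose vector per frame.
    visible: (T,) boolean, False marks occluded / unobserved frames.

    Occluded frames may still act as queries (so they get reconstructed),
    but no frame may attend TO an occluded frame — the masking idea that
    lets transformer models infer missing observations from visible context.
    """
    d = x.shape[1]
    scores = x @ x.T / np.sqrt(d)             # (T, T) pairwise similarities
    scores[:, ~visible] = -np.inf             # block attention to occluded keys
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)
    return weights @ x                        # attention-weighted reconstruction

# Toy example: 4 frames of a 3-D "pose", frame 2 occluded.
rng = np.random.default_rng(0)
poses = rng.normal(size=(4, 3))
visible = np.array([True, True, False, True])
out = masked_self_attention(poses, visible)
# Row 2 is now a convex combination of the visible frames only.
```

In a real network the queries, keys, and values would come from learned projections and the pose features from the mesh model, but the masking pattern — zero attention weight on unobserved keys — is the same.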
Source journal

Journal of Manufacturing Systems (Engineering & Technology: Engineering, Industrial)
CiteScore: 23.30
Self-citation rate: 13.20%
Articles per year: 216
Review time: 25 days
Journal description: The Journal of Manufacturing Systems is dedicated to showcasing cutting-edge fundamental and applied research in manufacturing at the systems level. Encompassing products, equipment, people, information, control, and support functions, manufacturing systems play a pivotal role in the economical and competitive development, production, delivery, and total lifecycle of products, meeting market and societal needs. With a commitment to publishing archival scholarly literature, the journal strives to advance the state of the art in manufacturing systems and foster innovation in crafting efficient, robust, and sustainable manufacturing systems. The focus extends from equipment-level considerations to the broader scope of the extended enterprise. The Journal welcomes research addressing challenges across various scales, including nano-, micro-, and macro-scale manufacturing, and spanning diverse sectors such as aerospace, automotive, energy, and medical device manufacturing.