Raza Saeed, Tadele Belay Tuli, Martin Manns
Procedia CIRP, vol. 134 (2025), pp. 401-406. DOI: 10.1016/j.procir.2025.02.145
Published 2025-01-01. Available at: https://www.sciencedirect.com/science/article/pii/S2212827125005177
Experimental setup for motion capturing two humans jointly handling an object
Capturing human motion in manual assembly is difficult. Reproducibly capturing two humans who are jointly handling a single part requires synchronization, communication between the humans, and correction of deviations. This paper proposes an experimental setup for capturing and analyzing human motion data using an overhead light projector that dynamically and visually guides the collaborating humans to initial and target positions. The setup comprises full-body, hand-finger, and gaze-tracking sensors. Cyclic synchronization helps to reduce drift, correct position errors, and make the method reproducible and robust for collecting human motion data. The accuracy of the motion of the object handled by the humans during the cooperative task is assessed. Preliminary results are promising and indicate that the approach can be scaled up to human-robot interaction using digital human models, e.g., to evaluate collaborative actions and identify shared roles.
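The abstract does not give implementation details of the cyclic synchronization step. The following is only a minimal, hypothetical sketch of the underlying idea: a drifting position estimate is periodically snapped back to a known, projector-defined reference position, which both bounds accumulated drift and yields the residual error as a quality measure. All function names and values here are illustrative assumptions, not the authors' method.

```python
# Hedged sketch of cyclic drift correction (illustrative only; the paper's
# actual procedure is not specified in the abstract).

def apply_drift(position, drift_per_step, steps):
    """Simulate an estimate that accumulates constant drift each step (mm)."""
    return [p + drift_per_step * steps for p in position]

def resynchronize(estimate, reference):
    """At a synchronization cycle, snap the estimate to the projected
    reference position and return the correction that was applied."""
    correction = [r - e for e, r in zip(estimate, reference)]
    return list(reference), correction

# Example: a 2D position estimate drifts 2 mm/step for 10 steps, then one
# synchronization cycle corrects it against the projected target position.
reference = [100.0, 250.0]                      # mm, projector-defined target
estimate = apply_drift(reference, 2.0, 10)      # drifted to [120.0, 270.0]
estimate, correction = resynchronize(estimate, reference)
```

The magnitude of `correction` at each cycle can serve as the kind of per-trial accuracy measure the abstract alludes to when assessing object motion.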