{"title":"面向增强现实中的装配步骤识别","authors":"A. Rukubayihunga, Jean-Yves Didier, S. Otmane","doi":"10.1145/2927929.2927953","DOIUrl":null,"url":null,"abstract":"Augmented Reality is a media which purpose is to attach digital information to real world scenes in order to enhance the user experience. It has been used in the field of maintenance in order to show the user the operations he has to perform. Our goal is to go one step further so that our system is able to detect when the user has performed a step of the task. It requires some understanding of what is occurring and where objects are located in order to display correct instructions for the task. This paper is focusing on using an intermediate computation result of the usual augmented reality process, which is the pose computation: we propose to use the transformation matrix not only for objects pose estimation, but also to characterise their motion during an assembly task. With this matrix, we can induce spatial relationship between assembly parts and determine which motion occurs. Then we analyse translation and rotation parameters contained in the transformation matrix during the action. We demonstrate that these data correctly characterise the movement between object's fragments. Therefore, by analysing such a matrix, not only we can achieve the required registration step of the augmented reality process, but we can also understand the actions performed by the user.","PeriodicalId":113875,"journal":{"name":"Proceedings of the 2016 Virtual Reality International Conference","volume":"387 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2016-03-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":"{\"title\":\"Towards assembly steps recognition in augmented reality\",\"authors\":\"A. Rukubayihunga, Jean-Yves Didier, S. Otmane\",\"doi\":\"10.1145/2927929.2927953\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Augmented Reality is a media which purpose is to attach digital information to real world scenes in order to enhance the user experience. It has been used in the field of maintenance in order to show the user the operations he has to perform. Our goal is to go one step further so that our system is able to detect when the user has performed a step of the task. It requires some understanding of what is occurring and where objects are located in order to display correct instructions for the task. This paper is focusing on using an intermediate computation result of the usual augmented reality process, which is the pose computation: we propose to use the transformation matrix not only for objects pose estimation, but also to characterise their motion during an assembly task. With this matrix, we can induce spatial relationship between assembly parts and determine which motion occurs. Then we analyse translation and rotation parameters contained in the transformation matrix during the action. We demonstrate that these data correctly characterise the movement between object's fragments. 
Therefore, by analysing such a matrix, not only we can achieve the required registration step of the augmented reality process, but we can also understand the actions performed by the user.\",\"PeriodicalId\":113875,\"journal\":{\"name\":\"Proceedings of the 2016 Virtual Reality International Conference\",\"volume\":\"387 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2016-03-23\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"2\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Proceedings of the 2016 Virtual Reality International Conference\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/2927929.2927953\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 2016 Virtual Reality International Conference","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/2927929.2927953","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Towards assembly steps recognition in augmented reality
Augmented Reality is a medium whose purpose is to attach digital information to real-world scenes in order to enhance the user experience. It has been used in the field of maintenance to show users the operations they have to perform. Our goal is to go one step further, so that our system can detect when the user has completed a step of the task. This requires some understanding of what is occurring and where objects are located in order to display the correct instructions for the task. This paper focuses on an intermediate result of the usual augmented reality pipeline, namely the pose computation: we propose to use the transformation matrix not only for object pose estimation, but also to characterise object motion during an assembly task. With this matrix, we can infer the spatial relationship between assembly parts and determine which motion occurs. We then analyse the translation and rotation parameters contained in the transformation matrix during the action. We demonstrate that these data correctly characterise the movement between an object's parts. Therefore, by analysing such a matrix, we can not only achieve the registration step required by the augmented reality process, but also understand the actions performed by the user.
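The abstract describes characterising assembly motion by analysing the translation and rotation parameters contained in the transformation matrix. As a rough illustration of that idea only (a minimal sketch, not the authors' implementation), the following Python/NumPy code computes the relative transform between two tracked parts, extracts its translation vector and rotation angle, and applies hypothetical thresholds (trans_eps, rot_eps are assumed names and values) to label the motion between two successive time steps:

import numpy as np

def relative_transform(T_part_a, T_part_b):
    # Relative pose of part B expressed in part A's frame (both 4x4 homogeneous matrices).
    return np.linalg.inv(T_part_a) @ T_part_b

def motion_parameters(T_rel):
    # Split a 4x4 homogeneous transform into a translation vector and a rotation angle.
    t = T_rel[:3, 3]              # translation component
    R = T_rel[:3, :3]             # rotation component
    # Rotation angle recovered from the trace of R (axis-angle representation).
    cos_theta = np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0)
    theta = np.arccos(cos_theta)
    return t, theta

def classify_motion(T_rel_prev, T_rel_curr, trans_eps=1e-3, rot_eps=np.deg2rad(1.0)):
    # Label the change between two successive relative poses (illustrative categories only).
    t_prev, theta_prev = motion_parameters(T_rel_prev)
    t_curr, theta_curr = motion_parameters(T_rel_curr)
    moved = np.linalg.norm(t_curr - t_prev) > trans_eps
    rotated = abs(theta_curr - theta_prev) > rot_eps
    if moved and rotated:
        return "combined translation and rotation"
    if moved:
        return "translation"
    if rotated:
        return "rotation"
    return "static"

Tracking such labels over consecutive frames is one plausible way to detect when a relative motion between two parts starts and stops, which is the kind of cue the paper exploits to recognise that an assembly step has been performed.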