{"title":"使用压力传感器识别人体物理交互过程中的操作动作","authors":"M. Javaid, M. Žefran, A. Yavolovsky","doi":"10.1109/ROMAN.2015.7333660","DOIUrl":null,"url":null,"abstract":"This paper presents an investigation of human physical interaction. In particular, we describe how data from pressure sensors mounted on a glove worn by a human can be mapped to manipulation actions; the actions can in turn be used to interpret physical interaction during elderly care. The work is part of the RoboHelper project, which aims to build a multimodal communication interface for assistive robots for the elderly. Human-human physical interaction during elderly care and in a realistic setting is studied in this work with the aim of using the learned insights to develop corresponding robot interfaces. The contribution of this work is the identification of various types of physical manipulation actions that take place when an elder is assisted in performing activities of daily living in a natural setting. As part of the RoboHelper project, it has been shown that the knowledge of actions involving physical manipulation of objects helps in understanding the spoken language. More specifically, it improves the resolution of third person pronouns/deictic words and the classification of dialogue acts. In this work we show that pressure sensor data can be used to automatically recognize such physical manipulation actions. The automatic recognition of physical manipulation actions may facilitate future studies of multimodal interaction by greatly reducing the time required for manual annotations. It is also useful for learning from demonstration, a popular approach in human-robot interaction research.","PeriodicalId":119467,"journal":{"name":"2015 24th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2015-11-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"5","resultStr":"{\"title\":\"Using pressure sensors to identify manipulation actions during human physical interaction\",\"authors\":\"M. Javaid, M. Žefran, A. Yavolovsky\",\"doi\":\"10.1109/ROMAN.2015.7333660\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"This paper presents an investigation of human physical interaction. In particular, we describe how data from pressure sensors mounted on a glove worn by a human can be mapped to manipulation actions; the actions can in turn be used to interpret physical interaction during elderly care. The work is part of the RoboHelper project, which aims to build a multimodal communication interface for assistive robots for the elderly. Human-human physical interaction during elderly care and in a realistic setting is studied in this work with the aim of using the learned insights to develop corresponding robot interfaces. The contribution of this work is the identification of various types of physical manipulation actions that take place when an elder is assisted in performing activities of daily living in a natural setting. As part of the RoboHelper project, it has been shown that the knowledge of actions involving physical manipulation of objects helps in understanding the spoken language. More specifically, it improves the resolution of third person pronouns/deictic words and the classification of dialogue acts. In this work we show that pressure sensor data can be used to automatically recognize such physical manipulation actions. 
The automatic recognition of physical manipulation actions may facilitate future studies of multimodal interaction by greatly reducing the time required for manual annotations. It is also useful for learning from demonstration, a popular approach in human-robot interaction research.\",\"PeriodicalId\":119467,\"journal\":{\"name\":\"2015 24th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN)\",\"volume\":\"1 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2015-11-20\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"5\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2015 24th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ROMAN.2015.7333660\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2015 24th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ROMAN.2015.7333660","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract: This paper investigates human physical interaction. In particular, we describe how data from pressure sensors mounted on a glove worn by a human can be mapped to manipulation actions; these actions can in turn be used to interpret physical interaction during elderly care. The work is part of the RoboHelper project, which aims to build a multimodal communication interface for assistive robots for the elderly. We study human-human physical interaction during elderly care in a realistic setting, with the aim of using the resulting insights to develop corresponding robot interfaces. The contribution of this work is the identification of the types of physical manipulation actions that take place when an elder is assisted with activities of daily living in a natural setting. Earlier work in the RoboHelper project showed that knowledge of actions involving physical manipulation of objects helps in understanding spoken language; more specifically, it improves the resolution of third-person pronouns and deictic words and the classification of dialogue acts. Here we show that pressure-sensor data can be used to automatically recognize such physical manipulation actions. Automatic recognition of these actions may facilitate future studies of multimodal interaction by greatly reducing the time required for manual annotation. It is also useful for learning from demonstration, a popular approach in human-robot interaction research.
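The abstract describes the core task only at a high level: windows of glove pressure readings are mapped to manipulation-action labels, replacing manual annotation. As a rough illustration of that kind of mapping, the sketch below windows a pressure stream, extracts simple per-sensor statistics, and trains an off-the-shelf classifier. The sensor count, window length, feature set, label names, and classifier choice are all assumptions made for illustration, not details taken from the paper.

```python
# A minimal, illustrative sketch ONLY -- the abstract does not specify the
# recognition pipeline, so the sensor count, window length, feature set,
# action labels, and classifier below are all assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

N_SENSORS = 16   # assumed number of pressure sensors on the glove
WINDOW = 50      # assumed samples per sliding window

def window_features(window: np.ndarray) -> np.ndarray:
    """Summarize a (WINDOW, N_SENSORS) block of pressure readings with
    simple per-sensor statistics: mean, standard deviation, and peak."""
    return np.concatenate([window.mean(axis=0),
                           window.std(axis=0),
                           window.max(axis=0)])

def windows(stream: np.ndarray, step: int = 25):
    """Slide a fixed-length window over a (T, N_SENSORS) pressure stream."""
    for start in range(0, len(stream) - WINDOW + 1, step):
        yield stream[start:start + WINDOW]

# Stand-in training data: in the study this would be glove recordings
# paired with manually annotated action labels (the label set here,
# grasp / hand_over / release, is hypothetical).
rng = np.random.default_rng(0)
X = np.stack([window_features(w)
              for w in rng.random((200, WINDOW, N_SENSORS))])
y = rng.choice(["grasp", "hand_over", "release"], size=200)

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

# At run time, each incoming window is mapped to an action label,
# automating the annotation step described in the abstract.
stream = rng.random((500, N_SENSORS))
labels = [clf.predict([window_features(w)])[0] for w in windows(stream)]
print(labels[:5])
```

In practice such a classifier would be trained on the annotated glove recordings collected during the elderly-care sessions rather than on random data, and the window length and step would be tuned to the duration of the manipulation actions being recognized.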