Developing a VR Simulator for Robotics Navigation and Human Robot Interactions employing Digital Twins
S. Alves, A. Uribe-Quevedo, Delun Chen, Jon Morris, Sina Radmard
2022 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), March 2022
DOI: 10.1109/VRW55335.2022.00036
Providing care to seniors and to adults with Developmental Disabilities (DD) has driven increased use and development of assistive technologies, including service robots. Such robots ease challenges associated with care, companionship, medication intake, and fall prevention, among others. Research and development in this field relies on in-person data collection to ensure proper robot navigation, interaction, and service. However, the COVID-19 pandemic led to physical distancing and access restrictions at long-term care facilities, making such data collection very difficult. Traditional video-based data collection poses its own challenges, as recorded videos may not represent how the target population moves, interacts with the environment, or falls. In this paper, we present a VR simulator for robot navigation and fall detection that uses digital twins to test a virtual robot without access to the real physical location or to real people. Development required virtual sensors capable of generating LIDAR data for the virtual robot to navigate and detect obstacles. Preliminary testing has produced promising results for using the virtual simulator to train a service robot to navigate and detect falls. Our results include virtual maps, robot navigation, and fall detection.
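The virtual LIDAR sensor described in the abstract is typically implemented by ray casting from the robot's pose into the simulated scene. The paper does not give implementation details, so the following is only a minimal illustrative sketch, assuming a simple 2D occupancy grid (1 = obstacle, 0 = free); the function and parameter names are hypothetical, not from the paper.

```python
import math

def simulate_lidar(grid, x, y, n_beams=8, max_range=5.0, step=0.05):
    """March n_beams rays from (x, y) over 360 degrees across a 2D
    occupancy grid and return the measured range for each beam."""
    ranges = []
    for i in range(n_beams):
        angle = 2 * math.pi * i / n_beams
        r = 0.0
        while r < max_range:
            # Grid cell the ray tip currently falls in.
            cx = math.floor(x + r * math.cos(angle))
            cy = math.floor(y + r * math.sin(angle))
            if not (0 <= cy < len(grid) and 0 <= cx < len(grid[0])):
                break                      # ray left the map
            if grid[cy][cx] == 1:
                break                      # hit an obstacle
            r += step
        ranges.append(min(r, max_range))
    return ranges

# A 7x7 map with a wall along the right-hand column.
grid = [[1 if col == 6 else 0 for col in range(7)] for _ in range(7)]
scan = simulate_lidar(grid, x=3.0, y=3.0)  # one range per beam
```

A production simulator (e.g., one built on a game engine's physics ray casts) would run in 3D and add sensor noise, but the scan produced here has the same shape as a real planar LIDAR message: one range per beam, clamped to the sensor's maximum range.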