Yuta Horikawa, Asuka Egashira, Kazuto Nakashima, A. Kawamura, R. Kurazume
{"title":"预览现实:近未来感知系统","authors":"Yuta Horikawa, Asuka Egashira, Kazuto Nakashima, A. Kawamura, R. Kurazume","doi":"10.1109/IROS.2017.8202181","DOIUrl":null,"url":null,"abstract":"This paper presents a near-future perception system named “Previewed Reality”. The system consists of an informationally structured environment (ISE), an immersive VR display, a stereo camera, an optical tracking system, and a dynamic simulator. In an ISE, a number of sensors are embedded, and information such as the position of furniture, objects, humans, and robots, is sensed and stored in a database. The position and orientation of the immersive VR display are also tracked by an optical tracking system. Therefore, we can forecast the next possible events using a dynamic simulator and synthesize virtual images of what users will see in the near future from their own viewpoint. The synthesized images, overlaid on a real scene by using augmented reality technology, are presented to the user. The proposed system can allow a human and a robot to coexist more safely by showing possible hazardous situations to the human intuitively in advance.","PeriodicalId":6658,"journal":{"name":"2017 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)","volume":"86 1","pages":"370-375"},"PeriodicalIF":0.0000,"publicationDate":"2017-12-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"Previewed reality: Near-future perception system\",\"authors\":\"Yuta Horikawa, Asuka Egashira, Kazuto Nakashima, A. Kawamura, R. Kurazume\",\"doi\":\"10.1109/IROS.2017.8202181\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"This paper presents a near-future perception system named “Previewed Reality”. The system consists of an informationally structured environment (ISE), an immersive VR display, a stereo camera, an optical tracking system, and a dynamic simulator. 
In an ISE, a number of sensors are embedded, and information such as the position of furniture, objects, humans, and robots, is sensed and stored in a database. The position and orientation of the immersive VR display are also tracked by an optical tracking system. Therefore, we can forecast the next possible events using a dynamic simulator and synthesize virtual images of what users will see in the near future from their own viewpoint. The synthesized images, overlaid on a real scene by using augmented reality technology, are presented to the user. The proposed system can allow a human and a robot to coexist more safely by showing possible hazardous situations to the human intuitively in advance.\",\"PeriodicalId\":6658,\"journal\":{\"name\":\"2017 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)\",\"volume\":\"86 1\",\"pages\":\"370-375\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2017-12-13\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2017 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/IROS.2017.8202181\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2017 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/IROS.2017.8202181","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract: This paper presents a near-future perception system named "Previewed Reality". The system consists of an informationally structured environment (ISE), an immersive VR display, a stereo camera, an optical tracking system, and a dynamic simulator. In an ISE, a number of sensors are embedded, and information such as the positions of furniture, objects, humans, and robots is sensed and stored in a database. The position and orientation of the immersive VR display are also tracked by the optical tracking system. The system can therefore forecast the next possible events using the dynamic simulator and synthesize virtual images of what users will see in the near future from their own viewpoint. The synthesized images, overlaid on the real scene using augmented reality, are presented to the user. The proposed system allows a human and a robot to coexist more safely by intuitively showing possible hazardous situations to the human in advance.
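The pipeline the abstract describes — sensed object states stored in a database, the user's tracked pose, and a simulator that forecasts near-future states to flag hazards — can be sketched in miniature. This is an illustrative sketch only, not the authors' implementation: the names (`TrackedObject`, `preview_hazards`) are hypothetical, and a simple constant-velocity extrapolation stands in for the paper's dynamic simulator and AR rendering stage.

```python
from dataclasses import dataclass

@dataclass
class Pose:
    """2-D position; the real ISE tracks full 3-D pose."""
    x: float
    y: float

@dataclass
class TrackedObject:
    """One entry in the ISE database: an object's sensed pose and velocity."""
    name: str
    pose: Pose
    vx: float = 0.0
    vy: float = 0.0

def forecast(obj: TrackedObject, dt: float) -> Pose:
    # Constant-velocity extrapolation, standing in for the dynamic simulator.
    return Pose(obj.pose.x + obj.vx * dt, obj.pose.y + obj.vy * dt)

def preview_hazards(database, user_pose: Pose,
                    horizon: float = 2.0, danger_radius: float = 0.5):
    """Names of objects predicted to come within danger_radius of the
    user within `horizon` seconds; the real system would render these
    as AR overlays from the user's tracked viewpoint."""
    hazards = []
    for obj in database:
        future = forecast(obj, horizon)
        dist = ((future.x - user_pose.x) ** 2 +
                (future.y - user_pose.y) ** 2) ** 0.5
        if dist < danger_radius:
            hazards.append(obj.name)
    return hazards
```

For example, a robot at the origin moving at 1 m/s toward a user standing 2 m away is flagged over a 2 s horizon, while a user 5 m away sees no warning.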