Antonio Pérez Cruz, Abiel Aguilar-González, Madaín Pérez Patricio
Title: Towards an FPGA-Based Smart Camera for Virtual Reality Applications
Proceedings of the 13th International Conference on Distributed Smart Cameras, 2019-09-09
DOI: 10.1145/3349801.3357133
Citations: 0
Abstract
Virtual reality (VR) is an experience that takes place within simulated, immersive environments. Although several VR applications have been developed in recent years, such as gaming, medical education, and military training, one important limitation remains: the tracking sensor. Commercial headsets such as the Oculus Rift or HTC Vive use tracking sensors that project active signals onto the user's body, which limits motion understanding. To address this problem, we propose a novel passive sensor, consisting of an FPGA-based smart camera, which computes optical flow and estimates semantic information about the user's movement inside the camera fabric. Using this semantic information as feedback for the virtual reality engine, accurate tracking is possible without active signals being projected onto the user's body, and several cameras can be deployed to achieve a better understanding of movement. Preliminary results are encouraging, demonstrating the feasibility of a vision-based tracking approach suitable for virtual reality applications.
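The abstract does not specify which optical-flow algorithm the smart camera implements. As an illustrative sketch only, the classic Lucas-Kanade least-squares formulation (a common choice for hardware pipelines) estimates the per-pixel motion vector from image gradients; the function and variable names below are hypothetical, not from the paper.

```python
import numpy as np

def lucas_kanade_flow(prev, curr, y, x, win=2):
    """Estimate the optical-flow vector (u, v) at pixel (y, x) by solving
    the Lucas-Kanade least-squares system over a (2*win+1)^2 window."""
    # Spatial gradients (central differences) of the first frame,
    # and the temporal gradient between the two frames.
    Iy, Ix = np.gradient(prev.astype(float))
    It = curr.astype(float) - prev.astype(float)
    sl = (slice(y - win, y + win + 1), slice(x - win, x + win + 1))
    # Stack the per-pixel constraints Ix*u + Iy*v = -It into A [u v]^T = b.
    A = np.stack([Ix[sl].ravel(), Iy[sl].ravel()], axis=1)
    b = -It[sl].ravel()
    (u, v), *_ = np.linalg.lstsq(A, b, rcond=None)
    return u, v

# Toy example: a bright square shifted one pixel to the right between frames.
prev = np.zeros((16, 16)); prev[6:10, 6:10] = 1.0
curr = np.zeros((16, 16)); curr[6:10, 7:11] = 1.0
u, v = lucas_kanade_flow(prev, curr, 8, 8)  # u ~ rightward motion, v ~ 0
```

On real hardware such a window solve maps naturally to a streaming FPGA datapath, since each window needs only a small fixed number of multiply-accumulates; the aggregated flow field could then drive the kind of semantic motion estimate the abstract describes.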