Batmen - Hybrid collaborative object manipulation using mobile devices
M. Cabral, Gabriel Roque, Mario Nagamura, Andre Montes, Eduardo Zilles Borba, C. Kurashima, M. Zuffo
Published in: 2016 IEEE Symposium on 3D User Interfaces (3DUI)
Publication date: 2016-03-19
DOI: 10.1109/3DUI.2016.7460077
Citations: 5
Abstract
We present an interactive, collaborative 3D object manipulation system that uses off-the-shelf mobile devices coupled with Augmented Reality (AR) technology to let multiple users work on a scene concurrently. Each participating user operates a mobile device running Android and a desktop (or laptop) working in tandem. The 3D scene is visualized on the desktop system, while changes to the scene viewpoint and object manipulation are performed with the mobile device through object tracking. Multiple users can collaborate on object manipulation, each using a laptop and a mobile device. The system leverages users' familiarity with common gesture-based tasks on current mobile devices. We built a prototype that allows users to complete the requested tasks and conducted an informal user study with experienced VR researchers to validate the system.
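The abstract does not describe the implementation, but the architecture it outlines (a mobile device tracking objects and driving manipulation, with the desktop rendering the shared scene) suggests a simple client-viewer message flow. The sketch below is purely illustrative and is not the authors' protocol: the message fields, port number, and scene representation are all assumptions made for the example.

```python
# Hypothetical sketch: a mobile client streaming tracked pose updates to a
# desktop viewer over UDP. All names, fields, and the port are assumptions,
# not details taken from the paper.
import json
import socket

VIEWER_PORT = 9000  # hypothetical port the desktop viewer listens on


def send_pose_update(sock, user_id, object_id, position, rotation_quat):
    """Serialize one manipulation update from the mobile AR tracker and send it."""
    msg = {
        "type": "manipulate",
        "user": user_id,           # lets the viewer merge edits from several users
        "object": object_id,       # which scene object the gesture targets
        "position": position,      # [x, y, z] from the mobile device's tracking
        "rotation": rotation_quat, # [x, y, z, w] quaternion
    }
    sock.sendto(json.dumps(msg).encode("utf-8"), ("127.0.0.1", VIEWER_PORT))


def apply_update(scene, sock):
    """Desktop side: receive one update and apply it to the shared scene dict."""
    data, _ = sock.recvfrom(4096)
    msg = json.loads(data)
    if msg.get("type") == "manipulate":
        obj = scene.setdefault(msg["object"], {})
        obj["position"] = msg["position"]
        obj["rotation"] = msg["rotation"]


if __name__ == "__main__":
    # Tiny loopback demo: one "mobile" socket sends, one "viewer" socket receives.
    viewer = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    viewer.bind(("127.0.0.1", VIEWER_PORT))
    mobile = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

    scene = {}
    send_pose_update(mobile, "user1", "cube", [0.1, 0.0, 0.5], [0, 0, 0, 1])
    apply_update(scene, viewer)
    print(scene)  # {'cube': {'position': [0.1, 0.0, 0.5], 'rotation': [0, 0, 0, 1]}}
```

In a real system of this kind, updates from multiple users would be keyed by user and object and merged by the viewer, which is what allows concurrent collaboration on the same scene.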