{"title":"移动AR集成视图-输入交互方法","authors":"T. Tanikawa, H. Uzuka, Takuji Narumi, M. Hirose","doi":"10.1109/3DUI.2015.7131763","DOIUrl":null,"url":null,"abstract":"Recently, mobile AR is very popular and used for many commercial and product promotional activities. However, in almost all mobile AR application, user only view annotated information or preset virtual object's motion in AR environment and cannot interact with virtual objects as if he/she interact real objects in the real environment. In this paper, we propose novel interaction method, called integrated view-input interaction method, which integrate viewpoint moving and virtual objects handling only by handling mobile AR device. Our proposed method has a predilection for popular touch mobile devise, such as smart phone or tablet, does not need any additional sensor for sensing manipulation target. We implemented three integration types and evaluate efficiency in object handling task.","PeriodicalId":131267,"journal":{"name":"2015 IEEE Symposium on 3D User Interfaces (3DUI)","volume":"14 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2015-03-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"10","resultStr":"{\"title\":\"Integrated view-input interaction method for mobile AR\",\"authors\":\"T. Tanikawa, H. Uzuka, Takuji Narumi, M. Hirose\",\"doi\":\"10.1109/3DUI.2015.7131763\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Recently, mobile AR is very popular and used for many commercial and product promotional activities. However, in almost all mobile AR application, user only view annotated information or preset virtual object's motion in AR environment and cannot interact with virtual objects as if he/she interact real objects in the real environment. In this paper, we propose novel interaction method, called integrated view-input interaction method, which integrate viewpoint moving and virtual objects handling only by handling mobile AR device. Our proposed method has a predilection for popular touch mobile devise, such as smart phone or tablet, does not need any additional sensor for sensing manipulation target. We implemented three integration types and evaluate efficiency in object handling task.\",\"PeriodicalId\":131267,\"journal\":{\"name\":\"2015 IEEE Symposium on 3D User Interfaces (3DUI)\",\"volume\":\"14 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2015-03-23\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"10\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2015 IEEE Symposium on 3D User Interfaces (3DUI)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/3DUI.2015.7131763\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2015 IEEE Symposium on 3D User Interfaces (3DUI)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/3DUI.2015.7131763","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Integrated view-input interaction method for mobile AR
Mobile AR has recently become very popular and is used in many commercial and product-promotion activities. However, in almost all mobile AR applications, users can only view annotated information or preset virtual-object motions in the AR environment; they cannot interact with virtual objects the way they interact with real objects in the real environment. In this paper, we propose a novel interaction method, called the integrated view-input interaction method, which integrates viewpoint movement and virtual-object manipulation through handling of the mobile AR device alone. The proposed method is well suited to popular touch-based mobile devices such as smartphones and tablets, and it requires no additional sensor to sense the manipulation target. We implemented three integration types and evaluated their efficiency in an object-handling task.
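The abstract gives only the high-level idea, but the core notion of driving both the viewpoint and a held virtual object from the motion of the device itself can be illustrated with a rough sketch. The names below (Node, grab, update) and the attach-while-grabbed policy are assumptions for illustration; they are not the authors' implementation, and the three integration types evaluated in the paper are not reproduced here.

```python
# Illustrative sketch only: not the authors' implementation.
from dataclasses import dataclass, field
from typing import Optional

import numpy as np


@dataclass
class Node:
    """A renderable item (camera or virtual object) with a 4x4 world pose."""
    pose: np.ndarray = field(default_factory=lambda: np.eye(4))


def grab(T_world_device: np.ndarray, obj: Node) -> np.ndarray:
    """On the touch that selects an object, record its pose relative to the
    device so it can be carried rigidly while held (hypothetical policy)."""
    return np.linalg.inv(T_world_device) @ obj.pose


def update(T_world_device: np.ndarray, camera: Node, obj: Node,
           grabbed: bool, T_device_object: Optional[np.ndarray]) -> None:
    """Per-frame update: the tracked device pose drives the AR viewpoint,
    and while an object is held the same motion also moves the object."""
    camera.pose = T_world_device                      # viewpoint follows the device
    if grabbed and T_device_object is not None:
        obj.pose = T_world_device @ T_device_object   # object rides with the device
```

In this reading, a touch gesture toggles the grabbed state, so nothing beyond the device's own pose tracking and touch screen is needed, which is consistent with the abstract's claim that no additional sensor is required for sensing the manipulation target.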