HoloLucination: A Framework for Live Augmented Reality Presentations Across Mobile Devices
A. Bahremand, Linda D. Nguyen, Tanya N. Harrison, R. Likamwa
2019 IEEE International Conference on Artificial Intelligence and Virtual Reality (AIVR), December 2019. DOI: 10.1109/AIVR46125.2019.00053
We envision that future presentations for business, education, and scientific dissemination will invoke 3D spatial content to immersively display and discuss animated 3D models and spatial data visualizations with large audiences. Current frameworks, however, target a highly technical user base, preventing widespread curation of immersive presentations. Furthermore, solutions for real-time multi-user interaction have focused on multiplayer gaming rather than large-format immersive presentation. Yet modern mobile devices (smartphones, tablets, headsets) can render virtual models over the physical environment through visual anchors for Augmented Reality (AR). Our ongoing research thrust is to leverage contemporary AR infrastructure to develop an easy-to-use tool for curating and spatially presenting augmented presentations to large audiences. In this demo, we present an Augmented Reality framework that lets users curate mixed reality presentations: users prepare a sequence of animation states, and at presentation time the presenter invokes each animation so that it occurs simultaneously on HMDs and mobile devices.
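To make the synchronization idea concrete, below is a minimal sketch, not the authors' implementation: it assumes a presenter device that broadcasts the index of the current animation step over UDP and viewing devices that listen and advance their local timelines. The port number, message format, and function names are hypothetical and chosen only for illustration.

```python
# Hypothetical sketch of presenter-driven animation sync (assumed design, not the
# HoloLucination implementation): the presenter broadcasts the index of the next
# animation step; each viewing device listens and triggers that step locally.
import json
import socket

SYNC_PORT = 5005  # assumed port, for illustration only


def broadcast_step(step_index: int) -> None:
    """Presenter side: announce that animation step `step_index` should play now."""
    msg = json.dumps({"step": step_index}).encode("utf-8")
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        sock.sendto(msg, ("255.255.255.255", SYNC_PORT))


def listen_for_steps(play_step) -> None:
    """Device side: block on incoming announcements and trigger the received step."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.bind(("", SYNC_PORT))
        while True:
            data, _addr = sock.recvfrom(1024)
            play_step(json.loads(data)["step"])


if __name__ == "__main__":
    # Example presenter loop: step through a prepared sequence of animation states.
    for step in range(3):
        input(f"Press Enter to trigger animation step {step}...")
        broadcast_step(step)
```

In such a design, each device holds its own copy of the curated animation sequence and only the step index travels over the network, which keeps per-trigger traffic small regardless of audience size.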