International Symposium on Mixed and Augmented Reality (ISMAR) [proceedings]. IEEE and ACM International Symposium on Mixed and Augmented Reality: Latest Publications
"Device vs. user-perspective rendering in AR applications for monocular optical see-through head-mounted displays"
J. P. Lima, R. Roberto, J. M. Teixeira, V. Teichrieb
DOI: 10.1109/ISMAR.2014.6948486. pp. 355-356, 2014-11-06.
"A mobile augmented reality system for portion estimation"
Thomas Stütz, R. Dinic, Michael Domhardt, Simon W. Ginzinger
DOI: 10.1109/ISMAR.2014.6948496. pp. 375-376, 2014-11-06.
Abstract: Accurate assessment of nutrition information is an important part of the prevention and treatment of a multitude of diseases, but remains a challenging task. We present a novel mobile augmented reality application that assists users in the nutrition assessment of their meals. The user sketches the 3D form of the food and selects the food type; the corresponding nutrition information is then computed automatically.
"Video see-through AR head-mounted display for medical procedures"
F. Cutolo, P. Parchi, V. Ferrari
DOI: 10.1109/ISMAR.2014.6948504. pp. 393-396, 2014-11-06.
Abstract: In the context of image-guided surgery (IGS), AR technology represents a significant development in the field, since it complements and integrates the concepts of surgical navigation based on virtual reality. The aim of the project is to optimize and validate an ergonomic, accurate, and cheap video see-through AR system as an aid in various types of surgical procedures. The system will ideally have to be inexpensive and user-friendly to be successfully introduced into clinical practice.
"Semantic contextual augmented reality environments"
Dariusz Rumiński, K. Walczak
DOI: 10.1109/ISMAR.2014.6948506. pp. 401-404, 2014-11-06.
"RGB-D-T camera system for AR display of temperature change"
Kazuki Matsumoto, W. Nakagawa, François de Sorbier, M. Sugimoto, H. Saito, S. Senda, Takashi Shibata, A. Iketani
DOI: 10.1109/ISMAR.2014.6948487. pp. 357-358, 2014-11-06.
Abstract: Anomalies in power equipment can be found from temperature changes relative to its normal state. In this paper we present a system for visualizing temperature changes in a scene using a thermal 3D model. Our approach is based on two precomputed 3D models of the target scene, acquired with an RGB-D camera coupled with a thermal camera. The first model contains the RGB information, while the second contains the thermal information. To compare the temperature between the model and the current time, we accurately estimate the pose of the camera by finding keypoint correspondences between the current view and the RGB 3D model. Knowing the camera pose, we can then compare the thermal 3D model with the current temperature status from any viewpoint.
"Keynote address: The role of augmented reality displays for guiding intra-cardiac interventions"
T. Peters
DOI: 10.1109/ISMAR.2014.6948399. 2014-11-06.
Abstract: Many intra-cardiac interventions are performed either via open-heart surgery or using minimally invasive approaches, where instrumentation is introduced into the cardiac chambers via the vascular system or the heart wall. While many of the latter procedures are performed under x-ray guidance, for some of them x-ray imaging is not appropriate and ultrasound is the preferred intra-operative imaging modality. Two such procedures are the repair of a mitral-valve leaflet and the replacement of aortic valves; both employ instruments introduced into the heart via the apex. For the mitral procedure, the standard of care employs a 3D trans-esophageal echo (TEE) probe for guidance, used primarily in its bi-plane mode, with full 3D used only sporadically. In spite of the clinical success of this procedure, many problems are encountered while navigating the instrument to the site of the therapy. To overcome these difficulties, we have developed a guidance platform that tracks the US probe and instrument, and augments the US images with virtual elements representing the instrument and target to optimise the navigation process. Animal studies using this approach have demonstrated increased performance on multiple metrics, including total tool distance from the ideal pathway, total navigation time, and total tool path length, by factors of 3, 4, and 5 respectively, as well as a 40-fold reduction in the number of times an instrument intruded into potentially unsafe zones in the heart.
"Placing information near to the gaze of the user"
M. Tönnis, G. Klinker
DOI: 10.1109/ISMAR.2014.6948497. pp. 377-378, 2014-11-06.
Abstract: Gaze tracking has so far mainly been used for marketing or for assisting the disabled and, in Augmented Reality specifically, for interaction with control triggers such as buttons. We go one step further and attach information to the user's line of sight. Since the information must not occlude the user's view, we displace it by an angular offset and provide means for the user to capture it by looking at it. With this approach we see potential for faster resumption of the original task for which the required information is accessed. The demonstration shows a comparably complex primary task assisted by our gaze-mounted information and illustrates the inherent differences in information access compared with conventional methods, such as listing action items at a fixed position in space or on a screen.
"Tracking texture-less, shiny objects with descriptor fields"
A. Crivellaro, Yannick Verdie, K. M. Yi, P. Fua, V. Lepetit
DOI: 10.1109/ISMAR.2014.6948474. pp. 331-332, 2014-11-06.
Abstract: Our demo presents the method we published at CVPR this year for tracking specular and poorly textured objects, and lets visitors experiment with it and with their own patterns. Our approach only requires a standard monocular camera (no depth sensor needed), and can easily be integrated within existing systems to improve their robustness and accuracy. Code is publicly available.
"Mobile augmented reality - Tracking, mapping and rendering"
Daniel Wagner, Gerhard Reitmayr, Alessandro Mulloni, Erick Méndez, S. Diaz
DOI: 10.1109/ISMAR.2014.6948500. p. 383, 2014-11-06.
"Corneal imaging in localization and HMD interaction"
Alexander Plopski, K. Kiyokawa, H. Takemura, Christian Nitschke
DOI: 10.1109/ISMAR.2014.6948505. pp. 397-400, 2014-11-06.
Abstract: The human eyes perceive our surroundings and are among our most important sensory organs. Unlike our other senses, the eyes not only perceive but also provide information to a keen observer. Thus far, however, this has mainly been exploited by detecting reflections of infrared light sources to estimate the user's gaze; reflections in the visible spectrum have rarely been utilized. In this dissertation we explore how analysis of the corneal image can improve currently available eye-related solutions, such as calibration of optical see-through head-mounted devices, or eye-gaze tracking and point-of-regard estimation in arbitrary environments. We also aim to study how corneal imaging can become an alternative for established augmented reality tasks such as tracking and localization.