{"title":"Visualization of geometric properties of flexible objects for form designing","authors":"Goshiro Yamamoto, I. Kanaya, Keiko Yamamoto, Yuuki Uranishi, H. Kato","doi":"10.1109/ISMAR.2011.6143900","DOIUrl":"https://doi.org/10.1109/ISMAR.2011.6143900","url":null,"abstract":"Computer-aided design (CAD) systems have conventionally been widely used to support designers in creating, modifying, and adding parts to or removing parts from objects by showing simulated objects on a computer screen. These virtual, non-physical objects are, however, known to be imperfect imitations of reality. The impression of a shape is highly related to the second-order derivative of its geometric features. Conventional CAD systems, including AutoCAD, usually offer visualization of the first derivative (normal) and the second derivative (curvature) of given surfaces. Problems remain, however, in visualizing curvature on screen. First, on-screen visualization lacks the true feeling of a physical object. Second, even when given a physical mock-up to hold, designers cannot precisely recognize minute changes of curvature; few designers can sense small differences of curvature, and most need a special device to check it. To solve these problems, the authors propose a novel curvature visualization system based on mixed reality technology. Color mapping according to the Gaussian curvature, calculated via a time-of-flight camera, gives observers an intuitive understanding of the object's curvature.","PeriodicalId":298757,"journal":{"name":"2011 10th IEEE International Symposium on Mixed and Augmented Reality","volume":"62 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2011-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132097905","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
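The abstract above maps Gaussian curvature, estimated from a time-of-flight depth image, to color. The paper's exact pipeline is not given; as an illustration only, Gaussian curvature of a depth map z(x, y) can be computed with finite differences via the Monge-patch formula K = (z_xx·z_yy − z_xy²)/(1 + z_x² + z_y²)², then pushed through a simple diverging color scale. Function names and the color scheme below are hypothetical, not from the paper:

```python
import numpy as np

def gaussian_curvature(z, spacing=1.0):
    """Gaussian curvature K of a surface given as a depth/height map z[y, x].

    Uses the Monge-patch formula
        K = (z_xx * z_yy - z_xy**2) / (1 + z_x**2 + z_y**2)**2
    with derivatives estimated by finite differences.
    """
    zy, zx = np.gradient(z, spacing)          # first derivatives (axis 0 = y)
    zxy, zxx = np.gradient(zx, spacing)       # second derivatives of z_x
    zyy, _ = np.gradient(zy, spacing)
    return (zxx * zyy - zxy ** 2) / (1.0 + zx ** 2 + zy ** 2) ** 2

def curvature_to_rgb(K, scale=None):
    """Map curvature to a simple diverging color code: blue for K < 0
    (saddle-like regions), white near 0, red for K > 0 (dome-like)."""
    if scale is None:
        scale = np.max(np.abs(K)) or 1.0      # avoid division by zero
    t = np.clip(K / scale, -1.0, 1.0)         # normalized to [-1, 1]
    r = np.where(t > 0, 1.0, 1.0 + t)         # fade red out for negative K
    b = np.where(t < 0, 1.0, 1.0 - t)         # fade blue out for positive K
    g = 1.0 - np.abs(t)
    return np.stack([r, g, b], axis=-1)
```

On a synthetic hemisphere of radius R the interior estimate converges to the analytic value K = 1/R², which is a convenient sanity check before feeding in noisy sensor depth.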
{"title":"An interactive augmented reality coloring book","authors":"Adrian Clark, Andreas Dünser","doi":"10.1145/2073370.2073394","DOIUrl":"https://doi.org/10.1145/2073370.2073394","url":null,"abstract":"Creating entertaining and educational books requires not only visually stimulating content but also means for students to interact, create, and express themselves. In this paper we present a new type of mixed-reality book experience, which augments an educational coloring book with user-generated three-dimensional content. We explore a “pop-up book” metaphor and describe a process by which children's drawing and coloring are used as input to generate and change the appearance of the book content. Our system is based on natural feature tracking and image processing techniques that can be readily applied to other AR publishing applications.","PeriodicalId":298757,"journal":{"name":"2011 10th IEEE International Symposium on Mixed and Augmented Reality","volume":"304 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2011-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134214319","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
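The coloring-book abstract above relies on natural feature tracking to locate the page and image processing to turn the child's coloring into texture for the 3D content. The authors' implementation details are not given; one core step of any such pipeline is rectifying the tracked page quadrilateral into a flat texture with a homography, sketched here from first principles (DLT plus nearest-neighbour inverse warping; all names are illustrative):

```python
import numpy as np

def homography(src, dst):
    """Direct Linear Transform: 3x3 matrix H such that dst ~ H @ src
    (homogeneous), from four or more point correspondences."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)                  # null-space vector of A
    return H / H[2, 2]

def unwarp_page(image, corners, out_w, out_h):
    """Inverse-warp the quadrilateral `corners` (tracked page in the camera
    frame, clockwise from top-left) into an out_h x out_w texture."""
    rect = [(0, 0), (out_w - 1, 0), (out_w - 1, out_h - 1), (0, out_h - 1)]
    H = homography(rect, corners)             # texture coords -> image coords
    u, v = np.meshgrid(np.arange(out_w), np.arange(out_h))
    pts = np.stack([u.ravel(), v.ravel(), np.ones(u.size)])
    xyw = H @ pts
    x = np.round(xyw[0] / xyw[2]).astype(int).clip(0, image.shape[1] - 1)
    y = np.round(xyw[1] / xyw[2]).astype(int).clip(0, image.shape[0] - 1)
    return image[y, x].reshape(out_h, out_w, -1)
```

In a real system the four corners would come from the natural-feature tracker, and the extracted texture would then be mapped onto the pop-up geometry each frame.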
{"title":"Fusing the real and the virtual: A depth-camera based approach to Mixed Reality","authors":"P. Lensing, W. Broll","doi":"10.1109/ISMAR.2011.6143892","DOIUrl":"https://doi.org/10.1109/ISMAR.2011.6143892","url":null,"abstract":"The seamless integration of real and virtual content is the ultimate, yet unreached, goal of Mixed Reality applications. Among other things, it requires mutual occlusion and lighting between real and virtual objects. In this paper we present our approach of applying a low-cost depth camera, such as the Kinect, allowing for easy acquisition of depth images. However, as the quality of the raw input data is insufficient for this purpose, we apply a series of filtering and optimization operations. This allows us to realize mutual real-time lighting and rigid interaction in a dynamic environment. Our approach produces images of acceptable quality for low-frequency scenes at interactive frame rates on an off-the-shelf desktop computer.","PeriodicalId":298757,"journal":{"name":"2011 10th IEEE International Symposium on Mixed and Augmented Reality","volume":"27 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2011-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131338567","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
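The abstract above notes that raw depth-camera data must pass through "a series of filter and optimization operations" before it can drive occlusion and lighting, but does not enumerate them. A minimal, hypothetical sketch of one plausible chain, not the authors' method: fill invalid (zero) depth pixels from valid neighbours, then suppress speckle with a 3x3 median filter. Border handling via wrap-around is a simplification for brevity:

```python
import numpy as np

def fill_and_smooth(depth, invalid=0, iters=2):
    """Fill invalid depth pixels with the mean of their valid 8-neighbours
    (repeated `iters` times to grow into larger holes), then apply a 3x3
    median filter to suppress speckle noise.  Edges wrap (np.roll), which is
    a simplification; a production filter would pad instead."""
    d = depth.astype(float)
    mask = d == invalid                        # True where depth is missing
    for _ in range(iters):
        acc = np.zeros_like(d)                 # sum of valid neighbour values
        cnt = np.zeros_like(d)                 # number of valid neighbours
        for dy in (-1, 0, 1):
            for dx in (-1, 0, 1):
                if dy == 0 and dx == 0:
                    continue
                shifted = np.roll(np.roll(d, dy, axis=0), dx, axis=1)
                valid = np.roll(np.roll(~mask, dy, axis=0), dx, axis=1)
                acc += np.where(valid, shifted, 0.0)
                cnt += valid
        fill = np.divide(acc, cnt, out=np.zeros_like(acc), where=cnt > 0)
        newly = mask & (cnt > 0)
        d[newly] = fill[newly]
        mask = mask & ~newly
    # 3x3 median filter over the 9 shifted copies of the image
    stack = [np.roll(np.roll(d, dy, 0), dx, 1)
             for dy in (-1, 0, 1) for dx in (-1, 0, 1)]
    return np.median(np.stack(stack), axis=0)
```

The filled, denoised map can then be compared per-pixel against the virtual scene's z-buffer to decide occlusion, as the paper's goal of mutual blocking requires.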
{"title":"Psychological keys to success in MAR systems","authors":"M. Neal, J. Cabiria, J. Hogg","doi":"10.1109/ismar.2011.6162858","DOIUrl":"https://doi.org/10.1109/ismar.2011.6162858","url":null,"abstract":"Imagine a significant improvement in understanding user response to your application. This tutorial examines psychological considerations in developing and deploying MAR applications, to improve your understanding of key psychological factors in technological development. Successful MAR applications will be those that take advantage of the inherent way our brains process information in this new environment. This half-day session explores media-focused psychology at a basic level, with numerous illustrations, examples, and techniques to ensure that a MAR application has a strong technology-mind interface. The purpose of this tutorial is to help define the role psychological research and theories play in the successful development and deployment of commercial MAR applications. It contains four major sections of study for session participants; the first three are: 1) cognitive science, 2) psychological design, and 3) narrative transportation theory for applications. There is considerable research in user experience (UX), and there are many lessons to be learned as MAR technology moves into the mainstream. Each section deconstructs core foundational components in the understanding of UX with regard to new interactive technologies. The final section ties the first three sections together and provides practical tips and techniques to help MAR researchers and practitioners take the psychological sciences into their labs, design, and development shops. Throughout this tutorial, attendees will view examples of each topic and be exposed to a variety of similarities and differences in the cognitive interpretation of object and environment design. By the end of this session, attendees will have a better understanding of how to design for diverse end-users, and of how to capture and hold end-user attention for the ultimate purpose of engagement and immersion.","PeriodicalId":298757,"journal":{"name":"2011 10th IEEE International Symposium on Mixed and Augmented Reality","volume":"16 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124475882","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Bi-directional OLED microdisplay for see-through HMD","authors":"U. Bockholt, U. Vogel, R. Herold, P. Schreiber, Sascha Voth","doi":"10.1109/ISMAR.2011.6092356","DOIUrl":"https://doi.org/10.1109/ISMAR.2011.6092356","url":null,"abstract":"Within the research project “iStar — Interactive See-Through Augmented-Reality Display”, the Fraunhofer Gesellschaft developed Augmented Reality goggles comprising a VGA OLED microdisplay with an embedded image sensor aimed at gaze control, together with see-through head-mounted optics. The active area of the bi-directional microdisplay consists of nested display and image-sensor (embedded camera) pixels, surrounded by a second image sensor (frame camera) as well as driving and control circuitry (cf. Table). The display and image-sensor systems are electrically independent of one another, interacting only via synchronization signals. iStar also includes a developer kit integrating eye-tracking software, an AR system, and application demonstrators. High-contrast see-through HMDs remain an important research topic within the AR community; iStar not only offers a lightweight display solution but also integrates camera sensors into the display to support eye-tracking. With the presented demonstrators, tutorial attendees can evaluate the possibilities and the maturity of the developed technologies. Attendees will get an overview of requirements, possible solutions, and open research topics in the fields of microdisplays, optics, and software development for the realization of interactive see-through HMDs. Feedback on the learning objectives will be gathered via questionnaires.","PeriodicalId":298757,"journal":{"name":"2011 10th IEEE International Symposium on Mixed and Augmented Reality","volume":"128 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132545151","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Enabling large-scale outdoor mixed reality and augmented reality","authors":"Steven K. Feiner, Thommen Korah, David J. Murphy, Vasu Parameswaran, Matei Stroila, Sean White","doi":"10.1109/ISMAR.2011.6092359","DOIUrl":"https://doi.org/10.1109/ISMAR.2011.6092359","url":null,"abstract":"While there has been significant recent progress in technologies supporting augmented reality for small indoor environments, much work remains for large outdoor environments. This workshop focuses primarily on research that enables high-quality outdoor Mixed Reality (MR) and Augmented Reality (AR) applications. These research topics include, but are not restricted to: — 3D geo-referenced data (images, point clouds, and models) — Algorithms for object recognition from large databases of geo-referenced data — Algorithms for object tracking in outdoor environments — Multi-cue fusion to achieve improved performance of object detection and tracking — Novel representation schemes to facilitate large-scale content distribution — 3D reasoning to support intelligent augmentation — Novel and improved mobile capabilities for data capture (device sensors), processing, and display — Applications, experiences, and user interface techniques. The workshop will also showcase existing prototypes of applications enabled by these technologies: mirror worlds, high-fidelity virtual environments, applications of panoramic imagery, and user studies relating to these media types. This workshop aims to bring together academic and industrial researchers and to foster discussion among participants on the current state of the art and future directions for technologies that enable large-scale outdoor MR and AR applications. The workshop will open with a session in which position statements and overviews of the state of the art are presented. In the afternoon, we will follow up with discussion sessions and a short closing session.","PeriodicalId":298757,"journal":{"name":"2011 10th IEEE International Symposium on Mixed and Augmented Reality","volume":"76 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123210061","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}