ScoringTalk and WatchingMeter: utterance and gaze visualization for co-located collaboration
Hiroyuki Adachi, Akimune Haruna, Seiko Myojin, N. Shimada
SIGGRAPH Asia 2015 Mobile Graphics and Interactive Applications. DOI: 10.1145/2818427.2818455
Abstract: Various ways of supporting and enhancing communication have been researched [Terken and Sturm 2010; Bergstrom and Karahalios 2007]. However, most of these are difficult to set up because they require special equipment, for example a worn microphone or a room equipped with a projector. In contrast, our system [Adachi et al. 2014] only requires devices with two cameras and a display, such as tablets and smartphones; because such devices are widespread and can handle both sensing and visualization, the system has the advantage of being easy to use. In addition, it can provide different (controlled) information to each individual, since every participant has their own display. We consider the system useful for brainstorming, group meetings, tabletop games with conversation, and so on.
{"title":"Mobile - based streaming system for omnidirectional contents","authors":"Masanori Hironishi, Wataru Motomura, Tomohito Yamamoto","doi":"10.1145/2818427.2818435","DOIUrl":"https://doi.org/10.1145/2818427.2818435","url":null,"abstract":"Many types of display systems have been developed for providing a spatial viewing experience, and surround sound systems, for expressing high levels of presence. However, these types of visual or auditory display systems sometimes require the allocation of large spaces for fixed, specialized equipment, and they tend to be expensive. On the other hand, mobile devices such as smartphones and tablets are now widespread. Thus, it may be possible to build an immersive reality system on mobile devices, which users can experience at any time and in any place.","PeriodicalId":328982,"journal":{"name":"SIGGRAPH Asia 2015 Mobile Graphics and Interactive Applications","volume":"21 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-11-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124089807","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
MAVIS: Mobile Acquisition and VISualization: a professional tool for video recording on a mobile platform
P. Watten, Marco Gilardi, Patrick Holroyd, Paul F. Newbury
SIGGRAPH Asia 2015 Mobile Graphics and Interactive Applications. DOI: 10.1145/2818427.2818448
Abstract: Professional video recording is a complex process which often requires expensive cameras and large amounts of ancillary equipment. With the advancement of mobile technologies, cameras on mobile devices have improved to the point where the quality of their output is sometimes comparable to that obtained from a professional video camera, and they are often used in professional productions. However, tools that allow professional users to access the information they need to control the technical quality of their filming, and to make informed decisions about what they are recording, are missing on mobile platforms. In this paper we present MAVIS (Mobile Acquisition and VISualization), a tool for professional filming on a mobile platform. MAVIS allows users to access information such as a colour vectorscope, a waveform monitor, false colouring, focus peaking, and other information needed to produce high-quality professional videos. This is achieved by exploiting the capabilities of modern mobile GPUs through a number of vertex and fragment shaders. Evaluation with professionals in the film industry shows that the app and its functionality are well received and that the output and usability of the application align with professional standards.
{"title":"Augmented reality using high fidelity spherical panorama with HDRI: demonstration","authors":"Zi Siang See, M. Billinghurst, A. Cheok","doi":"10.1145/2818427.2819696","DOIUrl":"https://doi.org/10.1145/2818427.2819696","url":null,"abstract":"This demonstration presents an experimental method and apparatus configuration for producing spherical panoramas with high dynamic range imaging (HDRI). Our method is optimized for providing high fidelity augmented reality (AR) image-based environment recognition for mobile devices. We developed HDRI method that requires single acquisition which extends dynamic range from digital negative, this approach is to be used for multiple angles necessary for reconstructing accurately reproduced spherical panorama with sufficient luminance.","PeriodicalId":328982,"journal":{"name":"SIGGRAPH Asia 2015 Mobile Graphics and Interactive Applications","volume":"75 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-11-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115761273","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
PLANWELL: spatial user interface for collaborative petroleum well-planning
A. Nittala, Nico Li, S. Cartwright, Kazuki Takashima, E. Sharlin, M. Sousa
SIGGRAPH Asia 2015 Mobile Graphics and Interactive Applications. DOI: 10.1145/2818427.2818443
Abstract: We present our prototype of PlanWell, a spatial augmented reality interface that facilitates collaborative field operations. PlanWell allows a central overseer (in a command and control center) and a remote explorer (an outdoor user in the field) to explore and collaborate within a geographical area. PlanWell provides the overseer with a tangible user interface (TUI) based on a 3D printout of surface geography, which acts as a physical representation of the region to be explored. Augmented reality is used to dynamically overlay properties of the region, as well as the presence of the remote explorer and their actions, onto the 3D representation of the terrain. The overseer can act directly on the TUI, and the overseer's actions are presented as dynamic AR visualizations superimposed on the explorer's view in the field. Although our interface could be applied to many domains, the PlanWell prototype was developed to facilitate petroleum engineering tasks such as well planning and coordination of drilling operations. Our paper describes the design and implementation of the current PlanWell prototype in the context of petroleum well planning and drilling, and discusses preliminary reflections from two focus-group sessions with domain experts.
{"title":"A platform for mobile augmented reality app creation without programming","authors":"Yiqun Li, Aiyuan Guo, Ching-Ling Chin","doi":"10.1145/2818427.2818452","DOIUrl":"https://doi.org/10.1145/2818427.2818452","url":null,"abstract":"There are many application areas on using smartphone to access relevant information. By taking a picture from a physical object using a smartphone app, we use image recognition technology to provide a quick link between the physical object and its relevant information. However, developing smartphone apps is expensive and time consuming. We developed a platform called MIMAS AR Creator, which is a web based software platform for automatic creation of smartphone apps for multimedia access using pictures captured from the smartphone camera and its GPS location. This platform allows people without programming skill to create a smartphone app in a few minutes with existing multimedia contents, shorten more than 90% of the app development time. The digital contents can be web pages, videos, audios, images, or 3D graphics with or without animation etc. The platform can be used for mobile advertising and retail marketing, mobile learning and tour guide etc. For example, with the created app running, people can point their phone camera to a picture on newspaper, product brochure, or physical product to obtain more relevant information provided by the advertisers or vendors. They can also point the phone camera to a building or monument to retrieve relevant historical information.","PeriodicalId":328982,"journal":{"name":"SIGGRAPH Asia 2015 Mobile Graphics and Interactive Applications","volume":"18 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-11-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116035322","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
A system to support the amateurs to take a delicious-looking picture of foods
Takao Kakimori, Makoto Okabe, Keiji Yanai, R. Onai
SIGGRAPH Asia 2015 Mobile Graphics and Interactive Applications. DOI: 10.1145/2818427.2818451
Abstract: Recently, many people take pictures of food at home or in restaurants and upload them to a social networking service (SNS) to share with friends. People want to take delicious-looking pictures of food, but this is often difficult because most of them have no idea how to do so. There are many photography techniques for composition [Liu et al. 2010], lighting, color, focus, and so on, and the appropriate techniques differ depending on the type of subject. The problem lies in the difficulty, for amateur photographers, of choosing and applying the appropriate ones from among so many techniques. In this paper, we focus on composition and develop a system that supports amateurs in taking a delicious-looking picture of food in a short time. Our target users are amateur food photographers, and our target photographic subjects are foods on dishes. Taking a picture with our system involves four steps: 1) the system automatically recognizes the foods on the dishes; 2) the system suggests the composition and camera tilt with which the user can take a delicious-looking picture; 3) the user arranges the foods and dishes on the table and sets the camera position and tilt; 4) finally, the user takes the picture.
A hand gesture control framework on smart glasses
Chih-Hsiang Yu, Wen-Wei Peng, Shys-Fan Yang-Mao, Yuan Wang, W. Chinthammit, H. Duh
SIGGRAPH Asia 2015 Mobile Graphics and Interactive Applications. DOI: 10.1145/2818427.2819695
Abstract: To overcome the limitations of WIMP interaction, many novel user interfaces have been discussed, such as multi-touch user interfaces [Reisman et al. 2009], tangible user interfaces (TUIs) [Jordà et al. 2007], organic user interfaces (OUIs) [Koh et al. 2011], and mid-air gesture detection [Benko and Wilson 2010]. These technologies have the potential to significantly impact the market for smart TVs, desktops, mobile phones, tablets, and wearable devices such as smart watches and smart glasses. Google Glass, for example, provides only a touch pad, located on the right side of the device, which supports touch gestures such as simple taps and finger slides. Hand gestures are not only one of the most powerful human-to-human communication modalities [Chen et al. 2007]; they can also change the way humans interact with computers. Therefore, implementing a hand gesture control framework on the glasses could provide an easy-to-use, intuitive, and flexible approach to interaction. In this paper, we propose a hand gesture control framework on smart glasses that supports a variety of gesture controls. The user can load a virtual 3D object through his fingers like a magician's trick, rotate the virtual 3D object by moving his hand, and zoom the virtual 3D object using a particular gesture sign.
Augmented creativity: bridging the real and virtual worlds to enhance creative play
Fabio Zünd, Mattia Ryffel, Stéphane Magnenat, A. Marra, Maurizio Nitti, Mubbasir Kapadia, Gioacchino Noris, Kenny Mitchell, M. Gross, R. Sumner
SIGGRAPH Asia 2015 Mobile Graphics and Interactive Applications. DOI: 10.1145/2818427.2818460
Abstract: Augmented Reality (AR) holds unique and promising potential to bridge between real-world activities and digital experiences, allowing users to engage their imagination and boost their creativity. We propose the concept of Augmented Creativity as employing AR on modern mobile devices to enhance real-world creative activities, support education, and open new interaction possibilities. We present six prototype applications that explore and develop Augmented Creativity in different ways, cultivating creativity through AR interactivity. Our coloring book app bridges coloring and computer-generated animation by allowing children to create their own character design in an AR setting. Our music apps provide a tangible way for children to explore different music styles and instruments in order to arrange their own version of popular songs. In the gaming domain, we show how to transform passive game interaction into active real-world movement that requires coordination and cooperation between players, and how AR can be applied to city-wide gaming concepts. We employ the concept of Augmented Creativity to author interactive narratives with an interactive storytelling framework. Finally, we examine how Augmented Creativity can provide a more compelling way to understand complex concepts, such as computer programming.
A hand gesture control framework on smart glasses
Chih-Hsiang Yu, Wen-Wei Peng, Shys-Fan Yang-Mao, Yuan Wang, W. Chinthammit, H. Duh
SIGGRAPH Asia 2015 Mobile Graphics and Interactive Applications. DOI: 10.1145/2818427.2818444
Abstract: In this paper, we propose a hand gesture control framework for smart glasses. Three different camera configurations are presented for detecting the hand region, and Moore's neighbor tracing algorithm detects the hand contour efficiently and automatically. We not only refined the skin-color model but also improved the Chamfer matching method for robust and effective gesture recognition. A demonstration has been implemented using the framework: several gestures are pre-defined for functions such as selecting a virtual 3D object, rotating it, zooming in or out, and changing its display properties.