{"title":"Touch-less interaction smartphone on go!","authors":"Zhihan Lv, S. Réhman","doi":"10.1145/2542302.2542336","DOIUrl":"https://doi.org/10.1145/2542302.2542336","url":null,"abstract":"A smartphone touch-less interaction based on mixed hardware and software is proposed in this work. The software application renders circle menu application graphics and status information using smart phone's screen, audio. Augmented reality image rendering technology is employed for a convenient finger-phone interaction. The users interact with the application using finger gesture motion behind the camera, which trigger the interaction event and generate activity sequences for interactive buffers. The combination of Contour based Template Matching (CTM) and Tracking-Learning-Detection (TLD) provides a core support for hand-gesture interaction by accurately detecting and tracking the hand gesture.","PeriodicalId":269059,"journal":{"name":"SIGGRAPH Asia 2013 Posters","volume":"61 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-11-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116775338","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Concert viewing headphones","authors":"M. Hamanaka, SeungHee Lee","doi":"10.1145/2542302.2542304","DOIUrl":"https://doi.org/10.1145/2542302.2542304","url":null,"abstract":"We designed concert viewing headphones that are equipped with a projector, an inclination sensor on the top of the headphones, and a distance sensor on the outside right speaker (Figure 1). Previously reported headphones with sensors for detecting the direction the user is facing or the location of the head can escalate the musical presence and create a realistic impression, but they do not control the volumes and panoramic potentiometers of each part in accordance with the user's wishes [Pachet and Delerue 2000]. We previously developed sound scope headphones that enable users to change the sound mixing depending on their head direction [Hamanaka and Lee 2009]. However, the system did not have handle images. In contrast, our headphones let a user listening and watching to music scope a particular part that he or she wants to hear and see. For example, when listening to jazz, one might want to clearly hear and see the guitar or sax. By moving your head left or right, you can hear the guitar or sax sound from a frontal position. By simply putting your hand behind your ear, you can adjust the distance sensor on the headphones and focus on a particular part you want to hear and see.","PeriodicalId":269059,"journal":{"name":"SIGGRAPH Asia 2013 Posters","volume":"84 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-11-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121498191","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Progressive stochastic reconstruction technique for cryo electron tomography","authors":"Beata Turoňová, L. Marsalek, Tomás Davidovic, P. Slusallek","doi":"10.1145/2542302.2542316","DOIUrl":"https://doi.org/10.1145/2542302.2542316","url":null,"abstract":"Cryo Electron Tomography (cryoET) plays an essential role in Structural Biology, as it is the only technique that allows to study the structure and intracellular distribution of large macromolecular complexes in their (close to) native environment. A major limitation of cryoET is the highest achievable resolution, currently at around 3 nm, which prevents its application to smaller complexes and in turn to a wider range of important biological questions.","PeriodicalId":269059,"journal":{"name":"SIGGRAPH Asia 2013 Posters","volume":"17 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-11-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130588790","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"ArtTorchlight: an exploratory way of viewing paintings using handheld projectors and smart phones","authors":"Chieh-Ming Chang, Yennun Huang","doi":"10.1145/2542302.2542345","DOIUrl":"https://doi.org/10.1145/2542302.2542345","url":null,"abstract":"Handheld projectors have lately been investigated as mobile interactive displays devices. In this paper we present ArtTorchlight, a novel interactive device to explore large digitalized paintings. Combining with handheld projectors and smart phones, ArtTorchlight is able to sense user gestures and utilize these sensing data to display explored content. Finally we create a scenario where visitors can use this device in an exploratory and virtual way to view large paintings in a museum.","PeriodicalId":269059,"journal":{"name":"SIGGRAPH Asia 2013 Posters","volume":"63 12","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-11-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114042071","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Sparse BRDF approximation using compressive sensing","authors":"Beno Zupančič, C. Soler","doi":"10.1145/2542302.2542352","DOIUrl":"https://doi.org/10.1145/2542302.2542352","url":null,"abstract":"BRDF acquisition is a tedious operation, since it requires measuring 4D data. On one side of the spectrum lie explicit methods, which perform many measurements to potentially produce very accurate reectance data after interpolation [Matusik et al. 2003]. These methods are generic but practically difficult to setup and produce high volume data. On the other side, acquisition methods based on parametric models implicitly reduce the infinite dimensionality of the BRDF space to the number of parameters, allowing acquisition with few samples. However, parametric methods require non linear optimization. They become unstable when the number of parameters is large, with no guaranty that a given parametric model can ever fit particular measurements.","PeriodicalId":269059,"journal":{"name":"SIGGRAPH Asia 2013 Posters","volume":"6 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-11-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122136489","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Teleport: space navigation by detecting the self-motion of a mobile device","authors":"Shen-Chi Chen, Chia-Wei Hsu, Shih-Yao Lin, Kevin Lin, Y. Hung","doi":"10.1145/2542302.2542310","DOIUrl":"https://doi.org/10.1145/2542302.2542310","url":null,"abstract":"Inspired by the Boo's doors in the Disney's cartoon Monsters Inc. which are used as pathways to different spaces, we propose the technique \"Teleport\" simulating a virtual gateway for users to go anywhere they want. Users can explore the space using one mobile device with the first person navigation to enhance the immersive experience of the space. To support the first person navigation inside the space 3D model we created, mobile-vision and built-in orientation sensors allow mobile device to detect the self-motion including user's view angles (pitch/roll/yaw), movement (forward/backward/stay) and moving speed. After assessing these inputs, the visualization of the virtual space can exactly reflect the corresponding view according the user's movement. Since our method does not require any outside infrastructure, it can be applied to any existing smartphone or Google glass. Therefore, this technique is not only promising and economically practical to many applications but also good for environments interaction.","PeriodicalId":269059,"journal":{"name":"SIGGRAPH Asia 2013 Posters","volume":"6 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-11-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128150453","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"SIGGRAPH Asia 2013 Posters","authors":"","doi":"10.1145/2542302","DOIUrl":"https://doi.org/10.1145/2542302","url":null,"abstract":"","PeriodicalId":269059,"journal":{"name":"SIGGRAPH Asia 2013 Posters","volume":"32 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126497084","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}