{"title":"Natural user interface for integral photography of dispersion-rendered diamond","authors":"Nahomi Maki, K. Yanaka","doi":"10.1145/2856400.2876013","DOIUrl":"https://doi.org/10.1145/2856400.2876013","url":null,"abstract":"The color of a diamond depends on how its refractive index varies with wavelength, so we developed a dispersion rendering system based on wavelength division [Maki et al. 2014]. Although this technique can reproduce various rainbow-like colors in the stone, another technology is necessary to reproduce the brilliance of a diamond, which is caused by rays that enter from outside and are reflected and refracted many times at the surfaces. We introduced extended fractional view (EFV) integral photography (IP) [Yanaka 2008], which can be regarded as a display method for a four-dimensional light field [Levoy et al. 1996; Gortler et al. 1996]. We developed an IP system that reproduces a three-dimensional image whose color changes with the direction from which an observer looks, within a primary viewing zone of about 30 degrees [Maki et al. 2015]. However, the observer cannot view the diamond from directions beyond this viewing zone. 
To remove this limitation, we developed a more sophisticated system in which viewers can look at the diamond from any direction they like by naturally rotating it with their hand.","PeriodicalId":207863,"journal":{"name":"Proceedings of the 20th ACM SIGGRAPH Symposium on Interactive 3D Graphics and Games","volume":"13 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-02-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125336533","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Adaptive impulse response modeling for interactive sound propagation","authors":"Carl Schissler, Dinesh Manocha","doi":"10.1145/2856400.2856414","DOIUrl":"https://doi.org/10.1145/2856400.2856414","url":null,"abstract":"We present novel techniques to accelerate the computation of impulse responses for interactive sound rendering. Our formulation is based on geometric acoustic algorithms that use ray tracing to compute the propagation paths from each source to the listener in large, dynamic scenes. To accelerate the generation of realistic acoustic effects in multi-source scenes, we introduce two novel concepts: an impulse response cache and an adaptive frequency-driven ray tracing algorithm that exploits psychoacoustic characteristics of the impulse response length. Compared to prior approaches, we trace fewer rays while maintaining high simulation fidelity for real-time applications. Furthermore, our approach can handle highly reverberant scenes and high-dynamic-range sources. We demonstrate its application in many scenarios and have observed a 5x speedup in computation time and an approximately two-orders-of-magnitude reduction in memory overhead compared to previous approaches. We also present the results of a preliminary user evaluation of our approach.","PeriodicalId":207863,"journal":{"name":"Proceedings of the 20th ACM SIGGRAPH Symposium on Interactive 3D Graphics and Games","volume":"40 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-02-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125825828","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Barehanded music: real-time hand interaction for virtual piano","authors":"Hui Liang, Jin Wang, Qian Sun, Yong-Jin Liu, Junsong Yuan, Jun Luo, Ying He","doi":"10.1145/2856400.2856411","DOIUrl":"https://doi.org/10.1145/2856400.2856411","url":null,"abstract":"This paper presents an efficient data-driven approach to tracking fingertips and detecting finger tapping for a virtual piano using an RGB-D camera. We collect 7200 depth images covering the most common finger articulations for playing the piano, and train a random regression forest using depth context features of randomly sampled pixels in the training images. In the online tracking stage, we first segment the hand from the plane it contacts by fusing information from the color and depth images. We then use the trained random forest to estimate the 3D positions of the fingertips and wrist in each frame, and predict finger tapping from the estimated fingertip motion. Finally, we build a kinematic chain and recover the articulation parameters for each finger. In contrast to existing hand tracking algorithms, which often require the hands to be in the air and unable to interact with physical objects, our method is designed for hand interaction with planar objects, as desired for the virtual piano application. Using our prototype system, users can put their hands on a desk, move them sideways, and tap their fingers on the desk, as if playing a real piano. 
Preliminary results show that our method can recognize most of a beginner's piano-playing gestures in real time for soothing rhythms.","PeriodicalId":207863,"journal":{"name":"Proceedings of the 20th ACM SIGGRAPH Symposium on Interactive 3D Graphics and Games","volume":"34 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-02-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114889757","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Proceedings of the 20th ACM SIGGRAPH Symposium on Interactive 3D Graphics and Games","authors":"Daniel G. Aliaga, Manuel M. Oliveira, A. Varshney, Chris Wyman","doi":"10.1145/2856400","DOIUrl":"https://doi.org/10.1145/2856400","url":null,"abstract":"We are pleased to introduce the proceedings of the 2010 ACM SIGGRAPH Symposium on Interactive 3D Graphics and Games (I3D 2010). Having started as a workshop in 1986, I3D became an annual event in 2005, and is now in its 14th edition. The symposium has long been recognized as a premier forum for computer graphics and interactive techniques. In 2010, I3D was held in Bethesda, Maryland, from February 19th to 21st. These proceedings bring you a collection of 23 fine papers selected from 71 submissions from 19 countries on 4 continents. This was made possible by the hard work of the authors, and by the dedication of 78 program committee members and 8 external reviewers. Through their diligent work, they provided detailed reviews and engaged in long online discussions to guarantee a fair evaluation for all submissions. No limit was imposed on the number of accepted papers; decisions were based solely on the submissions' merits. Many committee members volunteered additional hours to ensure that conditionally accepted papers could reach their full potential. We would like to congratulate the authors on their achievements, and deeply thank all reviewers for their magnificent work and devotion throughout this process. One of the hallmarks of I3D has been its cozy environment, which provides incredible opportunities for interaction among all participants. We hope you all can make the most of these opportunities: learn from others, teach others, and find new collaborators. Get inspired by these conversations, and by the technical content of the conference. And make sure you return next year to present your greatest new work, and for more interaction. 
Oh, and make sure that your colleagues get to know the joys of I3D!","PeriodicalId":207863,"journal":{"name":"Proceedings of the 20th ACM SIGGRAPH Symposium on Interactive 3D Graphics and Games","volume":"108 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-02-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115623905","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}