{"title":"A content-based synchronization approach for timing description in EnhancedTV","authors":"Hyun-Jeong Yim, Y. Choy, Soon-Bum Lim","doi":"10.1145/1666778.1666786","DOIUrl":"https://doi.org/10.1145/1666778.1666786","url":null,"abstract":"In EnhancedTV (ETV) where diverse multimedia data appear in scenes, synchronization between the video stream and other multimedia data, including graphics, is essential. Currently, the most widely used method of synchronizing video streams and other multimedia data in ETV is to match absolute time values with each other.","PeriodicalId":180587,"journal":{"name":"ACM SIGGRAPH Conference and Exhibition on Computer Graphics and Interactive Techniques in Asia","volume":"27 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2009-12-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126941636","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Climate shifts","authors":"Christa Erickson","doi":"10.1145/1665137.1665148","DOIUrl":"https://doi.org/10.1145/1665137.1665148","url":null,"abstract":"We are all affected by our immediate context and its political, environmental, cultural, and temporal influences. Climate Shifts juxtaposes various locations around the globe and their points of view through local news headlines and weather data. Differences in concerns and perspectives emerge, sometimes about the same world events, allowing a glimpse into the collective psyche of each place.","PeriodicalId":180587,"journal":{"name":"ACM SIGGRAPH Conference and Exhibition on Computer Graphics and Interactive Techniques in Asia","volume":"24 1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2009-12-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130637905","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Assassin's Creed 2","authors":"István Zorkóczy","doi":"10.1145/1665208.1665212","DOIUrl":"https://doi.org/10.1145/1665208.1665212","url":null,"abstract":"Immersion in the world and intrigues of the Assassin's Creed 2 videogame, the direct sequel to Assassin's Creed, which sold eight million units worldwide. Dive into the Italian Renaissance and beautifully recreated 15th century Venice, in the midst of a mysterious street carnival, where you will meet our new master assassin, Ezio Auditore da Firenze, and discover his new \"art.\" Follow him on his quest for vengeance to reveal a secular conspiracy and fight the masquerades of the Italian Renaissance.","PeriodicalId":180587,"journal":{"name":"ACM SIGGRAPH Conference and Exhibition on Computer Graphics and Interactive Techniques in Asia","volume":"196 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2009-12-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131182199","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"AMF caterpillar","authors":"Filip Engstrom","doi":"10.1145/1665208.1665210","DOIUrl":"https://doi.org/10.1145/1665208.1665210","url":null,"abstract":"The Mill Los Angeles team of 3D artists relished the opportunity to work on the latest commercial for Swedish pension company AMF. Filip Engstrom directed the spot, which features a host of fully CG, photo-real insects. The star of the ad is a caterpillar who becomes forlorn until he transforms into a butterfly.","PeriodicalId":180587,"journal":{"name":"ACM SIGGRAPH Conference and Exhibition on Computer Graphics and Interactive Techniques in Asia","volume":"115 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2009-12-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132564481","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A contrast perception matching based HDR tone-mapping operator","authors":"Zhongkang Lu, S. Rahardja","doi":"10.1145/1666778.1666823","DOIUrl":"https://doi.org/10.1145/1666778.1666823","url":null,"abstract":"In 2008, Mantiuk et al proposed a display adaptive tone-mapping operator [Mantiuk et al. 2008]. It adapted Daly's visible differences predictor [Daly 1993] and Wilson's transducer function [Wilson 1980; Wilson and Gelb 1984] to measure the response of contrast in human visual system. The transducer function is expressed as:\u0000 where ΦQ(C) represents the response of a contrast stimulus. C is the physical contrast of a stimulus, and S is the sensitivity of this type of contrast stimuli with certain frequency, orientation, background luminance values, etc. Q is a empirical parameter that falls in a wide range of [2.0, 6.0] across different experiments. It suggests the value of Q is adaptive to experimental conditions. A simple adaption of Equ. 1 with fixed Q may bring unexpected errors.","PeriodicalId":180587,"journal":{"name":"ACM SIGGRAPH Conference and Exhibition on Computer Graphics and Interactive Techniques in Asia","volume":"27 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2009-12-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128300626","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Crowd simulation in Astroboy","authors":"E. Tse, Justin Lo","doi":"10.1145/1667146.1667202","DOIUrl":"https://doi.org/10.1145/1667146.1667202","url":null,"abstract":"One of the major technical challenges in the animated film Astroboy was the creation of believable crowds of town citizens in the battle arena scene. In order to speed up the performance of converting citizen behavior into RIB format, the pre-baked RIB method proposed in Tutorial on Procedural Primitives [Hery and Sutton 2001] was applied. By using motion-editing techniques, our crowd characters are able to interact with the environment efficiently. Making use of techniques of Procedural RIB Generation, lighting artists were able to apply secondary masks on crowd characters, and the new motion which is generated on-the-fly during simulation can be rendered efficiently.","PeriodicalId":180587,"journal":{"name":"ACM SIGGRAPH Conference and Exhibition on Computer Graphics and Interactive Techniques in Asia","volume":"160 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2009-12-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131730918","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Artificial nature: fluid space","authors":"H. Ji, Graham Wakefield","doi":"10.1145/1665137.1665153","DOIUrl":"https://doi.org/10.1145/1665137.1665153","url":null,"abstract":"How does artificial-life art adapt to its environment? What is the significance of a computational ecosystem proposed as contemporary art? These are some of the ideas examined in this bio-inspired immersive art installation.","PeriodicalId":180587,"journal":{"name":"ACM SIGGRAPH Conference and Exhibition on Computer Graphics and Interactive Techniques in Asia","volume":"29 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2009-12-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115388450","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Virtual Haptic Radar","authors":"A. Zerroug, Á. Cassinelli, M. Ishikawa","doi":"10.1145/1667146.1667158","DOIUrl":"https://doi.org/10.1145/1667146.1667158","url":null,"abstract":"We present here a first prototype of the Virtual Haptic Radar (VHR), a wearable device helping actors become aware of the presence of invisible virtual objects in their path when evolving in a virtual studio (such as a \"bluescreen\" filming stage [Figure 1]). The VHR is a natural extension of the Haptic Radar (HR) and its principle [Cassinelli et al. 2006] in the realm of virtual reality: while each module of the HR had a small vibrator and a rangefinder to measure distance to real obstacles, the VHR module lacks the rangefinder but accommodates instead a (cheap) ultrasound-based indoor positioning system that gives it the ability to know exactly where it is situated relatively to an external frame of reference.","PeriodicalId":180587,"journal":{"name":"ACM SIGGRAPH Conference and Exhibition on Computer Graphics and Interactive Techniques in Asia","volume":"77 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2009-12-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115278589","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Life at the witch trails","authors":"Natalie Bewernitz, Marek Goldowski","doi":"10.1145/1665137.1665143","DOIUrl":"https://doi.org/10.1145/1665137.1665143","url":null,"abstract":"Life at the witch trails is based on the idea of creating \"living\" structures through sound. Video material from x/y-stereo displays visualizes the phase changing from two-channel audio signals. The sound source is a special audio composition that can not be realized without direct visualization. It contains full-on, sound-dependent motion dynamics and forms complex \"cathode-ray objects\", which allow direct (not delayed) visual access to the smallest details of the composition. The representation is not limited by the pictures-per-second time frame of television and computer technology. The interconnection of the aural and visual senses arises in an immediate way, and the visualization of sound obtains a new meaning.","PeriodicalId":180587,"journal":{"name":"ACM SIGGRAPH Conference and Exhibition on Computer Graphics and Interactive Techniques in Asia","volume":"464 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2009-12-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125823670","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Glanular materials rendering based on radiance caching","authors":"Toshihisa Yamahata, Yuuki Uranishi, H. Sasaki, Y. Manabe, K. Chihara","doi":"10.1145/1666778.1666817","DOIUrl":"https://doi.org/10.1145/1666778.1666817","url":null,"abstract":"The animation of such granular material as sand and grain is difficult to render in computer graphics. There are no standard methods for modeling their motions, because their non-linear behavior, which is ruled by the loss of energy, occurs when the discrete particles collide. Therefore, methods for rendering granular materials have not been studied widely. In this paper, we present a new method for rendering photo-realistic granular materials. To model the motion of granular material, we define granular materials as the mass of discrete particles. Then the scattering of light needed for photo-realistic rendering is formulated, and we propose method for efficiently evaluating radiance in granular material based on a radiance caching method.","PeriodicalId":180587,"journal":{"name":"ACM SIGGRAPH Conference and Exhibition on Computer Graphics and Interactive Techniques in Asia","volume":"44 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2009-12-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126984154","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}