{"title":"Skunk","authors":"Nicholas Augello, Arunachalam Somasundaram","doi":"10.1145/3329715.3338876","DOIUrl":"https://doi.org/10.1145/3329715.3338876","url":null,"abstract":"This talk presents DreamWorks' fur motion system Skunk which is used to produce motion for fur on characters, garments, and props. Skunk's ease of use, speed, stability, interactive nature, flexible framework, layered simulation approach, on the fly fur setup capabilities, consistency, and artist controls pushed boundaries of fur motion and interaction, and expanded artist usage at DreamWorks. The system was widely used in the film How to Train Your Dragon: The Hidden World, the short Bilby, and is being used on current feature films and shorts at DreamWorks.","PeriodicalId":365444,"journal":{"name":"Proceedings of the 2019 Digital Production Symposium","volume":"40 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-07-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124838160","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Distributed multi-context interactive rendering","authors":"Alex Gerveshi, S. Looper","doi":"10.1145/3329715.3338878","DOIUrl":"https://doi.org/10.1145/3329715.3338878","url":null,"abstract":"By enabling artists to work interactively with multiple active renders from a single application, new lighting and surfacing workflows were made possible. This technique was implemented by replacing Katana's interactive-render mechanism, and by leveraging Arras, DreamWorks' in-house cloud computation system.","PeriodicalId":365444,"journal":{"name":"Proceedings of the 2019 Digital Production Symposium","volume":"276 1-2 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-07-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123719749","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Physically based lens flare rendering in \"The Lego Movie 2\"","authors":"E. Pekkarinen, Michael Balzer","doi":"10.1145/3329715.3338881","DOIUrl":"https://doi.org/10.1145/3329715.3338881","url":null,"abstract":"We present our approach for incorporating realistic lens flare rendering in a production renderer based on a previously presented physically based lens simulation technique [Hullin et al. 2012]. We describe the approximations and sampling techniques behind efficient lens flare rendering, in addition to introducing flexible artist controls and workflows for this purpose. Using \"The Lego Movie 2: The Second Part\" as a case study, we show that these approaches are efficient and work well in a production environment.","PeriodicalId":365444,"journal":{"name":"Proceedings of the 2019 Digital Production Symposium","volume":"630 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-07-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116477752","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"ASWF technical advisory committee: how to enable an open source community","authors":"D. Heckenberg, Jean-François Panisset, E. Olin","doi":"10.1145/3329715.3338879","DOIUrl":"https://doi.org/10.1145/3329715.3338879","url":null,"abstract":"The Technical Advisory Committee (TAC) coordinates technical efforts in the Academy Software Foundation (ASWF)[Academy Software Foundation 2019]. Launched in August 2018, the ASWF's purpose is to support open source software development in the motion picture content creation industry. Software engineers from the Foundation's members and software projects form the TAC. It plays a key role in establishing and promulgating best practices, proposing and scrutinizing candidate projects and establishing shared resoures such as Continuous Integration infrastructure.","PeriodicalId":365444,"journal":{"name":"Proceedings of the 2019 Digital Production Symposium","volume":"38 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-07-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133352908","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Millefiori: a USD-based sequence editor","authors":"Victor Yudin, Gregory Ducatel","doi":"10.1145/3329715.3338882","DOIUrl":"https://doi.org/10.1145/3329715.3338882","url":null,"abstract":"Millefiori is a visual effects application designed to allow users to view and edit a series of large scenes, leveraging Pixar Universal Scene Description in its core, and Qt/QML for the UI components. While Millefiori was initiated as a sequence editor, its USD-based core has since formed the basis of the entire pipeline at Mill Film, from editing a USD stage to generating a review QuickTime. Development of the technology was a successful collaboration of developers led by Mill Film, MPC R&D, and Technicolor Research and Innovation.","PeriodicalId":365444,"journal":{"name":"Proceedings of the 2019 Digital Production Symposium","volume":"11 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-07-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122065887","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Jumping in at the deep end: how to experiment with machine learning in post-production software","authors":"D. Ring, J. Barbier, Guillaume Gales, Ben Kent, Sebastian Lutz","doi":"10.1145/3329715.3338880","DOIUrl":"https://doi.org/10.1145/3329715.3338880","url":null,"abstract":"Recent years has seen an explosion in Machine Learning (ML) research. The challenge is now to transfer these new algorithms into the hands of artists and TD's in visual effects and animation studios, so that they can start experimenting with ML within their existing pipelines. This paper presents some of the current challenges to experimentation and deployment of ML frameworks in the post-production industry. It introduces our open-source \"ML-Server\" client / server system as an answer to enabling rapid prototyping, experimentation and development of ML models in post-production software. Data, code and examples for the system can be found on the GitHub repository page: https://github.com/TheFoundryVisionmongers/nuke-ML-server","PeriodicalId":365444,"journal":{"name":"Proceedings of the 2019 Digital Production Symposium","volume":"120 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-07-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121842333","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Making you matter: creating interactive VR narratives through experimentation and learning","authors":"Larry Cutler, Eric Darnell, N. Dirksen, Michael Hutchinson, Scott Peterson, R. Schiewe, Wei Wang","doi":"10.1145/3329715.3338883","DOIUrl":"https://doi.org/10.1145/3329715.3338883","url":null,"abstract":"Virtual Reality (VR) is a transformative medium for narrative storytelling where content creators can place an audience member inside the story, give them a role to play, and ultimately make them matter to the characters. Immersive storytelling is fundamentally different from film and games. It requires a new creative toolset that is still in its infancy compared to other entertainment mediums. We provide a behind the scenes look at our many experimentations, failures, and learnings in developing interactive VR animated narratives spanning four released projects: Invasion!, Asteroids!, Crow: The Legend, and Bonfire. We delve into cinematic techniques for VR, including staging, movement mechanics, and directing the viewer's eye. We explore the role the viewer plays in each of our pieces. Finally, we dive into how we make you matter through nonverbal communication, interactivity that supports the narrative structure, non-linear storytelling, and character AI.","PeriodicalId":365444,"journal":{"name":"Proceedings of the 2019 Digital Production Symposium","volume":"23 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-07-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129637411","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Scriptable character fx solution","authors":"Cameron Black, Nicholas Burkard, Dmitriy V. Pinskiy","doi":"10.1145/3329715.3338877","DOIUrl":"https://doi.org/10.1145/3329715.3338877","url":null,"abstract":"We would like to present a scriptable interactive data manipulation tool, heavily used on Disney's Moana and Ralph Breaks the Internet. Its expression-driven interface makes it a versatile \"Swiss Army Knife\" for Technical Animation: a single tool with many functions, which could be applied to hair, cloth, and final cleanup tasks. Consequently, this provides a tremendous benefit for both developers and end users. The single code base can be maintained and upgraded efficiently. The artists, familiar with the tool in the context of one task, can take full advantage of the flexible interface and easily apply the tool to another task.","PeriodicalId":365444,"journal":{"name":"Proceedings of the 2019 Digital Production Symposium","volume":"2012 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-07-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132137695","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Sharp kelvinlets: elastic deformations with cusps and localized falloffs","authors":"F. D. Goes, Doug L. James","doi":"10.1145/3329715.3338884","DOIUrl":"https://doi.org/10.1145/3329715.3338884","url":null,"abstract":"In this work, we present an extension of the regularized Kelvinlet technique suited to non-smooth, cusp-like edits. Our approach is based on a novel multi-scale convolution scheme that layers Kelvinlet deformations into a finite but spiky solution, thus offering physically based volume sculpting with sharp falloff profiles. We also show that the Laplacian operator provides a simple and effective way to achieve elastic displacements with fast far-field decay, thereby avoiding the need for multi-scale extrapolation. Finally, we combine the multi-scale convolution and Laplacian machinery to produce Sharp Kelvinlets, a new family of analytic fundamental solutions of linear elasticity with control over both the locality and the spikiness of the brush profile. Closed-form expressions and reference implementation are also provided.","PeriodicalId":365444,"journal":{"name":"Proceedings of the 2019 Digital Production Symposium","volume":"115 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-07-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132885768","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"SpLit","authors":"Orde Stevanoski, Larry Gritz","doi":"10.1145/3329715.3338885","DOIUrl":"https://doi.org/10.1145/3329715.3338885","url":null,"abstract":"In this talk we present SpLit, a lighting manipulation tool created at Sony Pictures Imageworks designed to encourage experimentation and creativity when creating and manipulating CG lighting via a novel and artist friendly visual interface. We discovered that artists are often discouraged from experimenting and making sweeping lighting changes due to the complexity of the currently available user interfaces for CG lighting. Manipulating, rendering and evaluating the decisions is a cyclical process that requires multiple clicks through various UI elements, and the rendering takes time to resolve to a point where the results of the changes can be evaluated. With SpLit we focused on tightening the manipulation and decision-making loop and bringing it into the realm of realtime to encourage experimentation and creative freedom.","PeriodicalId":365444,"journal":{"name":"Proceedings of the 2019 Digital Production Symposium","volume":"189 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-07-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115596957","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}