Proceedings of the 8th ACM SIGGRAPH Conference on Motion in Games: Latest Articles

Carpet unrolling for character control on uneven terrain
Pub Date: 2015-11-01 | DOI: 10.1145/2822013.2822031
Mark Miller, Daniel Holden, R. Al-Ashqar, Christophe Dubach, Kenny Mitchell, T. Komura
Abstract: We propose a type of relationship descriptor, based on carpet unrolling, that computes the joint positions of a character as the sum of relative vectors originating from local coordinate systems embedded on the surface of a carpet. Given a terrain the character is to walk over, the carpet is unrolled over the terrain's surface; it adapts to the geometry of the terrain and curves according to the character's trajectory. Because the trajectories of the body parts are computed as a weighted sum of the relative vectors, the character adapts smoothly to the elevation of the terrain and the horizontal curves of the carpet. The carpet relationship descriptors are easy to parallelize, and hundreds of characters can be animated in real time on the GPU, which makes the method applicable to real-time applications such as computer games.
Cited by: 1
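The core idea in the abstract, recovering a joint position as a weighted sum of relative vectors expressed in local frames on the carpet surface, can be sketched as follows. This is not the authors' code; the frame origins, rotations, offsets, and weights are made-up example data, and the function name `joint_position` is illustrative.

```python
import numpy as np

def joint_position(origins, rotations, offsets, weights):
    """Blend a joint position from local carpet frames.

    origins: (n, 3) frame origins on the carpet surface
    rotations: (n, 3, 3) frame orientations
    offsets: (n, 3) relative vectors of the joint in each local frame
    weights: (n,) blend weights
    """
    # Express each relative vector in world space: origin + R @ offset.
    world = origins + np.einsum('nij,nj->ni', rotations, offsets)
    w = weights / weights.sum()  # normalise the blend weights
    return (w[:, None] * world).sum(axis=0)

# Two flat carpet frames with the joint one unit above each:
origins = np.array([[0., 0., 0.], [1., 0., 0.]])
rotations = np.stack([np.eye(3), np.eye(3)])
offsets = np.array([[0., 1., 0.], [0., 1., 0.]])
weights = np.array([0.5, 0.5])
print(joint_position(origins, rotations, offsets, weights))  # [0.5 1.  0. ]
```

When the carpet deforms over uneven terrain, only `origins` and `rotations` change; the stored `offsets` stay fixed, which is what lets the motion adapt smoothly to elevation changes.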
Automatic and adaptable registration of live RGBD video streams
Pub Date: 2015-10-01 | DOI: 10.1145/2822013.2822027
Afsaneh Rafighi, S. Seifi, Oscar E. Meruvia Pastor
Abstract: We introduce DeReEs-4V, an algorithm that receives two separate RGBD video streams and automatically produces a unified scene through RGBD registration within a few seconds. The motivation behind this solution is to allow game players to place depth-sensing cameras at arbitrary locations and capture any scene in which the views of the sensors partially overlap. A typical way to combine partially overlapping views from multiple cameras is visual calibration using external markers within the field of view of both cameras. Calibration can be time-consuming, may require fine-tuning, and interrupts gameplay; if a camera is even slightly moved or bumped, the calibration typically needs to be repeated from scratch. In this article we demonstrate how RGBD registration can be used to automatically find, while the system is running and without calibration, a 3D viewing transformation that matches the view of one camera to the other. To validate this approach, we compare our method against standard checkerboard-target calibration, with a thorough examination of system performance under different scenarios. The system supports any application that might benefit from wider-field-of-view video capture. Our results show that the system is robust to camera movement while simultaneously capturing and registering live point clouds from two depth-sensing cameras.
Cited by: 6
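The registration step the abstract describes amounts to estimating a rigid transformation that maps one camera's point cloud onto the other's. As an illustrative sketch (not the DeReEs-4V algorithm itself, which also handles correspondence search on live streams), the standard Kabsch least-squares solution for known correspondences looks like this; all data below is synthetic.

```python
import numpy as np

def rigid_transform(src, dst):
    """Least-squares rigid transform (R, t) with dst ≈ R @ src + t (Kabsch)."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    # Cross-covariance of the centred correspondences.
    H = (src - cs).T @ (dst - cd)
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection in the SVD solution.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cd - R @ cs
    return R, t

# Synthetic check: rotate a small cloud 90 degrees about z, shift it, recover.
rng = np.random.default_rng(0)
src = rng.random((10, 3))
R_true = np.array([[0., -1., 0.], [1., 0., 0.], [0., 0., 1.]])
t_true = np.array([2., 0., 1.])
dst = src @ R_true.T + t_true
R, t = rigid_transform(src, dst)
print(np.allclose(R, R_true), np.allclose(t, t_true))  # True True
```

In a live two-camera setting, correspondences would come from feature matching across the overlapping region, and the recovered (R, t) serves as the 3D viewing transformation aligning one stream to the other.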
Proceedings of the 8th ACM SIGGRAPH Conference on Motion in Games (front matter)
DOI: 10.1145/2822013
Cited by: 1