Authors: Rui Wang, Ping Li, Bin Sheng, Hanqiu Sun, E. Wu
Published in: Proceedings of the 15th ACM SIGGRAPH Conference on Virtual-Reality Continuum and Its Applications in Industry - Volume 1, December 3, 2016
DOI: 10.1145/3013971.3013986
Real-time video stylization using spatial-temporal Gabor filtering
This paper describes a new video stylization approach that achieves non-photorealistic rendering effects using highly efficient spatial-temporal Gabor filtering. An edge extraction algorithm is developed to detect long, coherent edges, to which the human visual system is sensitive. Nonlinear diffusion is then applied to remove unimportant details. Our approach extends optical flow computation to construct a Gabor flow that represents pixel similarity and preserves temporal coherence when applied to video sequences. In particular, our video stylization operates in a spatiotemporal manner to achieve temporal coherence in the resulting animations. Real-time performance is achieved through a highly parallel implementation on modern graphics hardware (GPUs). Our video stylization can therefore be applied naturally to real-time video communication and interactive video-based rendering. Experimental results demonstrate the high quality of our real-time video stylization.
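The abstract's core building block is the 2D Gabor filter: a Gaussian envelope modulating an oriented sinusoid, which responds strongly to edges at a chosen orientation. As a rough illustration of that idea (not the paper's actual spatial-temporal or GPU implementation), the sketch below builds a Gabor kernel and applies it to an image with a naive convolution; all function names and parameter defaults here are hypothetical choices for the example.

```python
import numpy as np

def gabor_kernel(size=9, sigma=2.0, theta=0.0, lam=4.0, psi=0.0):
    """Build a 2D Gabor kernel: a Gaussian envelope times an oriented cosine.

    size  -- kernel side length (odd)
    sigma -- std. dev. of the Gaussian envelope
    theta -- orientation of the sinusoid, in radians
    lam   -- wavelength of the sinusoid
    psi   -- phase offset
    (Illustrative defaults; the paper does not specify these values.)
    """
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1].astype(float)
    # Rotate coordinates so the carrier wave runs along direction theta.
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(xr**2 + yr**2) / (2.0 * sigma**2))
    carrier = np.cos(2.0 * np.pi * xr / lam + psi)
    return envelope * carrier

def filter_image(img, kernel):
    """Naive 'valid'-mode 2D correlation of a grayscale image with the kernel.

    A real-time system would instead run this per-pixel on the GPU,
    typically steering theta per pixel by a flow or structure-tensor field.
    """
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out
```

In a video setting, the key extension described in the abstract is steering the orientation of each pixel's kernel along a flow field ("Gabor flow") derived from optical flow, so that the filter responses stay coherent from frame to frame.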