Illumination analysis for synthetic/natural hybrid image sequence generation
J. Stauder
Proceedings. Computer Graphics International (Cat. No.98EX149), published 1998-06-22
DOI: 10.1109/CGI.1998.694303 (https://doi.org/10.1109/CGI.1998.694303)
Citations: 4
Abstract
In virtual studio TV production, in multi-site videoconference applications using a virtual meeting room, and in synthetic/natural hybrid coding according to the forthcoming ISO/MPEG-4 standard, a synthetic and a natural scene are mixed to generate a synthetic/natural hybrid image sequence. For realism, the illumination in both scenes should be identical. Here, the illumination of the natural scene is estimated automatically and applied to the synthetic scene. For illumination estimation, moving real objects are segmented in the natural image sequence and modeled in 3D. The 3D shape, the 3D motion, and the displaced frame difference between two successive images are evaluated to estimate three illumination parameters. The parameters describe a distant point light source and ambient light. Using the estimated illumination parameters, the synthetic scene is rendered and mixed into the natural image sequence. Experimental results with a moving virtual ball mixed automatically into a real video telephone sequence show that virtual objects appear natural, with the same shading and shadows as the real person.
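To illustrate the kind of estimation the abstract describes, the sketch below fits an ambient term and a distant-light intensity to observed pixel intensities under a Lambertian shading model, given surface normals from a 3D object model and a candidate light direction. This is a minimal linear least-squares illustration, not the paper's actual algorithm (which additionally exploits 3D motion and displaced frame differences); the function name and the two-parameter formulation are assumptions for demonstration.

```python
import math

def estimate_lambert_params(normals, intensities, light_dir):
    """Fit I = ambient + k_dir * max(0, n . L) by linear least squares.

    Illustrative sketch only (hypothetical helper, not the paper's method):
      normals     -- unit surface normals of the modeled object, as 3-tuples
      intensities -- observed pixel intensities at those surface points
      light_dir   -- assumed unit direction toward the distant light source
    Returns (ambient, k_dir): ambient light level and directional intensity.
    """
    # Lambertian shading term for each sample: clamp n . L at zero
    # (surface points facing away from the light receive only ambient light).
    xs = [max(0.0, sum(n[i] * light_dir[i] for i in range(3))) for n in normals]
    n_samples = len(xs)
    sx = sum(xs)
    sy = sum(intensities)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, intensities))
    # Ordinary least squares for the line I = ambient + k_dir * x.
    denom = n_samples * sxx - sx * sx
    k_dir = (n_samples * sxy - sx * sy) / denom
    ambient = (sy - k_dir * sx) / n_samples
    return ambient, k_dir

if __name__ == "__main__":
    # Synthetic check: normals lit from +z with ambient 0.2 and k_dir 0.8.
    normals = [(0.0, 0.0, 1.0), (1.0, 0.0, 0.0), (0.6, 0.0, 0.8)]
    intensities = [1.0, 0.2, 0.84]
    a, k = estimate_lambert_params(normals, intensities, (0.0, 0.0, 1.0))
    print(round(a, 6), round(k, 6))  # recovers 0.2 and 0.8
```

In the paper's setting, a third parameter (the light direction itself) is also estimated; extending the sketch to that case would make the fit nonlinear, which is one reason the shape and motion cues mentioned in the abstract matter.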