{"title":"Joint spatial-temporal alignment of networked cameras","authors":"Chia-Yeh Lee, Tsuhan Chen, Ming-Yu Shih, Shiaw-Shian Yu","doi":"10.1109/ICDSC.2009.5289361","DOIUrl":null,"url":null,"abstract":"In this paper, we propose a method for aligning networked cameras spatially and temporally. Synchronizing video sequences and recovering spatial information among cameras are crucial steps for applications such as robust tracking and video mosaic. Without prior knowledge of internal and external parameters of cameras, we attempt to automatically estimate their spatial relationship and time offset which is possibly caused by network transmission delay. Our main focus is on cameras with overlapping field of views. Exploiting the fact that spatial and temporal information are related, we use one to boost the other. Initially assuming no time delay, the homography between cameras can be estimated by motion detection. Based on the homography, time difference can thus be recovered by analyzing activities in overlapping regions. We iteratively use spatial and temporal information to boost each other till reaching converging criteria. The algorithm can be extend to finding spatial and temporal relationship in multiple cameras. The experiment is performed in an outdoor parking lot and it is showed that our algorithm can successfully align cameras both in space and time.","PeriodicalId":324810,"journal":{"name":"2009 Third ACM/IEEE International Conference on Distributed Smart Cameras (ICDSC)","volume":"84 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2009-10-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2009 Third ACM/IEEE International Conference on Distributed Smart Cameras (ICDSC)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICDSC.2009.5289361","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0
Abstract
In this paper, we propose a method for aligning networked cameras spatially and temporally. Synchronizing video sequences and recovering spatial information among cameras are crucial steps for applications such as robust tracking and video mosaicking. Without prior knowledge of the cameras' internal and external parameters, we attempt to automatically estimate their spatial relationship and the time offset, which may be caused by network transmission delay. Our main focus is on cameras with overlapping fields of view. Exploiting the fact that spatial and temporal information are related, we use each to boost the other. Initially assuming no time delay, the homography between cameras is estimated from motion detection. Based on this homography, the time difference can then be recovered by analyzing activities in the overlapping regions. We iteratively use spatial and temporal information to refine each other until convergence criteria are met. The algorithm can be extended to finding spatial and temporal relationships among multiple cameras. Experiments performed in an outdoor parking lot show that our algorithm can successfully align cameras in both space and time.
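To make the alternating scheme in the abstract concrete, the following is a minimal Python sketch of the iterative spatial-temporal alignment loop: estimate a homography from motion-detection correspondences, recover the time offset from activity signals in the overlapping region, and repeat until the offset stabilizes. The function names, the activity-signal representation, and the convergence test are assumptions for illustration, not the authors' implementation.

```python
import numpy as np
import cv2


def estimate_homography(pts_a, pts_b):
    """Estimate a homography from corresponding motion-detection points (Nx2 arrays)."""
    H, _ = cv2.findHomography(pts_a, pts_b, cv2.RANSAC, 3.0)
    return H


def estimate_time_offset(activity_a, activity_b, max_offset):
    """Recover the frame offset that best aligns two 1-D activity signals
    (e.g. amount of foreground motion inside the overlapping region per frame)
    by maximizing normalized cross-correlation over candidate offsets."""
    best_offset, best_score = 0, -np.inf
    for d in range(-max_offset, max_offset + 1):
        if d >= 0:
            a, b = activity_a[d:], activity_b[:len(activity_b) - d]
        else:
            a, b = activity_a[:len(activity_a) + d], activity_b[-d:]
        n = min(len(a), len(b))
        if n < 2:
            continue
        a, b = a[:n], b[:n]
        denom = np.linalg.norm(a) * np.linalg.norm(b)
        score = float(np.dot(a, b) / denom) if denom > 0 else 0.0
        if score > best_score:
            best_offset, best_score = d, score
    return best_offset


def align_cameras(get_motion_points, get_activity, max_offset=50, max_iters=10):
    """Alternate between spatial (homography) and temporal (offset) estimation.

    get_motion_points(offset) -> (pts_a, pts_b): corresponding motion points
        extracted from the two videos after shifting camera B by `offset` frames.
    get_activity(H) -> (activity_a, activity_b): per-frame activity signals
        measured inside the overlapping region implied by homography H.
    Both callbacks are hypothetical hooks standing in for the motion-detection
    and overlap-analysis stages described in the abstract.
    """
    offset = 0                      # initial assumption: no time delay
    H = None
    for _ in range(max_iters):
        pts_a, pts_b = get_motion_points(offset)
        H = estimate_homography(pts_a, pts_b)
        activity_a, activity_b = get_activity(H)
        new_offset = estimate_time_offset(activity_a, activity_b, max_offset)
        if new_offset == offset:    # converged: the time offset no longer changes
            break
        offset = new_offset
    return H, offset
```

The alternation works because each estimate conditions the other: a better homography yields a more reliable overlapping region (and hence cleaner activity signals for temporal matching), while a better time offset yields correctly paired motion observations for homography estimation.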