2011 Fifth ACM/IEEE International Conference on Distributed Smart Cameras: Latest Publications

End-user viewpoint control of live video from a medical camera array
2011 Fifth ACM/IEEE International Conference on Distributed Smart Cameras Pub Date: 2011-10-13 DOI: 10.1109/ICDSC.2011.6042917
Jeffrey R. Blum, Haijian Sun, Adriana Olmos, J. Cooperstock
{"title":"End-user viewpoint control of live video from a medical camera array","authors":"Jeffrey R. Blum, Haijian Sun, Adriana Olmos, J. Cooperstock","doi":"10.1109/ICDSC.2011.6042917","DOIUrl":"https://doi.org/10.1109/ICDSC.2011.6042917","url":null,"abstract":"The design and implementation of a camera array for real-time streaming of medical video across a high speed research network is described. Live video output from the array, composed of 17 Gigabit Ethernet cameras, must be delivered in low-latency, simultaneously, to many students at geographically disparate locations. The students require dynamic control over their individual viewpoints not only from physical camera positions, but potentially, of a real-time interpolated view. The technology used to implement the system, the rationale for its selection, scalability issues, and potential future improvements, such as recording and offline playback, are discussed.","PeriodicalId":385052,"journal":{"name":"2011 Fifth ACM/IEEE International Conference on Distributed Smart Cameras","volume":"71 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2011-10-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115393221","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Demo: Distributed video coding applications in wireless multimedia sensor networks
2011 Fifth ACM/IEEE International Conference on Distributed Smart Cameras Pub Date: 2011-10-13 DOI: 10.1109/ICDSC.2011.6042958
M. Jacobs, N. Deligiannis, Frederik Verbist, Jürgen Slowack, J. Barbarien, R. Walle, P. Schelkens, A. Munteanu
{"title":"Demo: Distributed video coding applications in wireless multimedia sensor networks","authors":"M. Jacobs, N. Deligiannis, Frederik Verbist, Jürgen Slowack, J. Barbarien, R. Walle, P. Schelkens, A. Munteanu","doi":"10.1109/ICDSC.2011.6042958","DOIUrl":"https://doi.org/10.1109/ICDSC.2011.6042958","url":null,"abstract":"Novel distributed video coding (DVC) architectures developed by the IBBT DVC group realize state-of-the-art video coding efficiency under stringent energy restrictions, while supporting error-resilience and scalability. Therefore, these architectures are particularly attractive for application scenarios involving low-complexity energy-constrained wireless visual sensors. This demo presents the scenarios, which are considered to be the most promising areas of integration for IBBT's DVC systems, considering feasibility and commercial applicability.","PeriodicalId":385052,"journal":{"name":"2011 Fifth ACM/IEEE International Conference on Distributed Smart Cameras","volume":"64 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2011-10-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115725370","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 3
Tracking across multiple cameras with overlapping views based on brightness and tangent transfer functions
2011 Fifth ACM/IEEE International Conference on Distributed Smart Cameras Pub Date: 2011-10-13 DOI: 10.1109/ICDSC.2011.6042911
Chun-Te Chu, Jenq-Neng Hwang, Kung-Ming Lan, Shen-Zheng Wang
{"title":"Tracking across multiple cameras with overlapping views based on brightness and tangent transfer functions","authors":"Chun-Te Chu, Jenq-Neng Hwang, Kung-Ming Lan, Shen-Zheng Wang","doi":"10.1109/ICDSC.2011.6042911","DOIUrl":"https://doi.org/10.1109/ICDSC.2011.6042911","url":null,"abstract":"The appearance of one object may be seen differently from distinct cameras with overlapping views due to the color deviation and perspective difference. In this paper, we study these problems and propose an appearance modeling technique in order to perform the tracking across the multiple cameras. For single camera tracking, an effective integrated Kalman filter and multiple kernels tracking scheme is adopted. When maneuvering the tracking across multiple cameras, we build the brightness transfer functions (BTFs) to compensate the color difference between camera views. The BTF is constructed from the overlapping area during tracking by employing robust principal component analysis (RPCA). Moreover, the perspective difference can also be compensated by applying the tangent transfer functions (TTFs) derived by the homography between two cameras. We evaluate the proposed method using several real-scenario videos and obtain the promising results.","PeriodicalId":385052,"journal":{"name":"2011 Fifth ACM/IEEE International Conference on Distributed Smart Cameras","volume":"28 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2011-10-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"117234122","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 25
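A brightness transfer function like the one used in the entry above can be pictured with the classic cumulative-histogram formulation: gray levels from one view are mapped to gray levels of the other so that their cumulative distributions match. The sketch below is a minimal NumPy version of that idea, under the assumption that corresponding patches of the same object are available from the overlapping area; it is not the RPCA-based construction actually used in the paper, and the function names are illustrative.

```python
import numpy as np

def brightness_transfer_function(patch_a, patch_b, levels=256):
    """Estimate a BTF mapping gray levels of camera A to camera B.

    patch_a, patch_b: uint8 arrays showing the same object/region in the
    two overlapping views. The mapping f is chosen so that the cumulative
    histograms match: CDF_A(a) ~= CDF_B(f(a)).
    """
    hist_a = np.bincount(patch_a.ravel(), minlength=levels).astype(float)
    hist_b = np.bincount(patch_b.ravel(), minlength=levels).astype(float)
    cdf_a = np.cumsum(hist_a) / hist_a.sum()
    cdf_b = np.cumsum(hist_b) / hist_b.sum()
    # For each gray level in A, pick the level in B with the closest CDF value.
    btf = np.searchsorted(cdf_b, cdf_a, side="left").clip(0, levels - 1)
    return btf.astype(np.uint8)

def apply_btf(image_a, btf):
    """Re-map an image from camera A into camera B's brightness space."""
    return btf[image_a]
```

The remapped appearance can then be compared directly against appearance models built in the other camera's view.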
PhD forum: Investigating the performance of a multi-modal approach to unusual event detection
2011 Fifth ACM/IEEE International Conference on Distributed Smart Cameras Pub Date: 2011-10-13 DOI: 10.1109/ICDSC.2011.6042954
J. Kuklyte, Philip Kelly, N. O’Connor
{"title":"PhD forum: Investigating the performance of a multi-modal approach to unusual event detection","authors":"J. Kuklyte, Philip Kelly, N. O’Connor","doi":"10.1109/ICDSC.2011.6042954","DOIUrl":"https://doi.org/10.1109/ICDSC.2011.6042954","url":null,"abstract":"In this paper, we investigate the parameters underpinning our previously presented system for detecting unusual events in surveillance applications [1]. The system identifies anomalous events using an unsupervised data-driven approach. During a training period, typical activities within a surveilled environment are modeled using multi-modal sensor readings. Significant deviations from the established model of regular activity can then be flagged as anomalous at run-time. Using this approach, the system can be deployed and automatically adapt for use in any environment without any manual adjustment. Experiments carried out on two days of audio-visual data were performed and evaluated using a manually annotated ground-truth. We investigate sensor fusion and quantitatively evaluate the performance gains over single modality models. We also investigate different formulations of our cluster-based model of usual scenes as well as the impact of dynamic thresholding on identifying anomalous events. Experimental results are promising, even when modeling is performed using very simple audio and visual features.","PeriodicalId":385052,"journal":{"name":"2011 Fifth ACM/IEEE International Conference on Distributed Smart Cameras","volume":"39 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2011-10-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127173959","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 4
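The cluster-based model of usual scenes and the dynamic thresholding investigated above can be illustrated with a small sketch: k-means clusters of concatenated audio-visual feature vectors model typical activity, and at run time a frame is flagged when its distance to the nearest centroid exceeds an adaptive threshold derived from recent distances. This is a generic reconstruction of the idea, not the authors' implementation; the feature extraction, cluster count and threshold parameters are placeholders.

```python
import numpy as np
from sklearn.cluster import KMeans

class UsualSceneModel:
    """Cluster-based model of usual activity with a dynamic threshold."""

    def __init__(self, n_clusters=8, k_sigma=3.0, window=500):
        self.kmeans = KMeans(n_clusters=n_clusters, n_init=10, random_state=0)
        self.k_sigma = k_sigma   # how many std-devs above the mean counts as unusual
        self.window = window     # length of the sliding window of recent distances
        self.recent = []

    def fit(self, train_features):
        """train_features: (N, D) array of concatenated audio + visual features."""
        self.kmeans.fit(train_features)

    def is_anomalous(self, feature):
        """Flag one frame-level feature vector at run time."""
        dists = np.linalg.norm(self.kmeans.cluster_centers_ - feature, axis=1)
        dist = dists.min()
        self.recent = (self.recent + [dist])[-self.window:]
        # Dynamic threshold adapts to the recent distribution of distances.
        threshold = np.mean(self.recent) + self.k_sigma * np.std(self.recent)
        return dist > threshold, dist
```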
Multi-camera tracking by joint calibration, association and fusion
2011 Fifth ACM/IEEE International Conference on Distributed Smart Cameras Pub Date: 2011-10-13 DOI: 10.1109/ICDSC.2011.6042913
Siyue Chen, H. Leung
{"title":"Multi-camera tracking by joint calibration, association and fusion","authors":"Siyue Chen, H. Leung","doi":"10.1109/ICDSC.2011.6042913","DOIUrl":"https://doi.org/10.1109/ICDSC.2011.6042913","url":null,"abstract":"To perform surveillance using multiple cameras, camera calibration, measurement-to-object association, fusion of measurements from multiple cameras are three essential components. While these three issues are usually addressed separately, they actually have mutual effects on each other. For example, calibration requires correctly associated objects and measurements with calibration errors will result in wrong associations. In this paper, we present a novel joint calibration, association and fusion approach for multi-camera tracking. More specifically, the expectation-maximization (EM) algorithm is incorporated with the extended Kalman filter (EKF) to give a simultaneous estimate of object states, calibration and association parameters. The real video data collected from two cameras are used to evaluate the tracking performance of the proposed method. Compared to the conventional methods, which perform calibration, association and fusion separately, it is shown that the proposed method can significantly improve the robustness and the accuracy of multi-object tracking.","PeriodicalId":385052,"journal":{"name":"2011 Fifth ACM/IEEE International Conference on Distributed Smart Cameras","volume":"23 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2011-10-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127372670","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 3
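The joint estimation idea above can be pictured with a deliberately simplified sketch: an E-step computes soft responsibilities of each camera measurement for each track from Gaussian likelihoods around the predicted positions, and the update step then uses a responsibility-weighted measurement. This is a toy linear Kalman filter with soft association, not the authors' joint EM-EKF (which additionally estimates calibration parameters); the motion model and noise values below are assumptions.

```python
import numpy as np

DT = 1.0
F = np.block([[np.eye(2), DT * np.eye(2)],
              [np.zeros((2, 2)), np.eye(2)]])        # constant-velocity motion model
H = np.hstack([np.eye(2), np.zeros((2, 2))])         # cameras observe position only
Q = 0.01 * np.eye(4)                                 # process noise (assumed)
R = 0.5 * np.eye(2)                                  # measurement noise (assumed)

def predict(x, P):
    return F @ x, F @ P @ F.T + Q

def responsibilities(tracks, measurements):
    """E-step: soft assignment weight of each measurement to each track."""
    resp = np.zeros((len(measurements), len(tracks)))
    for j, (x, P) in enumerate(tracks):
        S_inv = np.linalg.inv(H @ P @ H.T + R)       # inverse innovation covariance
        for i, z in enumerate(measurements):
            nu = z - H @ x
            resp[i, j] = np.exp(-0.5 * nu @ S_inv @ nu)
    return resp / (resp.sum(axis=1, keepdims=True) + 1e-12)

def update(x, P, measurements, resp_col):
    """M-like step: Kalman update with a responsibility-weighted measurement."""
    w = resp_col / (resp_col.sum() + 1e-12)
    z_bar = sum(wi * z for wi, z in zip(w, measurements))
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    return x + K @ (z_bar - H @ x), (np.eye(4) - K @ H) @ P
```

One filtering cycle would predict every track, pool the measurements from both cameras, compute the responsibilities, and update each track with its own responsibility column.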
Wi-FLIP: A wireless smart camera based on a focal-plane low-power image processor
2011 Fifth ACM/IEEE International Conference on Distributed Smart Cameras Pub Date: 2011-10-13 DOI: 10.1109/ICDSC.2011.6042916
J. Fernández-Berni, R. Carmona-Galán, G. Cembrano, Á. Zarándy, Á. Rodríguez-Vázquez
{"title":"Wi-FLIP: A wireless smart camera based on a focal-plane low-power image processor","authors":"J. Fernández-Berni, R. Carmona-Galán, G. Cembrano, Á. Zarándy, Á. Rodríguez-Vázquez","doi":"10.1109/ICDSC.2011.6042916","DOIUrl":"https://doi.org/10.1109/ICDSC.2011.6042916","url":null,"abstract":"This paper presents Wi-FLIP, a vision-enabled WSN node resulting from the integration of FLIP-Q, a prototype vision chip, and Imotel, a commercial WSN platform. In Wi-FLIP, image processing is not only constrained to the digital domain like in conventional architectures. Instead, its image sensor — the FLIP-Q prototype — incorporates pixel-level processing elements (PEs) implemented by analog circuitry. These PEs are interconnected, rendering a massively parallel SIMD-based focal-plane array. Low-level image processing tasks fit very well into this processing scheme. They feature a heavy computational load composed of pixel-wise repetitive operations which can be realized in parallel with moderate accuracy. In such circumstances, analog circuitry, not very precise but faster and more area- and power-efficient than its digital counterpart, has been extensively reported to achieve better performance. The Wi-FLIP's image sensor does not therefore output raw but pre-processed images that make the subsequent digital processing much lighter. The energy cost of such pre-processing is really low — 5.6mW for the worst-case scenario. As a result, for the configuration where the Imote2's processor works at minimum clock frequency, the maximum power consumed by our prototype represents only the 5.2% of the whole system power consumption. This percentage gets even lower as the clock frequency increases. We report experimental results for different algorithms, image resolutions and clock frequencies. The main drawback of this first version of Wi-FLIP is the low frame rate reachable due to the non-standard GPIO-based FLIPQ-to-Imote2 interface.","PeriodicalId":385052,"journal":{"name":"2011 Fifth ACM/IEEE International Conference on Distributed Smart Cameras","volume":"44 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2011-10-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121999277","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 16
Human detection using mobile embedded smart cameras
2011 Fifth ACM/IEEE International Conference on Distributed Smart Cameras Pub Date: 2011-10-13 DOI: 10.1109/ICDSC.2011.6042924
Li He, Youlu Wang, Senem Velipasalar, M. C. Gursoy
{"title":"Human detection using mobile embedded smart cameras","authors":"Li He, Youlu Wang, Senem Velipasalar, M. C. Gursoy","doi":"10.1109/ICDSC.2011.6042924","DOIUrl":"https://doi.org/10.1109/ICDSC.2011.6042924","url":null,"abstract":"Embedded smart cameras are stand-alone units that combine sensing and processing on a single embedded platform. They allow flexibility in camera placement, and provide mobility without being dependent on wired links. On the other hand, detecting and tracking objects from videos captured by mobile cameras is a very challenging task even on powerful computers. Since embedded smart cameras have very limited processing power, memory and energy, the challenge becomes even bigger. In this paper, we present a person detection system using an embedded smart camera mounted on a remote-controlled car. We employ histogram of oriented gradients (HOG) for detection, and present the performance results obtained on these resource-constrained environments. The example application is patrolling hallways in a building to detect people.","PeriodicalId":385052,"journal":{"name":"2011 Fifth ACM/IEEE International Conference on Distributed Smart Cameras","volume":"7 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2011-10-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129047482","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 13
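For readers unfamiliar with HOG-based person detection, the pipeline in the entry above is of the same kind that OpenCV exposes through its built-in pedestrian detector. The sketch below runs that stock detector on a desktop as a minimal illustration; it says nothing about the embedded implementation or the parameter choices in the paper, and the video file name is a placeholder.

```python
import cv2

# Stock OpenCV HOG descriptor with the default pre-trained people detector.
hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

cap = cv2.VideoCapture("hallway_patrol.avi")          # placeholder input video
while True:
    ok, frame = cap.read()
    if not ok:
        break
    frame = cv2.resize(frame, (640, 480))             # smaller frames detect faster
    # Sliding-window HOG detection over an image pyramid.
    rects, _weights = hog.detectMultiScale(frame, winStride=(8, 8),
                                           padding=(8, 8), scale=1.05)
    for (x, y, w, h) in rects:
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("people", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```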
Identification of intruders in groups of people using cameras and RFIDs
2011 Fifth ACM/IEEE International Conference on Distributed Smart Cameras Pub Date: 2011-10-13 DOI: 10.1109/ICDSC.2011.6042909
R. Cucchiara, Michele Fornaciari, R. Haider, F. Mandreoli, A. Prati
{"title":"Identification of intruders in groups of people using cameras and RFIDs","authors":"R. Cucchiara, Michele Fornaciari, R. Haider, F. Mandreoli, A. Prati","doi":"10.1109/ICDSC.2011.6042909","DOIUrl":"https://doi.org/10.1109/ICDSC.2011.6042909","url":null,"abstract":"The identification of intruders in groups of people moving in wide open areas represents a challenging scenario where coordination between cameras can be certainly used but this solution is not enough. In this paper, we propose to go beyond pure vision-based approaches by integrating the use of distributed cameras with the RFID technology. To this end, we introduce a system that “maps” RFID tags to people detected by cameras by using sophisticated techniques to filter the singular modalities and an evidential fusion architecture, based on Transferable Belief Model, to combine the two sources of information and manage conflict between them. The conducted experimental evaluation shows very promising results, especially in treating groups of people.","PeriodicalId":385052,"journal":{"name":"2011 Fifth ACM/IEEE International Conference on Distributed Smart Cameras","volume":"263 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2011-10-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130518351","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 4
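The Transferable Belief Model mentioned above combines evidence with the unnormalized conjunctive rule, which, unlike classic Dempster normalization, keeps the conflicting mass on the empty set as an explicit measure of disagreement between sources. Below is a minimal sketch of that rule applied to a toy identity-assignment question (camera appearance evidence versus RFID evidence over candidate tag IDs); the frame of discernment and the mass values are invented for illustration and are not taken from the paper.

```python
from itertools import product

def conjunctive_combination(m1, m2):
    """Unnormalized conjunctive rule of the Transferable Belief Model.

    m1, m2: dicts mapping focal sets (frozensets of hypotheses) to masses.
    The mass left on frozenset() measures the conflict between the sources.
    """
    combined = {}
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        focal = a & b
        combined[focal] = combined.get(focal, 0.0) + wa * wb
    return combined

# Toy example: which RFID tag does a tracked person correspond to?
TAG1, TAG2 = "tag1", "tag2"
THETA = frozenset({TAG1, TAG2})                       # frame of discernment

m_camera = {frozenset({TAG1}): 0.6, THETA: 0.4}       # appearance slightly favors tag1
m_rfid = {frozenset({TAG2}): 0.7, THETA: 0.3}         # signal strength favors tag2

fused = conjunctive_combination(m_camera, m_rfid)
# fused[frozenset()] == 0.42: the conflict between camera and RFID evidence;
# fused[{tag1}] == 0.18, fused[{tag2}] == 0.28, fused[THETA] == 0.12.
```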
PhD forum: Multi-view occupancy maps using a network of low resolution visual sensors
2011 Fifth ACM/IEEE International Conference on Distributed Smart Cameras Pub Date: 2011-10-13 DOI: 10.1109/ICDSC.2011.6042951
Sebastian Gruenwedel, Vedran Jelaca, P. V. Hese, R. Kleihorst, W. Philips
{"title":"PhD forum: Multi-view occupancy maps using a network of low resolution visual sensors","authors":"Sebastian Gruenwedel, Vedran Jelaca, P. V. Hese, R. Kleihorst, W. Philips","doi":"10.1109/ICDSC.2011.6042951","DOIUrl":"https://doi.org/10.1109/ICDSC.2011.6042951","url":null,"abstract":"An occupancy map provides an abstract top view of a scene and can be used for many applications such as domotics, surveillance, elderly-care and video teleconferencing. Such maps can be accurately estimated from multiple camera views. However, using a network of regular high resolution cameras makes the system expensive, and quickly raises privacy concerns (e.g. in elderly homes). Furthermore, their power consumption makes battery operation difficult. A solution could be the use of a network of low resolution visual sensors, but their limited resolution could degrade the accuracy of the maps. In this paper we used simulations to determine the minimum required resolution needed for deriving accurate occupancy maps which were then used to track people. Multi-view occupancy maps were computed from foreground silhouettes derived via an analysis of moving edges. Ground occupancies computed from each view were fused in a Dempster-Shafer framework. Tracking was done via a Bayes filter using the occupancy map per time instance as measurement. We found that for a room of 8.8 by 9.2 m, 4 cameras with a resolution as low as 64 by 48 pixels was sufficient to estimate accurate occupancy maps and track up to 4 people. These findings indicate that it is possible to use low resolution visual sensors to build a cheap, power efficient and privacy-friendly system for occupancy monitoring.","PeriodicalId":385052,"journal":{"name":"2011 Fifth ACM/IEEE International Conference on Distributed Smart Cameras","volume":"25 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2011-10-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127783031","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 16
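The Dempster-Shafer fusion step above can be pictured per ground cell: each camera contributes a mass over {occupied, free, unknown} derived from its projected foreground silhouette, and the per-cell masses are combined across cameras. The sketch below uses Dempster's rule with the usual normalization as a generic illustration of that fusion step, not the authors' pipeline; how each camera turns silhouettes into masses is left out and assumed.

```python
import numpy as np

def combine_cell(m1, m2):
    """Dempster's rule for one ground cell over the frame {occupied, free}.

    Each mass is a tuple (m_occ, m_free, m_unknown) that sums to 1.
    """
    o1, f1, u1 = m1
    o2, f2, u2 = m2
    conflict = o1 * f2 + f1 * o2
    k = max(1.0 - conflict, 1e-9)            # normalization factor
    occ = (o1 * o2 + o1 * u2 + u1 * o2) / k
    free = (f1 * f2 + f1 * u2 + u1 * f2) / k
    unknown = (u1 * u2) / k
    return occ, free, unknown

def fuse_views(per_camera_masses):
    """Fuse a list of (H, W, 3) mass grids, one grid per camera, cell by cell."""
    fused = np.array(per_camera_masses[0], dtype=float)
    for grid in per_camera_masses[1:]:
        for idx in np.ndindex(fused.shape[:2]):
            fused[idx] = combine_cell(tuple(fused[idx]), tuple(grid[idx]))
    return fused

# A cell of the resulting occupancy map can be declared occupied when its
# fused occupancy mass dominates (e.g. fused[..., 0] > 0.5) before tracking.
```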
Distributed three-dimensional camera alignment in highly-dynamical prioritized observation areas
2011 Fifth ACM/IEEE International Conference on Distributed Smart Cameras Pub Date: 2011-10-13 DOI: 10.1109/ICDSC.2011.6042904
Uwe Jänen, Matthias Huy, Carsten Grenz, J. Hähner, Martin Hoffmann
{"title":"Distributed three-dimensional camera alignment in highly-dynamical prioritized observation areas","authors":"Uwe Jänen, Matthias Huy, Carsten Grenz, J. Hähner, Martin Hoffmann","doi":"10.1109/ICDSC.2011.6042904","DOIUrl":"https://doi.org/10.1109/ICDSC.2011.6042904","url":null,"abstract":"Video surveillance of large areas has become a necessary safety procedure. With an increase of these areas and an accompanying increasing number of cameras, manual configuration becomes infeasible. This paper describes a way to automate this task by distributing it among the smart camera nodes. The main objective is an overlap-free monitoring of the observation area, considering the distinct priority of an area element. Due to a time-dependable priority the observation quality will increase in contrast to static priorities. The cameras only use local knowledge and single-hop communication. Each camera optimizes its local field-of-view according to a specified quality function. It is shown that the developed algorithm scales well to large smart-camera systems.","PeriodicalId":385052,"journal":{"name":"2011 Fifth ACM/IEEE International Conference on Distributed Smart Cameras","volume":"14 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2011-10-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130216583","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 7
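As a rough illustration of the kind of local optimization described above, the sketch below lets each camera greedily pick, from a discrete set of candidate orientations, the field of view that maximizes the summed priority of cells not already claimed by neighboring cameras, which also keeps the coverage overlap-free. The grid, the candidate-FOV masks and the quality function are simplified assumptions; the paper's three-dimensional alignment and its actual quality function are more involved.

```python
import numpy as np

def choose_orientation(candidate_fovs, priority, claimed):
    """Pick the candidate field of view with the highest local quality.

    candidate_fovs: list of boolean (H, W) masks, one per pan/tilt/zoom setting.
    priority:       (H, W) array of time-dependent cell priorities.
    claimed:        boolean (H, W) mask of cells already covered by neighbors,
                    known from single-hop messages.
    Quality = summed priority of covered, unclaimed cells, so overlapping a
    neighbor contributes nothing and is implicitly avoided.
    """
    scores = [priority[fov & ~claimed].sum() for fov in candidate_fovs]
    best = int(np.argmax(scores))
    return best, candidate_fovs[best]

# Each node would rerun this whenever priorities change or a neighbor
# announces a new field of view, then broadcast its own choice one hop.
```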