Non-parametric Background Generation based on MRF Framework

Sang-Hyun Cho, Hang-Bong Kang
{"title":"Non-parametric Background Generation based on MRF Framework","authors":"Sang-Hyun Cho, Hang-Bong Kang","doi":"10.3745/KIPSTB.2010.17B.6.405","DOIUrl":null,"url":null,"abstract":"ABSTRACT Previous background generation techniques showed bad performance in complex environments since they used only temporal contexts. To overcome this problem, in this paper, we propose a new background generation method which incorporates spatial as well as temporal contexts of the image. This enabled us to obtain ‘clean’ background image with no moving objects. In our proposed method, first we divided the sampled frame into m*n blocks in the video sequence and classified each block as either static or non-static. For blocks which are classified as non-static, we used MRF framework to model them in temporal and spatial contexts. MRF framework provides a convenient and consistent way of modeling context-dependent entities such as image pixels and correlated features. Experimental results show that our proposed method is more efficient than the traditional one.Keywords:Background Generation, Background Model, Surveillance System, MRF Framework, Object Tracking 1. 서 론 1) 비디오 감시 시스템이나 모니터링 시스템과 같이 다양한 컴퓨터 비전 응용분야에서 물체 검출과 추적은 매우 중요한 요소로서 현재도 활발한 연구가 이루어지고 있는 분야이다. 많은 검출 및 추적 시스템에서 움직이는 물체가 포함되어 있지 않은 배경 영상은 물체 검출 및 추적을 위한 참조 정보로서 이용된다. 하지만 대부분의 경우, 이러한 배경 영상을 획득하는 것은 어려운 일이다.","PeriodicalId":122700,"journal":{"name":"The Kips Transactions:partb","volume":"9 2 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2010-12-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"The Kips Transactions:partb","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.3745/KIPSTB.2010.17B.6.405","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Previous background generation techniques have performed poorly in complex environments because they relied only on temporal contexts. To overcome this problem, we propose a new background generation method that incorporates spatial as well as temporal contexts of the image, which enables us to obtain a 'clean' background image containing no moving objects. In the proposed method, we first divide each sampled frame of the video sequence into m*n blocks and classify each block as either static or non-static. Blocks classified as non-static are modeled with an MRF framework over their temporal and spatial contexts. The MRF framework provides a convenient and consistent way of modeling context-dependent entities such as image pixels and correlated features. Experimental results show that the proposed method is more efficient than the traditional approach.

Keywords: Background Generation, Background Model, Surveillance System, MRF Framework, Object Tracking

1. Introduction

In various computer vision applications such as video surveillance and monitoring systems, object detection and tracking are essential components and remain an area of active research. Many detection and tracking systems use a background image that contains no moving objects as reference information for detecting and tracking objects. In most cases, however, obtaining such a background image is difficult.
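The block-wise pipeline described in the abstract can be made concrete with a short sketch. The Python snippet below is a minimal illustration, not the authors' implementation: it splits each sampled frame into fixed-size blocks, labels a block static when its temporal variance is low, and fills static blocks with their temporal mean. The block size, the variance threshold, and the per-pixel temporal median used for non-static blocks are illustrative assumptions; in the paper, non-static blocks are instead resolved with an MRF defined over temporal and spatial neighborhoods.

# Minimal sketch under the assumptions stated above; not the paper's implementation.
import numpy as np

def generate_background(frames, block=16, var_thresh=25.0):
    """frames: (T, H, W) grayscale frames sampled from the video sequence."""
    T, H, W = frames.shape
    bg = np.zeros((H, W), dtype=np.float32)
    for y in range(0, H, block):
        for x in range(0, W, block):
            patch = frames[:, y:y + block, x:x + block].astype(np.float32)
            # Temporal variance per pixel, averaged over the block.
            if patch.var(axis=0).mean() < var_thresh:
                # Static block: the temporal mean is already a clean estimate.
                bg[y:y + block, x:x + block] = patch.mean(axis=0)
            else:
                # Non-static block: the paper models these with an MRF over
                # temporal and spatial contexts; a per-pixel temporal median
                # stands in for that step here.
                bg[y:y + block, x:x + block] = np.median(patch, axis=0)
    return bg.astype(np.uint8)

# Toy usage with synthetic frames; real input would be frames sampled from video.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    frames = rng.integers(0, 256, size=(30, 240, 320), dtype=np.uint8)
    print(generate_background(frames).shape)  # (240, 320)

In the paper's formulation, the MRF step would assign labels to the pixels of non-static blocks by balancing a temporal data term against spatial consistency with neighboring blocks; the median above is only a placeholder for that inference.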