Spatio-Temporal Interest Points Chain (STIPC) for activity recognition

Fei Yuan, Gui-Song Xia, H. Sahbi, V. Prinet
DOI: 10.1109/ACPR.2011.6166581
The First Asian Conference on Pattern Recognition, 2011-11-01
Citations: 6

Abstract

We present a novel feature, named Spatio-Temporal Interest Points Chain (STIPC), for activity representation and recognition. This feature consists of a set of trackable spatio-temporal interest points that correspond to a series of discontinuous motions within the long-term motion of an object or its parts. With this chain feature, we not only capture the discriminative motion information that space-time interest point-like features aim to capture, but also build the connections between those points. Specifically, we first extract point trajectories from the image sequences, then partition the points on each trajectory into two distinct yet closely related kinds: discontinuous-motion points and continuous-motion points. We extract local space-time features around the discontinuous-motion points and use a chain model to represent them. Furthermore, we introduce a chain descriptor to encode the temporal relationships between these interdependent local space-time features. Experimental results on challenging datasets show that our STIPC feature improves on local space-time features and achieves state-of-the-art results.
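The partitioning step described above can be sketched as follows. This is a minimal illustration only, assuming a hypothetical criterion: a trajectory point is marked as a discontinuous-motion point when the turning angle between its incoming and outgoing displacement vectors exceeds a threshold. The abstract does not specify the paper's actual partitioning rule, so the function name, threshold, and angle test are all assumptions.

```python
import numpy as np

def partition_trajectory(points, angle_thresh_deg=45.0):
    """Split interior trajectory points into discontinuous- and
    continuous-motion points by thresholding the turning angle between
    successive displacement vectors (hypothetical criterion; the paper's
    exact rule is not given in the abstract)."""
    pts = np.asarray(points, dtype=float)
    v = np.diff(pts, axis=0)  # displacement vectors between consecutive points
    discontinuous, continuous = [], []
    for i in range(1, len(v)):
        a, b = v[i - 1], v[i]
        # angle between incoming and outgoing motion at point i
        cosang = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9)
        angle = np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))
        (discontinuous if angle > angle_thresh_deg else continuous).append(i)
    return discontinuous, continuous

# A trajectory moving right, then sharply turning upward: the corner
# point (index 2) is flagged as a discontinuous-motion point.
disc, cont = partition_trajectory([(0, 0), (1, 0), (2, 0), (2, 1), (2, 2)])
```

Local space-time features would then be extracted around the flagged points and linked, in temporal order, into the chain representation.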