Eliminating the Repetitive Motions as a Preprocessing step for Fast Human Action Retrieval

Mohsen Ramezani, F. Yaghmaee
DOI: 10.1109/ICCKE48569.2019.8965087
Published in: 2019 9th International Conference on Computer and Knowledge Engineering (ICCKE), pp. 26-31, October 2019.
Citations: 0

Abstract

Today, video search methods have fallen behind the rapid growth in the use of capture devices. Action retrieval is a new research field that seeks to use captured human actions for searching videos. Since most human actions consist of similar motions that are repeated over time, we propose a method for eliminating the repetitive motions before retrieving the videos. As a preprocessing step, this method can decrease the volume of the retrieval computations for each video. Here, a function is used to calculate a value for each pixel as its movement energy. Then, the Continuous Wavelet Transform (CWT) is used to map the response function of the points into frequency space, where similar motion patterns are easier to find. Dynamic Time Warping (DTW) is then applied in the new space to find similar frequency patterns (episodes) over time. Finally, one of the similar episodes, i.e. a sequence of frames, is kept for the retrieval computations and the others are eliminated. The proposed method is evaluated on the KTH, UCFYT, and HMDB datasets, and the results indicate its good performance. Eliminating the repetitive motions yields a significant reduction in retrieval computations and time.
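The pipeline described above (per-pixel movement energy, mapping to frequency space, DTW grouping of repeated episodes) can be sketched roughly as follows. This is a minimal illustration under simplifying assumptions, not the authors' implementation: it uses plain frame-difference magnitudes as the movement energy, skips the CWT stage and applies DTW directly to the raw energy signal, and the fixed episode length and similarity threshold are hypothetical parameters.

```python
import numpy as np

def frame_energy(frames):
    # Per-frame movement energy: sum of absolute pixel differences between
    # consecutive frames (a simple stand-in for the paper's energy function).
    diffs = np.abs(np.diff(frames.astype(np.float32), axis=0))
    return diffs.reshape(len(diffs), -1).sum(axis=1)

def dtw_distance(a, b):
    # Classic dynamic-time-warping distance between two 1-D sequences.
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

def drop_repeated_episodes(signal, episode_len, threshold):
    # Split the energy signal into fixed-length episodes and keep only one
    # representative of each group of DTW-similar episodes.
    episodes = [signal[i:i + episode_len]
                for i in range(0, len(signal) - episode_len + 1, episode_len)]
    kept = []
    for ep in episodes:
        if all(dtw_distance(ep, k) > threshold for k in kept):
            kept.append(ep)
    return kept
```

For a periodic action (e.g. waving), consecutive episodes of the energy signal have near-zero DTW distance, so only one representative survives and the remaining frames can be excluded from the retrieval computations.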