Real-Time Tracking of Basketball Trajectory Based on the Associative MCMC Model

Yong Gong, Gautam Srivastava
{"title":"基于关联 MCMC 模型的篮球轨迹实时跟踪","authors":"Yong Gong, Gautam Srivastava","doi":"10.1007/s11036-024-02358-0","DOIUrl":null,"url":null,"abstract":"<p>In basketball videos, the trajectories of a basketball changes rapidly. Since the visual features changes in a more homogeneous region, the frame difference method is a suitable basis for trajectory real-time tracking. However, traditional methods need a huge number of iterative calculations in a random image to find spatial feature differences to segment the basketball from to frame, resulting in tracking lag. Therefore, a real-time tracking method of basketball trajectory is designed based on an associative Markov Chain Monte Carlo (MCMC) model. From pixel illumination differences between two adjacent frames in basketball game videos, the basketball’s movement is determined, and the foreground and background of the basketball frame are separated. Then, coordinates of the basketball are detected by a Convolutional Neural Network (CNN), and the change of coordinates is used to construct a visual 2D mapping model, which calculates both angular and linear acceleration of the basketball. To solve the interaction problem of randomness and spatial variability, an associative MCMC model is designed to segment basketball images with simple conditions, and a Bayesian network is established to input parameters of the segmented basketball movement for the determination of trajectory deviation. Finally, basketball movement trends are calculated to achieve real-time tracking of the trajectory in the basketball video. The experimental results show that compared with the original running path, this method has the smallest difference in tracking trajectory error, and the estimation error does not exceed 0.2 when the false alarm rate is 100. The trajectory tracking time is always less than 2.2 seconds, indicating that it has good trajectory tracking ability.</p>","PeriodicalId":501103,"journal":{"name":"Mobile Networks and Applications","volume":"20 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-07-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Real-Time Tracking of Basketball Trajectory Based on the Associative MCMC Model\",\"authors\":\"Yong Gong, Gautam Srivastava\",\"doi\":\"10.1007/s11036-024-02358-0\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p>In basketball videos, the trajectories of a basketball changes rapidly. Since the visual features changes in a more homogeneous region, the frame difference method is a suitable basis for trajectory real-time tracking. However, traditional methods need a huge number of iterative calculations in a random image to find spatial feature differences to segment the basketball from to frame, resulting in tracking lag. Therefore, a real-time tracking method of basketball trajectory is designed based on an associative Markov Chain Monte Carlo (MCMC) model. From pixel illumination differences between two adjacent frames in basketball game videos, the basketball’s movement is determined, and the foreground and background of the basketball frame are separated. Then, coordinates of the basketball are detected by a Convolutional Neural Network (CNN), and the change of coordinates is used to construct a visual 2D mapping model, which calculates both angular and linear acceleration of the basketball. 
To solve the interaction problem of randomness and spatial variability, an associative MCMC model is designed to segment basketball images with simple conditions, and a Bayesian network is established to input parameters of the segmented basketball movement for the determination of trajectory deviation. Finally, basketball movement trends are calculated to achieve real-time tracking of the trajectory in the basketball video. The experimental results show that compared with the original running path, this method has the smallest difference in tracking trajectory error, and the estimation error does not exceed 0.2 when the false alarm rate is 100. The trajectory tracking time is always less than 2.2 seconds, indicating that it has good trajectory tracking ability.</p>\",\"PeriodicalId\":501103,\"journal\":{\"name\":\"Mobile Networks and Applications\",\"volume\":\"20 1\",\"pages\":\"\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2024-07-09\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Mobile Networks and Applications\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1007/s11036-024-02358-0\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Mobile Networks and Applications","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1007/s11036-024-02358-0","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0

Abstract

In basketball videos, the trajectory of a basketball changes rapidly. Since the visual features change within relatively homogeneous regions, the frame difference method is a suitable basis for real-time trajectory tracking. However, traditional methods require a huge number of iterative calculations on each image to find the spatial feature differences that segment the basketball from frame to frame, resulting in tracking lag. Therefore, a real-time basketball trajectory tracking method is designed based on an associative Markov Chain Monte Carlo (MCMC) model. From the pixel illumination differences between two adjacent frames of a basketball game video, the basketball's movement is determined, and the foreground and background of each frame are separated. Then, the coordinates of the basketball are detected by a Convolutional Neural Network (CNN), and the change of coordinates is used to construct a visual 2D mapping model that calculates both the angular and the linear acceleration of the basketball. To solve the interaction problem of randomness and spatial variability, an associative MCMC model is designed to segment basketball images under simple conditions, and a Bayesian network is established that takes the parameters of the segmented basketball movement as input to determine trajectory deviation. Finally, basketball movement trends are calculated to achieve real-time tracking of the trajectory in the basketball video. The experimental results show that, compared with the original running path, this method has the smallest tracking trajectory error, and the estimation error does not exceed 0.2 when the false alarm rate is 100. The trajectory tracking time is always less than 2.2 seconds, indicating good trajectory tracking ability.
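
As an illustration of the frame-difference and coordinate-change steps mentioned in the abstract, the following is a minimal sketch, not the authors' implementation: all function names, the intensity threshold, and the sample coordinates are assumptions made for this example. It computes a foreground mask from the absolute intensity difference of two adjacent grayscale frames and estimates linear acceleration as the second finite difference of detected ball coordinates; the angular acceleration, associative MCMC segmentation, and Bayesian network components are not sketched here.

```python
# Illustrative sketch only; threshold, frame sizes, and coordinates are assumptions.
import numpy as np


def foreground_mask(prev_gray: np.ndarray, curr_gray: np.ndarray,
                    threshold: float = 25.0) -> np.ndarray:
    """Boolean mask of pixels whose intensity changed between two adjacent frames."""
    diff = np.abs(curr_gray.astype(np.float32) - prev_gray.astype(np.float32))
    return diff > threshold


def linear_acceleration(positions: np.ndarray, dt: float) -> np.ndarray:
    """Second finite difference of (x, y) ball coordinates gives acceleration."""
    velocity = np.diff(positions, axis=0) / dt   # shape (N-1, 2)
    return np.diff(velocity, axis=0) / dt        # shape (N-2, 2)


if __name__ == "__main__":
    # Two synthetic 8-bit grayscale frames with a small moving bright blob.
    prev_frame = np.zeros((120, 160), dtype=np.uint8)
    curr_frame = np.zeros((120, 160), dtype=np.uint8)
    prev_frame[40:50, 60:70] = 255
    curr_frame[42:52, 64:74] = 255

    mask = foreground_mask(prev_frame, curr_frame)
    print("moving pixels:", int(mask.sum()))

    # Hypothetical ball coordinates detected over four frames at 30 fps.
    coords = np.array([[60.0, 40.0], [64.0, 42.0], [70.0, 46.0], [78.0, 52.0]])
    print("acceleration per frame:\n", linear_acceleration(coords, dt=1.0 / 30.0))
```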
