Video capture and post-processing technique for approximating 3D projectile trajectory

Chase M. Pfeifer, J. Burnfield, G. Cesar, Max H. Twedt, Jeff A. Hawks
DOI: 10.1080/19346182.2016.1248974
Journal: Sports Technology
Publication date: 2015-10-02
Citations: 5

Abstract

In this paper we introduce a low-cost procedure and methodology for markerless projectile tracking in three-dimensional (3D) space. Understanding the 3D trajectory of an object in flight can often be essential in examining variables relating to launch and landing conditions. Many systems exist to track the 3D motion of projectiles, but they are often constrained by space or by the type of object the system can recognize (Qualisys, Göteborg, Sweden; Vicon, Oxford, United Kingdom; OptiTrack, Corvallis, Oregon, USA; Motion Analysis, Santa Rosa, California, USA; FlightScope, Orlando, Florida, USA). These technologies can also be quite expensive, often costing hundreds of thousands of dollars. The system presented in this paper utilizes two high-definition video cameras oriented perpendicular to each other to record the flight of an object. A post-processing technique and a subsequent geometrically based algorithm were created to determine the 3D position of the object from the two videos. This procedure and methodology were validated against a gold-standard motion tracking system, resulting in a 4.5 ± 1.8% deviation from the gold standard.
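The abstract's core idea, combining two perpendicular camera views into a 3D position, can be illustrated with a minimal sketch. The paper's actual algorithm is not reproduced here; this is a hypothetical scaled-orthographic approximation in which camera A faces the X-Z plane (looking along +Y) and camera B faces the Y-Z plane (looking along +X), with per-camera metre-per-pixel scales assumed to come from a prior calibration. All function names and parameter values are illustrative, not from the paper.

```python
# Hypothetical sketch: fuse two perpendicular camera views into a 3D
# position under a scaled-orthographic assumption. Camera A images the
# X-Z plane; camera B images the Y-Z plane. Scales (metres per pixel)
# and the image origin are assumed calibrated beforehand.

def pixel_to_plane(u, v, scale, origin_u, origin_v):
    """Map an image pixel (u, v) to metric (horizontal, vertical)
    coordinates in that camera's viewing plane. Image v grows
    downward, so vertical is flipped."""
    return (u - origin_u) * scale, (origin_v - v) * scale

def reconstruct_3d(px_a, px_b, scale_a=0.01, scale_b=0.01,
                   origin=(640, 360)):
    """Combine simultaneous detections from the two cameras.

    px_a, px_b: (u, v) pixel coordinates of the projectile in
    camera A and camera B for the same frame."""
    # Camera A supplies x and one estimate of height z.
    x, z_a = pixel_to_plane(px_a[0], px_a[1], scale_a, *origin)
    # Camera B supplies y and a second, independent estimate of z.
    y, z_b = pixel_to_plane(px_b[0], px_b[1], scale_b, *origin)
    # Both cameras see the same height; average the two estimates.
    return x, y, (z_a + z_b) / 2.0

pos = reconstruct_3d((700, 300), (650, 310))
print(pos)
```

A full treatment would replace the orthographic scaling with a pinhole projection (angle per pixel plus camera-to-plane distance), since a projectile moving toward or away from a camera changes its apparent scale; the averaging of the two height estimates also gives a simple per-frame consistency check.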