Searching Bug Instances in Gameplay Video Repositories

Impact factor: 1.7 | JCR Q3 (Computer Science, Artificial Intelligence) | CAS Zone 4 (Computer Science)
Mohammad Reza Taesiri;Finlay Macklon;Sarra Habchi;Cor-Paul Bezemer
{"title":"Searching Bug Instances in Gameplay Video Repositories","authors":"Mohammad Reza Taesiri;Finlay Macklon;Sarra Habchi;Cor-Paul Bezemer","doi":"10.1109/TG.2024.3355285","DOIUrl":null,"url":null,"abstract":"Gameplay videos offer valuable insights into player interactions and game responses, particularly data about game bugs. Despite the abundance of gameplay videos online, extracting useful information remains a challenge. This article introduces a method for searching and extracting relevant videos from extensive video repositories using English text queries. Our approach requires no external information, like video metadata; it solely depends on video content. Leveraging the zero-shot transfer capabilities of the contrastive language–image pretraining model, our approach does not require any data labeling or training. To evaluate our approach, we present the \n<monospace>GamePhysics</monospace>\n dataset, comprising 26 954 videos from 1873 games that were collected from the \n<uri>GamePhysics</uri>\n section on the Reddit website. Our approach shows promising results in our extensive analysis of simple and compound queries, indicating that our method is useful for detecting objects and events in gameplay videos. Moreover, we assess the effectiveness of our method by analyzing a carefully annotated dataset of 220 gameplay videos. The results of our study demonstrate the potential of our approach for applications, such as the creation of a video search tool tailored to identifying video game bugs, which could greatly benefit quality assurance teams in finding and reproducing bugs.","PeriodicalId":55977,"journal":{"name":"IEEE Transactions on Games","volume":"16 3","pages":"697-710"},"PeriodicalIF":1.7000,"publicationDate":"2024-01-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Transactions on Games","FirstCategoryId":"94","ListUrlMain":"https://ieeexplore.ieee.org/document/10402100/","RegionNum":4,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0

Abstract

Gameplay videos offer valuable insights into player interactions and game responses, particularly data about game bugs. Despite the abundance of gameplay videos online, extracting useful information remains a challenge. This article introduces a method for searching and extracting relevant videos from extensive video repositories using English text queries. Our approach requires no external information, like video metadata; it solely depends on video content. Leveraging the zero-shot transfer capabilities of the contrastive language–image pretraining model, our approach does not require any data labeling or training. To evaluate our approach, we present the GamePhysics dataset, comprising 26 954 videos from 1873 games that were collected from the GamePhysics section on the Reddit website. Our approach shows promising results in our extensive analysis of simple and compound queries, indicating that our method is useful for detecting objects and events in gameplay videos. Moreover, we assess the effectiveness of our method by analyzing a carefully annotated dataset of 220 gameplay videos. The results of our study demonstrate the potential of our approach for applications, such as the creation of a video search tool tailored to identifying video game bugs, which could greatly benefit quality assurance teams in finding and reproducing bugs.
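The abstract describes ranking videos in a repository against an English text query using CLIP's zero-shot image-text matching, with no labeling or training. The sketch below illustrates that general idea only, not the authors' exact pipeline: the checkpoint name ("openai/clip-vit-base-patch32"), the per-frame embedding, the max-pooling of frame-query similarities, and the example query are all assumptions made for illustration.

```python
# Minimal sketch of CLIP-based zero-shot retrieval over gameplay-video frames.
# Assumptions (not taken from the paper): the CLIP checkpoint, frame sampling
# done elsewhere, and max-pooled frame-query cosine similarity as the video score.
import torch
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")
model.eval()


@torch.no_grad()
def embed_frames(frames: list[Image.Image]) -> torch.Tensor:
    """Return L2-normalized CLIP image embeddings for a list of video frames."""
    inputs = processor(images=frames, return_tensors="pt")
    feats = model.get_image_features(**inputs)
    return feats / feats.norm(dim=-1, keepdim=True)


@torch.no_grad()
def embed_query(query: str) -> torch.Tensor:
    """Return the L2-normalized CLIP text embedding for an English query."""
    inputs = processor(text=[query], return_tensors="pt", padding=True)
    feats = model.get_text_features(**inputs)
    return feats / feats.norm(dim=-1, keepdim=True)


def video_score(frame_embeddings: torch.Tensor, query_embedding: torch.Tensor) -> float:
    """Score a video by its best-matching frame (cosine similarity, max-pooled)."""
    sims = frame_embeddings @ query_embedding.T  # shape: (num_frames, 1)
    return sims.max().item()


# Usage sketch: precompute frame embeddings per video, then rank the repository
# for a query; "index" maps video ids to their frame-embedding tensors.
# q = embed_query("a horse flying in the air")  # illustrative query
# scores = {vid: video_score(emb, q) for vid, emb in index.items()}
# top_videos = sorted(scores, key=scores.get, reverse=True)[:10]
```

Precomputing and caching the frame embeddings is what makes text queries over a large repository cheap: each new query only requires one text embedding and a similarity lookup against the stored index.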
Source journal: IEEE Transactions on Games (Engineering: Electrical and Electronic Engineering)
CiteScore: 4.60
Self-citation rate: 8.70%
Articles published per year: 87