Will This Bug-Fixing Change Break Regression Testing?

Xinye Tang, Song Wang, Ke Mao
{"title":"这个bug修复变更会破坏回归测试吗?","authors":"Xinye Tang, Song Wang, Ke Mao","doi":"10.1109/ESEM.2015.7321218","DOIUrl":null,"url":null,"abstract":"Context: Software source code is frequently changed for fixing revealed bugs. These bug-fixing changes might introduce unintended system behaviors, which are inconsistent with scenarios of existing regression test cases, and consequently break regression testing. For validating the quality of changes, regression testing is a required process before submitting changes during the development of software projects. Our pilot study shows that 48.7% bug-fixing changes might break regression testing at first run, which means developers have to run regression testing at least a couple of times for 48.7% changes. Such process can be tedious and time consuming. Thus, before running regression test suite, finding these changes and corresponding regression test cases could be helpful for developers to quickly fix these changes and improve the efficiency of regression testing. Goal: This paper proposes bug- fixing change impact prediction (BFCP), for predicting whether a bug-fixing change will break regression testing or not before running regression test cases, by mining software change histories. Method: Our approach employs the machine learning algorithms and static call graph analysis technique. Given a bug-fixing change, BFCP first predicts whether it will break existing regression test cases; second, if the change is predicted to break regression test cases, BFCP can further identify the might-be-broken test cases. Results: Results of experiments on 552 real bug-fixing changes from four large open source projects show that BFCP could achieve prediction precision up to 83.3%, recall up to 92.3%, and F-score up to 81.4%. For identifying the might-be-broken test cases, BFCP could achieve 100% recall.","PeriodicalId":258843,"journal":{"name":"2015 ACM/IEEE International Symposium on Empirical Software Engineering and Measurement (ESEM)","volume":"49 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2015-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"15","resultStr":"{\"title\":\"Will This Bug-Fixing Change Break Regression Testing?\",\"authors\":\"Xinye Tang, Song Wang, Ke Mao\",\"doi\":\"10.1109/ESEM.2015.7321218\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Context: Software source code is frequently changed for fixing revealed bugs. These bug-fixing changes might introduce unintended system behaviors, which are inconsistent with scenarios of existing regression test cases, and consequently break regression testing. For validating the quality of changes, regression testing is a required process before submitting changes during the development of software projects. Our pilot study shows that 48.7% bug-fixing changes might break regression testing at first run, which means developers have to run regression testing at least a couple of times for 48.7% changes. Such process can be tedious and time consuming. Thus, before running regression test suite, finding these changes and corresponding regression test cases could be helpful for developers to quickly fix these changes and improve the efficiency of regression testing. Goal: This paper proposes bug- fixing change impact prediction (BFCP), for predicting whether a bug-fixing change will break regression testing or not before running regression test cases, by mining software change histories. 
Method: Our approach employs the machine learning algorithms and static call graph analysis technique. Given a bug-fixing change, BFCP first predicts whether it will break existing regression test cases; second, if the change is predicted to break regression test cases, BFCP can further identify the might-be-broken test cases. Results: Results of experiments on 552 real bug-fixing changes from four large open source projects show that BFCP could achieve prediction precision up to 83.3%, recall up to 92.3%, and F-score up to 81.4%. For identifying the might-be-broken test cases, BFCP could achieve 100% recall.\",\"PeriodicalId\":258843,\"journal\":{\"name\":\"2015 ACM/IEEE International Symposium on Empirical Software Engineering and Measurement (ESEM)\",\"volume\":\"49 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2015-10-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"15\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2015 ACM/IEEE International Symposium on Empirical Software Engineering and Measurement (ESEM)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ESEM.2015.7321218\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2015 ACM/IEEE International Symposium on Empirical Software Engineering and Measurement (ESEM)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ESEM.2015.7321218","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 15

Abstract

Context: Software source code is frequently changed to fix revealed bugs. These bug-fixing changes might introduce unintended system behaviors that are inconsistent with the scenarios of existing regression test cases and consequently break regression testing. To validate the quality of changes, regression testing is a required step before submitting changes during the development of software projects. Our pilot study shows that 48.7% of bug-fixing changes might break regression testing on the first run, which means developers have to run regression testing at least a couple of times for those changes. Such a process can be tedious and time consuming. Thus, before running the regression test suite, finding these changes and the corresponding regression test cases can help developers quickly fix the changes and improve the efficiency of regression testing. Goal: This paper proposes bug-fixing change impact prediction (BFCP) for predicting, before running regression test cases, whether a bug-fixing change will break regression testing, by mining software change histories. Method: Our approach employs machine learning algorithms and static call-graph analysis. Given a bug-fixing change, BFCP first predicts whether it will break existing regression test cases; second, if the change is predicted to break regression test cases, BFCP further identifies the might-be-broken test cases. Results: Experiments on 552 real bug-fixing changes from four large open source projects show that BFCP achieves prediction precision up to 83.3%, recall up to 92.3%, and F-score up to 81.4%. For identifying might-be-broken test cases, BFCP achieves 100% recall.
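
To make the two-stage idea concrete, below is a minimal sketch (in Python, using scikit-learn) of how a BFCP-style pipeline could look: a classifier trained on features mined from past bug-fixing changes predicts whether a new change will break regression testing, and, if a break is predicted, a static call graph is traversed to find test cases that transitively reach the changed methods. The feature set, the call-graph representation, and the classifier choice are illustrative assumptions, not details taken from the paper.

# Minimal BFCP-style sketch (illustrative only, not the authors' implementation).
from sklearn.ensemble import RandomForestClassifier

# Hypothetical features mined from past bug-fixing changes, e.g.
# [lines changed, files touched, methods touched, author's prior commits].
historical_features = [
    [120, 4, 6, 30],
    [8, 1, 1, 250],
    [300, 9, 15, 12],
    [15, 2, 2, 180],
]
# Label: 1 = the change broke regression testing on its first run, 0 = it passed.
historical_labels = [1, 0, 1, 0]

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(historical_features, historical_labels)


def might_be_broken_tests(changed_methods, call_graph, test_entry_points):
    """Return test cases whose static call chains reach any changed method.

    call_graph maps a method name to the methods it calls;
    test_entry_points maps a test case name to its entry method.
    """
    def reaches_change(method, seen=None):
        if seen is None:
            seen = set()
        if method in changed_methods:
            return True
        if method in seen:
            return False
        seen.add(method)
        return any(reaches_change(callee, seen) for callee in call_graph.get(method, ()))

    return [test for test, entry in test_entry_points.items() if reaches_change(entry)]


# Stage 1: predict whether a new bug-fixing change will break regression testing.
new_change = [[95, 3, 5, 45]]                       # hypothetical feature values
will_break = clf.predict(new_change)[0]
print("predicted to break regression testing:", bool(will_break))

# Stage 2: if a break is predicted, identify the might-be-broken test cases.
if will_break:
    call_graph = {                                  # toy static call graph
        "test_checkout": ["Cart.total"],
        "Cart.total": ["Pricing.apply_discount"],
        "test_login": ["Auth.login"],
    }
    tests = {"testCheckout": "test_checkout", "testLogin": "test_login"}
    changed = {"Pricing.apply_discount"}            # methods touched by the fix
    print("might-be-broken test cases:", might_be_broken_tests(changed, call_graph, tests))

In the paper, the approach is evaluated on 552 real bug-fixing changes from four large open source projects; the sketch above only illustrates the shape of the two-stage pipeline, with toy data standing in for mined change histories and a real static call graph.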