Evaluating guidelines for empirical software engineering studies

B. Kitchenham, H. Al-Kilidar, M. Babar, Michael Berry, Karl Cox, J. Keung, F. Kurniawati, M. Staples, He Zhang, Liming Zhu
{"title":"Evaluating guidelines for empirical software engineering studies","authors":"B. Kitchenham, H. Al-Kilidar, M. Babar, Michael Berry, Karl Cox, J. Keung, F. Kurniawati, M. Staples, He Zhang, Liming Zhu","doi":"10.1145/1159733.1159742","DOIUrl":null,"url":null,"abstract":"Background. Several researchers have criticized the standards of performing and reporting empirical studies in software engineering. In order to address this problem, Andreas Jedlitschka and Dietmar Pfahl have produced reporting guidelines for controlled experiments in software engineering. They pointed out that their guidelines needed evaluation. We agree that guidelines need to be evaluated before they can be widely adopted. If guidelines are flawed, they will cause more problems that they solve.Aim. The aim of this paper is to present the method we used to evaluate the guidelines and report the results of our evaluation exercise. We suggest our evaluation process may be of more general use if reporting guidelines for other types of empirical study are developed.Method. We used perspective-based inspections to perform a theoretical evaluation of the guidelines. A separate inspection was performed for each perspective. The perspectives used were: Researcher, Practitioner/Consultant, Meta-analyst, Replicator, Reviewer and Author. Apart from the Author perspective, the inspections were based on a set of questions derived by brainstorming. The inspection using the Author perspective reviewed each section of the guidelines sequentially. Results. The question-based perspective inspections detected 42 issues where the guidelines would benefit from amendment or clarification and 8 defects.Conclusions. Reporting guidelines need to specify what information goes into what section and avoid excessive duplication. Software engineering researchers need to be cautious about adopting reporting guidelines that differ from those used by other disciplines. 
The current guidelines need to be revised and the revised guidelines need to be subjected to further theoretical and empirical validation. Perspective-based inspection is a useful validation method but the practitioner/consultant perspective presents difficulties.","PeriodicalId":201305,"journal":{"name":"International Symposium on Empirical Software Engineering","volume":"110 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2006-09-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"51","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Symposium on Empirical Software Engineering","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/1159733.1159742","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

Background. Several researchers have criticized the standards of performing and reporting empirical studies in software engineering. To address this problem, Andreas Jedlitschka and Dietmar Pfahl produced reporting guidelines for controlled experiments in software engineering. They pointed out that their guidelines needed evaluation. We agree that guidelines need to be evaluated before they can be widely adopted: if guidelines are flawed, they will cause more problems than they solve.

Aim. The aim of this paper is to present the method we used to evaluate the guidelines and to report the results of our evaluation exercise. We suggest our evaluation process may be of more general use if reporting guidelines for other types of empirical study are developed.

Method. We used perspective-based inspections to perform a theoretical evaluation of the guidelines. A separate inspection was performed for each perspective. The perspectives used were: Researcher, Practitioner/Consultant, Meta-analyst, Replicator, Reviewer, and Author. Apart from the Author perspective, the inspections were based on a set of questions derived by brainstorming. The inspection using the Author perspective reviewed each section of the guidelines sequentially.

Results. The question-based perspective inspections detected 42 issues where the guidelines would benefit from amendment or clarification, and 8 defects.

Conclusions. Reporting guidelines need to specify what information goes into what section and avoid excessive duplication. Software engineering researchers need to be cautious about adopting reporting guidelines that differ from those used by other disciplines. The current guidelines need to be revised, and the revised guidelines need to be subjected to further theoretical and empirical validation. Perspective-based inspection is a useful validation method, but the practitioner/consultant perspective presents difficulties.