Human recoverability index: A TraceLab experiment

Alex Dekhtyar, Michael C Hilton
DOI: 10.1109/TEFSE.2013.6620152
Published in: 2013 7th International Workshop on Traceability in Emerging Forms of Software Engineering (TEFSE)
Publication date: May 2013
Citations: 5

Abstract

It has been generally accepted that not all trace links in a given requirements traceability matrix are equal: both human analysts and automated methods are good at spotting some links, but have blind spots for others. One way to choose automated techniques for inclusion in assisted tracing processes (i.e., tracing processes that combine the expertise of a human analyst with special-purpose tracing software) is to select the techniques that tend to discover more of the links that are hard for human analysts to observe and establish on their own. This paper proposes a new measure of the performance of a tracing method: human recoverability index-based recall. Given knowledge about how difficult each link is for human analysts to recover, this measure rewards methods that are able to recover the hard-to-find links over methods that tend to recover the same links as the human analysts. We describe a TraceLab experiment we designed to evaluate automated trace recovery methods based on this measure and provide a case study of the use of this experiment to profile and evaluate different automated tracing techniques.
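The abstract does not give the exact formula for human recoverability index-based recall, but the idea it describes (rewarding methods that recover links humans find hard) can be sketched as a weighted recall. The following is a minimal illustrative assumption, not the paper's actual definition: each true link carries a hypothetical difficulty weight, higher for links human analysts rarely recover on their own, and recall is computed over those weights.

```python
def hri_weighted_recall(true_links, recovered_links, hri):
    """Hypothetical sketch of an HRI-weighted recall (not the paper's
    published formula).

    true_links:      set of link ids in the reference traceability matrix
    recovered_links: set of link ids found by the automated method
    hri:             dict mapping link id -> weight in (0, 1], where a
                     higher weight marks a link that is harder for human
                     analysts to recover (assumed encoding)
    """
    total = sum(hri[link] for link in true_links)
    found = sum(hri[link] for link in true_links & recovered_links)
    return found / total if total else 0.0


# Example: link "c" is hard for humans (weight 0.9), so a method that
# recovers only "c" scores higher than one recovering the two easy links.
true = {"a", "b", "c"}
weights = {"a": 0.1, "b": 0.2, "c": 0.9}
print(hri_weighted_recall(true, {"a", "b"}, weights))  # 0.25
print(hri_weighted_recall(true, {"c"}, weights))       # 0.75
```

Under this sketch, a method duplicating what human analysts already find contributes little, matching the selection criterion the abstract argues for.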