Title: Human performance estimating with analogy and regression models: an empirical validation
Authors: E. Stensrud, I. Myrtveit
DOI: 10.1109/METRIC.1998.731247
Published in: Proceedings Fifth International Software Metrics Symposium. Metrics (Cat. No.98TB100262)
Publication date: 1998-03-20
Citations: 101
Abstract
Most cost estimation models appear to be validated without testing human performance, using data sets from custom software projects where the software is typically sized in lines of code (SLOC) or function points. From a practitioner's point of view, this research does not address some important aspects of IT projects that we observe: estimating in an industrial environment is performed by people, not models; COTS projects are gaining market share at the expense of traditional custom software projects; and industrial projects use a large variety of metrics to size project deliverables and estimate costs. Estimation-by-analogy tools such as ANGEL, as well as multiple regression analysis, provide the necessary flexibility in the choice of input parameters. We describe an experiment to evaluate human performance in which the subjects were aided by analogy and regression tools, respectively. 68 partners and managers at Andersen Consulting estimated 48 different COTS projects. The results, in terms of MMRE, indicate that users benefit from both tools, though more from regression models than from analogy models such as ANGEL. Furthermore, the performance of the ANGEL tool itself is not superior to that of the regression model. This result contradicts previous studies claiming that ANGEL outperforms multiple regression.
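The abstract compares the tools by MMRE without defining it; a minimal sketch of the standard definition, MMRE = (1/n) Σ |actual − estimated| / actual, using hypothetical effort values that are not taken from the study's data:

```python
def mmre(actuals, estimates):
    """Mean Magnitude of Relative Error: average of |actual - estimate| / actual."""
    if len(actuals) != len(estimates) or not actuals:
        raise ValueError("need equally sized, non-empty sequences")
    return sum(abs(a - e) / a for a, e in zip(actuals, estimates)) / len(actuals)

# Hypothetical project efforts (e.g., person-hours), for illustration only.
actual = [1000, 2500, 400]
estimated = [1200, 2000, 500]
print(round(mmre(actual, estimated), 3))  # → 0.217
```

Lower MMRE means better estimation accuracy, which is the sense in which the study's regression-aided subjects outperformed the analogy-aided ones.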