{"title":"Defence against the modern arts: the curse of statistics: Part I—FRStat","authors":"Cedric Neumann","doi":"10.1093/lpr/mgaa004","DOIUrl":null,"url":null,"abstract":"For several decades, legal and scientific scholars have argued that conclusions from forensic examinations should be supported by statistical data and reported within a probabilistic framework. Multiple models have been proposed to quantify and express the probative value of forensic evidence. Unfortunately, the use of statistics to perform inferences in forensic science adds a layer of complexity that most forensic scientists, court offices and lay individuals are not armed to handle. Many applications of statistics to forensic science rely on ad hoc strategies and are not scientifically sound. The opacity of the technical jargon that is used to describe these probabilistic models and their results, and the complexity of the techniques involved make it very difficult for the untrained user to separate the wheat from the chaff. This series of article is intended to help forensic scientists and lawyers recognize limitations and issues in tools proposed to interpret the results of forensic examinations. This article focuses on the tool proposed by the Latent Print Branch of the U.S. Defense Forensic Science Center (DFSC) and called FRStat. 
In this article, I explore the compatibility of the results outputted by FRStat with the language used by the DFCS to report the conclusions of their fingerprint examinations, as well as the appropriateness of the statistical modelling underpinning the tool and the validation of its performance.","PeriodicalId":48724,"journal":{"name":"Law Probability & Risk","volume":"19 1","pages":"1-20"},"PeriodicalIF":1.4000,"publicationDate":"2020-04-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1093/lpr/mgaa004","citationCount":"3","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Law Probability & Risk","FirstCategoryId":"100","ListUrlMain":"https://ieeexplore.ieee.org/document/9254200/","RegionNum":4,"RegionCategory":"社会学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"LAW","Score":null,"Total":0}
Citations: 3
Abstract
For several decades, legal and scientific scholars have argued that conclusions from forensic examinations should be supported by statistical data and reported within a probabilistic framework. Multiple models have been proposed to quantify and express the probative value of forensic evidence. Unfortunately, the use of statistics to perform inferences in forensic science adds a layer of complexity that most forensic scientists, court officers and lay individuals are not equipped to handle. Many applications of statistics to forensic science rely on ad hoc strategies and are not scientifically sound. The opacity of the technical jargon used to describe these probabilistic models and their results, and the complexity of the techniques involved, make it very difficult for the untrained user to separate the wheat from the chaff. This series of articles is intended to help forensic scientists and lawyers recognize limitations and issues in tools proposed to interpret the results of forensic examinations. This article focuses on the tool, called FRStat, proposed by the Latent Print Branch of the U.S. Defense Forensic Science Center (DFSC). In this article, I explore the compatibility of the results output by FRStat with the language used by the DFSC to report the conclusions of their fingerprint examinations, as well as the appropriateness of the statistical modelling underpinning the tool and the validation of its performance.
Journal description:
Law, Probability & Risk is a fully refereed journal that publishes papers dealing with topics at the interface of law and probabilistic reasoning. These are interpreted broadly to include aspects relevant to the interpretation of scientific evidence, the assessment of uncertainty and the assessment of risk. The readership includes academic lawyers, mathematicians, statisticians and social scientists with interests in quantitative reasoning.
The primary objective of the journal is to cover issues in law that have a scientific element, with an emphasis on statistical and probabilistic issues and the assessment of risk.
Examples of topics that may be covered include communications law, computers and the law, environmental law, law and medicine, regulatory law for science and technology, identification problems (such as DNA, but including other materials), sampling issues (drugs, computer pornography, fraud), offender profiling, credit scoring, risk assessment, the role of statistics and probability in drafting legislation, and the assessment of competing theories of evidence (possibly with a view to forming an optimal combination of them). In addition, a whole new area is emerging in the application of computers to medicine and other safety-critical areas. New legislation is required to define the responsibility of computer experts who develop software for tackling these safety-critical problems.