Title: Defence against the modern arts: the curse of statistics—Part II: ‘Score-based likelihood ratios’
Authors: Cedric Neumann; Madeline Ausdemore
DOI: 10.1093/lpr/mgaa006
Journal: Law, Probability & Risk, Vol. 19, No. 1, pp. 21–42 (JCR Q1, LAW; Impact Factor 1.4)
Publication date: 2020-04-01
Publication type: Journal Article
Citations: 17
Abstract
For several decades, legal and scientific scholars have argued that conclusions from forensic examinations should be supported by statistical data and reported within a probabilistic framework. Multiple models have been proposed to quantify and express the probative value of forensic evidence. Unfortunately, the use of statistics to perform inferences in forensic science adds a layer of complexity that most forensic scientists, court officers and lay individuals are not equipped to handle. Many applications of statistics to forensic science rely on ad hoc strategies and are not scientifically sound. The opacity of the technical jargon used to describe probabilistic models and their results, and the complexity of the techniques involved, make it very difficult for the untrained user to separate the wheat from the chaff. This series of papers is intended to help forensic scientists and lawyers recognize limitations and issues in tools proposed to interpret the results of forensic examinations. This article focuses on tools that have been proposed to leverage similarity scores to assess the probative value of forensic findings. We call this family of tools ‘score-based likelihood ratios’. In this article, we present the fundamental concepts on which these tools are built, we describe some specific members of this family of tools, and we compare them to the Bayes factor through an intuitive geometrical approach and through simulations. Finally, we discuss their validation and their potential usefulness as a decision-making tool in forensic science.
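To make the core idea concrete: a score-based likelihood ratio compares the probability density of an observed similarity score under the hypothesis that the two compared items share a source with its density under the hypothesis that they do not. The following is a minimal illustrative sketch, not the authors' method: the score distributions, their parameters, and the kernel density estimation step are all assumptions chosen for illustration.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)

# Hypothetical training data: similarity scores from pairs known to share
# a source, and from pairs known to come from different sources. The
# Gaussian shapes and parameters here are purely illustrative assumptions.
same_source_scores = rng.normal(loc=0.80, scale=0.08, size=1000)
diff_source_scores = rng.normal(loc=0.40, scale=0.12, size=1000)

# Model each score distribution with a kernel density estimate.
f_same = gaussian_kde(same_source_scores)
f_diff = gaussian_kde(diff_source_scores)

def slr(score: float) -> float:
    """Score-based likelihood ratio: the density of the observed score
    under the same-source model divided by its density under the
    different-source model."""
    return float(f_same(score)[0] / f_diff(score)[0])

# A high similarity score for the questioned pair yields SLR >> 1
# (supports same source); a low score yields SLR << 1.
print(f"SLR at s=0.75: {slr(0.75):.2f}")
print(f"SLR at s=0.35: {slr(0.35):.4f}")
```

Note that, as the article discusses, an SLR of this kind is not the same quantity as the Bayes factor: it conditions on a one-dimensional summary (the score) rather than on the full evidence, which is precisely the distinction the paper explores geometrically and by simulation.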
About the journal:
Law, Probability & Risk is a fully refereed journal which publishes papers dealing with topics on the interface of law and probabilistic reasoning. These are interpreted broadly to include aspects relevant to the interpretation of scientific evidence, the assessment of uncertainty and the assessment of risk. The readership includes academic lawyers, mathematicians, statisticians and social scientists with interests in quantitative reasoning.
The primary objective of the journal is to cover issues in law that have a scientific element, with an emphasis on statistical and probabilistic issues and the assessment of risk.
Examples of topics which may be covered include communications law, computers and the law, environmental law, law and medicine, regulatory law for science and technology, identification problems (such as DNA but including other materials), sampling issues (drugs, computer pornography, fraud), offender profiling, credit scoring, risk assessment, the role of statistics and probability in drafting legislation, the assessment of competing theories of evidence (possibly with a view to forming an optimal combination of them). In addition, a whole new area is emerging in the application of computers to medicine and other safety-critical areas. New legislation is required to define the responsibility of computer experts who develop software for tackling these safety-critical problems.