{"title":"Fisher’s pioneering work on discriminant analysis and its impact on Artificial Intelligence","authors":"Kanti V. Mardia","doi":"10.1016/j.jmva.2024.105341","DOIUrl":null,"url":null,"abstract":"<div><p>Sir Ronald Aylmer Fisher opened many new areas in Multivariate Analysis, and the one which we will consider is discriminant analysis. Several papers by Fisher and others followed from his seminal paper in 1936 where he coined the name discrimination function. Historically, his four papers on discriminant analysis during 1936–1940 connect to the contemporaneous pioneering work of Hotelling and Mahalanobis. We revisit the famous iris data which Fisher used in his 1936 paper and in particular, test the hypothesis of multivariate normality for the data which he assumed. Fisher constructed his genetic discriminant motivated by this application and we provide a deeper insight into this construction; however, this construction has not been well understood as far as we know. We also indicate how the subject has developed along with the computer revolution, noting newer methods to carry out discriminant analysis, such as kernel classifiers, classification trees, support vector machines, neural networks, and deep learning. Overall, with computational power, the whole subject of Multivariate Analysis has changed its emphasis but the impact of this Fisher’s pioneering work continues as an integral part of supervised learning in Artificial Intelligence (AI).</p></div>","PeriodicalId":16431,"journal":{"name":"Journal of Multivariate Analysis","volume":null,"pages":null},"PeriodicalIF":1.4000,"publicationDate":"2024-06-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Multivariate Analysis","FirstCategoryId":"100","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0047259X24000484","RegionNum":3,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"STATISTICS & PROBABILITY","Score":null,"Total":0}
Abstract
Sir Ronald Aylmer Fisher opened many new areas in Multivariate Analysis, and the one we consider here is discriminant analysis. Several papers by Fisher and others followed from his seminal 1936 paper, in which he coined the name “discrimination function”. Historically, his four papers on discriminant analysis during 1936–1940 connect to the contemporaneous pioneering work of Hotelling and Mahalanobis. We revisit the famous iris data which Fisher used in his 1936 paper and, in particular, test the hypothesis of multivariate normality that he assumed for the data. Motivated by this application, Fisher constructed his genetic discriminant; as far as we know, this construction has not been well understood, and we provide a deeper insight into it. We also indicate how the subject has developed along with the computer revolution, noting newer methods for carrying out discriminant analysis, such as kernel classifiers, classification trees, support vector machines, neural networks, and deep learning. Overall, with computational power, the whole subject of Multivariate Analysis has changed its emphasis, but the impact of Fisher’s pioneering work continues as an integral part of supervised learning in Artificial Intelligence (AI).
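As a minimal illustration (not drawn from the paper itself), the sketch below applies Fisher’s linear discriminant to the iris data he analysed in 1936, using scikit-learn’s LinearDiscriminantAnalysis; the train/test split and random seed are arbitrary choices made here for demonstration.

```python
# Illustrative sketch: Fisher's linear discriminant on the iris data.
# Assumes scikit-learn is installed; the split and seed are arbitrary.
from sklearn.datasets import load_iris
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split

# Four measurements (sepal/petal length and width) and the three species labels.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0
)

# Fit the linear discriminant: it projects onto directions that maximise the
# ratio of between-class to within-class variance, then classifies accordingly.
lda = LinearDiscriminantAnalysis()
lda.fit(X_train, y_train)

print("Discriminant coefficients:\n", lda.coef_)
print("Held-out accuracy:", lda.score(X_test, y_test))
```

The fitted coefficients are linear combinations of the four measurements, in the spirit of Fisher’s original discrimination function, and the held-out accuracy indicates why the iris data became a canonical example for supervised classification.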
Journal Overview
Founded in 1971, the Journal of Multivariate Analysis (JMVA) is the central venue for the publication of new, relevant methodology and particularly innovative applications pertaining to the analysis and interpretation of multidimensional data.
The journal welcomes contributions to all aspects of multivariate data analysis and modeling, including cluster analysis, discriminant analysis, factor analysis, and multidimensional continuous or discrete distribution theory. Topics of current interest include, but are not limited to, inferential aspects of
Copula modeling
Functional data analysis
Graphical modeling
High-dimensional data analysis
Image analysis
Multivariate extreme-value theory
Sparse modeling
Spatial statistics.