{"title":"Statistical Methods for Actuaries.","authors":"P. Carroll","doi":"10.1017/S0020269X00010161","DOIUrl":null,"url":null,"abstract":"STATISTICAL METHODS are central to actuarial investigation. This emphasis has increased with the move away from a primary concern with life and pension matters, where the focus is on deterministic models. The increasing pressure on available space in the professional examination syllabus means that it is important that an appropriate selection is made from the available statistical methodology. Unfortunately actuarial problems are characterized by skewed distributions, correlated errors, situations which call for multiplicative rather than additive models, and a host of other 'non-standard' features. The paper comprises a wide ranging review of modern statistical methodology and an appraisal of the value of each method for actuarial investigations. An extension of the basic techniques already central to actuarial training is advocated. In particular the summary and display of data including multivariate data, the use of data transformations, distribution-free (non-parametric) inference, the use of a wider range of distributions, in particular the 'stable-law' family, and a greater appreciation of the secondary use of data from government and market research sources. All this is possible with only a limited extension of the calculus and algebra requirements for students and by exploiting modern computing resources. The paper then deals in detail with the major subject areas which should be considered part of professional or post-qualifying training. Multivariate methods have 'come alive' by virtue of computing power. No other body concerned with large data bases has ignored them. The uses of multiple regression, principal components, factor analysis, cluster analysis, multi-dimensional scaling, correspondence analysis, canonical variate analysis and discriminant function analysis are outlined. Examples of the use of each technique are described. Survival analysis has developed with increasing rapidity since Cox's 1972 paper on Regression Models and Life Tables. The principles of survival analysis are entirely consistent with traditional actuarial methods. The straightforward methods of estimating survival distributions for individual level data and the non-parametric testing of hypotheses are ideally suited to the examination syllabus. The full Cox model with its estimation difficulties is described in detail. An appreciation of these methods is essential for the preparation of life underwriting manuals using the recent literature in medical statistics. The whole","PeriodicalId":419781,"journal":{"name":"Journal of the Staple Inn Actuarial Society","volume":"112 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"1987-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of the Staple Inn Actuarial Society","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1017/S0020269X00010161","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 1
Abstract
STATISTICAL METHODS are central to actuarial investigation, and their importance has increased with the move away from a primary concern with life and pension matters, where the focus is on deterministic models. The growing pressure on space in the professional examination syllabus makes it important that an appropriate selection is made from the available statistical methodology. Unfortunately, actuarial problems are characterized by skewed distributions, correlated errors, situations which call for multiplicative rather than additive models, and a host of other 'non-standard' features.

The paper comprises a wide-ranging review of modern statistical methodology and an appraisal of the value of each method for actuarial investigations. An extension of the basic techniques already central to actuarial training is advocated: the summary and display of data, including multivariate data; the use of data transformations; distribution-free (non-parametric) inference; the use of a wider range of distributions, in particular the 'stable-law' family; and a greater appreciation of the secondary use of data from government and market research sources. All of this is possible with only a limited extension of the calculus and algebra requirements for students, and by exploiting modern computing resources.

The paper then deals in detail with the major subject areas which should be considered part of professional or post-qualifying training. Multivariate methods have 'come alive' by virtue of computing power; no other profession concerned with large databases has ignored them. The uses of multiple regression, principal components, factor analysis, cluster analysis, multi-dimensional scaling, correspondence analysis, canonical variate analysis and discriminant function analysis are outlined, with examples of each technique.

Survival analysis has developed with increasing rapidity since Cox's 1972 paper, 'Regression Models and Life Tables'. Its principles are entirely consistent with traditional actuarial methods. The straightforward methods of estimating survival distributions from individual-level data, and the non-parametric testing of hypotheses, are ideally suited to the examination syllabus. The full Cox model, with its estimation difficulties, is described in detail. An appreciation of these methods is essential for the preparation of life underwriting manuals using the recent literature in medical statistics. The whole
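The abstract's point about multiplicative rather than additive models is commonly handled with a log transformation, one of the data transformations it advocates. A minimal sketch in Python follows; the paper itself gives no code, and the claim-severity model, coefficients and data below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical claim severities: multiplicative in an exposure measure,
# with positively skewed (lognormal) errors -- the 'non-standard'
# features the abstract describes.
exposure = rng.uniform(1.0, 10.0, size=200)
claims = 500.0 * exposure ** 0.8 * rng.lognormal(0.0, 0.3, size=200)

# Taking logs turns the multiplicative model y = a * x^b * e into the
# additive model log y = log a + b log x + log e, which ordinary
# least squares can fit.
X = np.column_stack([np.ones_like(exposure), np.log(exposure)])
log_a, b = np.linalg.lstsq(X, np.log(claims), rcond=None)[0]

print(f"estimated a = {np.exp(log_a):.1f}, b = {b:.3f}")  # near 500 and 0.8
```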
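Among the multivariate methods listed, principal components is the most compact to illustrate: it reduces to an eigendecomposition of the sample covariance matrix of centred data. A self-contained sketch on invented rating-factor data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical portfolio data: 100 policies, 4 correlated rating factors.
X = rng.multivariate_normal(
    mean=np.zeros(4),
    cov=[[1.0, 0.8, 0.3, 0.1],
         [0.8, 1.0, 0.3, 0.1],
         [0.3, 0.3, 1.0, 0.5],
         [0.1, 0.1, 0.5, 1.0]],
    size=100,
)

# Principal components: eigenvectors of the covariance matrix of the
# centred data, ordered by the variance (eigenvalue) they explain.
Xc = X - X.mean(axis=0)
eigvals, eigvecs = np.linalg.eigh(np.cov(Xc, rowvar=False))  # ascending
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

scores = Xc @ eigvecs                  # component scores per policy
explained = eigvals / eigvals.sum()
print("proportion of variance explained:", np.round(explained, 3))
```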
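The straightforward estimation of survival distributions from individual-level data is typified by the Kaplan-Meier product-limit estimator; the abstract does not spell it out, so the durations and censoring indicators below are invented:

```python
import numpy as np

# Hypothetical individual-level data: observed durations (e.g. policy
# years survived) and an indicator of whether death was observed (1)
# or the observation was censored (0).
durations = np.array([2.0, 3.0, 3.0, 4.0, 5.0, 5.0, 6.0, 7.0, 8.0, 8.0])
observed = np.array([1, 1, 0, 1, 1, 0, 1, 0, 1, 1])

# Kaplan-Meier product-limit estimate: at each distinct death time t,
# multiply the running survival probability by (1 - d_t / n_t), where
# d_t deaths occur among the n_t individuals still at risk just before t.
surv = 1.0
for t in np.unique(durations[observed == 1]):
    at_risk = np.sum(durations >= t)
    deaths = np.sum((durations == t) & (observed == 1))
    surv *= 1.0 - deaths / at_risk
    print(f"S({t:.0f}) = {surv:.3f}")
```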
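The 'estimation difficulties' of the full Cox model stem from having to maximize its partial likelihood numerically. A sketch of that estimation, assuming no tied event times and using invented covariates and censoring; scipy's general-purpose optimizer stands in for the specialized routines a production fit would use:

```python
import numpy as np
from scipy.optimize import minimize

def neg_partial_loglik(beta, times, events, X):
    """Negative Cox partial log-likelihood, assuming no tied event times."""
    eta = X @ beta
    ll = 0.0
    for i in np.flatnonzero(events):
        at_risk = times >= times[i]      # risk set just before this event
        ll += eta[i] - np.log(np.exp(eta[at_risk]).sum())
    return -ll

# Invented data: 50 lives, two covariates, exponential event times whose
# hazard depends on the true coefficients, with random censoring.
rng = np.random.default_rng(2)
X = rng.normal(size=(50, 2))
true_beta = np.array([0.7, -0.4])
t_event = rng.exponential(1.0 / np.exp(X @ true_beta))
t_cens = rng.exponential(2.0, size=50)
times = np.minimum(t_event, t_cens)
events = (t_event <= t_cens).astype(int)

fit = minimize(neg_partial_loglik, x0=np.zeros(2), args=(times, events, X))
print("estimated beta:", np.round(fit.x, 2))  # roughly recovers true_beta
```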