A programming paradigm for machine learning, with a case study of Bayesian networks
L. Allison
Australasian Computer Science Conference
DOI: 10.1145/1151699.1151712
Citations: 4
Abstract
Inductive programming is a new machine learning paradigm that combines functional programming, for writing statistical models, with information theory, to prevent overfitting. Type-classes specify general properties that models must have. Many statistical models, estimators and operators have polymorphic types. Useful operators combine models and estimators to form new ones; functional programming's compositional style is a great advantage in this domain. Complementing this, information theory provides a compositional measure of the complexity of a model from its parts. Inductive programming is illustrated by a case study of Bayesian networks. Networks are built from classification (decision) trees, and trees are built from partitioning functions and models on data-spaces. Trees, and hence networks, are general as a natural consequence of the method. The networks handle discrete and continuous variables, and missing values. Finally, the Bayesian networks are applied to a challenging data set on lost persons.
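The compositional style described above can be sketched in a few lines. The sketch below is an illustration only, not the paper's actual API: the names (`Model`, `multistate`, `product`, `tree`) are assumptions, Python stands in for the typed functional language the paradigm favours, and the message-length calculation prices the data under a model while omitting the cost of stating the model itself, which a full MML treatment includes.

```python
import math

class Model:
    """A statistical model over some data-space (hypothetical sketch).

    A model is just a value carrying a negative-log-probability
    function; operators below combine such values into new models.
    """
    def __init__(self, nl_pr):
        self.nl_pr = nl_pr          # datum -> -log2 P(datum), in bits

    def msg(self, data):
        """Message length (bits) of a data set under this model: the
        compositional information-theory measure of fit."""
        return sum(self.nl_pr(x) for x in data)

def multistate(probs):
    """Categorical (multistate) model from a dict of probabilities."""
    return Model(lambda x: -math.log2(probs[x]))

def product(m, n):
    """Combine two models into a model of pairs (independent joint):
    one example of an operator forming a new model from old ones."""
    return Model(lambda xy: m.nl_pr(xy[0]) + n.nl_pr(xy[1]))

def tree(partition, subtrees):
    """Classification tree: a partitioning function routes each datum
    to one of several sub-models, as in the paper's tree-building
    (again, ignoring the cost of stating the tree structure)."""
    return Model(lambda x: subtrees[partition(x)].nl_pr(x))

coin = multistate({"H": 0.5, "T": 0.5})
pair = product(coin, coin)
print(pair.msg([("H", "T"), ("H", "H")]))   # 4.0 bits: 2 bits per pair
```

Because `product` and `tree` return ordinary `Model` values, they nest freely, which is how trees, and networks built from trees, inherit generality from their parts.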