Usage of the Kullback–Leibler divergence on posterior Dirichlet distributions to create a training dataset for a learning algorithm to classify driving behaviour events
M. Cesarini, E. Brentegani, G. Ceci, F. Cerreta, D. Messina, F. Petrarca, M. Robutti
{"title":"Usage of the Kullback–Leibler divergence on posterior Dirichlet distributions to create a training dataset for a learning algorithm to classify driving behaviour events","authors":"M. Cesarini , E. Brentegani , G. Ceci , F. Cerreta , D. Messina , F. Petrarca , M. Robutti","doi":"10.1016/j.jcmds.2023.100081","DOIUrl":null,"url":null,"abstract":"<div><p>Information theory uses the Kullback–Leibler divergence to compare distributions. In this paper, we apply it to bayesian posterior distributions and we show how it can be used to train a machine learning algorithm as well. The data sample used in this study is an OCTOTelematics set of driving behaviour data.</p></div>","PeriodicalId":100768,"journal":{"name":"Journal of Computational Mathematics and Data Science","volume":"8 ","pages":"Article 100081"},"PeriodicalIF":0.0000,"publicationDate":"2023-08-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Computational Mathematics and Data Science","FirstCategoryId":"1085","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S2772415823000081","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0
Abstract
Information theory uses the Kullback–Leibler divergence to compare distributions. In this paper, we apply it to Bayesian posterior distributions and show how it can also be used to train a machine learning algorithm. The data sample used in this study is an OCTOTelematics set of driving behaviour data.
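The paper itself contains no code, but the central quantity it relies on, the KL divergence between two Dirichlet distributions, has a well-known closed form. The Python sketch below computes it and applies it to two Dirichlet posteriors obtained by a standard conjugate Dirichlet–multinomial update; the event categories, prior, and counts are purely hypothetical illustrations, not values from the study.

```python
import numpy as np
from scipy.special import gammaln, digamma

def dirichlet_kl(alpha, beta):
    """Closed-form KL divergence KL(Dir(alpha) || Dir(beta)).

    KL = ln G(a0) - ln G(b0) - sum[ln G(a_i) - ln G(b_i)]
         + sum[(a_i - b_i) * (psi(a_i) - psi(a0))]
    where a0 = sum(alpha), b0 = sum(beta), G is the gamma function
    and psi is the digamma function.
    """
    alpha = np.asarray(alpha, dtype=float)
    beta = np.asarray(beta, dtype=float)
    a0, b0 = alpha.sum(), beta.sum()
    return (gammaln(a0) - gammaln(b0)
            - np.sum(gammaln(alpha) - gammaln(beta))
            + np.sum((alpha - beta) * (digamma(alpha) - digamma(a0))))

# Hypothetical example: counts of driving events per category
# (e.g. harsh braking / harsh acceleration / harsh cornering).
# With a Dirichlet prior and multinomial counts, the posterior is
# again Dirichlet with parameters prior + counts (conjugacy).
prior = np.array([1.0, 1.0, 1.0])      # symmetric, uninformative prior
counts_a = np.array([40, 5, 3])        # illustrative counts, trip A
counts_b = np.array([10, 20, 15])      # illustrative counts, trip B

post_a = prior + counts_a              # posterior Dirichlet, trip A
post_b = prior + counts_b              # posterior Dirichlet, trip B

# Divergence between the two posteriors; note KL is asymmetric.
print(dirichlet_kl(post_a, post_b))
```

A pairwise divergence of this kind can serve as a distance-like feature between observed behaviour profiles, which is consistent with the abstract's use of posterior comparisons to build a training dataset, though the exact construction is detailed only in the full paper.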