V. D, L. Venkataramana, S. S, Sarah Mathew, S. V. Concurrent Engineering 6(1): 405-414. Published 2021-11-05. DOI: 10.1177/1063293X211058450
Bayesian approach to incremental batch learning on forest cover sensor data for multiclass classification
Deep neural networks stack many hidden layers to perform nonlinear transformations at multiple levels. Although deep learning approaches achieve good results, they suffer from catastrophic forgetting: performance on previously learned classes degrades when a new class is added. Incremental learning retains existing knowledge as new data is acquired; the model trains on successive batches, and later learning sessions do not require the data used in earlier iterations. The Bayesian approach to incremental learning treats the network weights as probability distributions: applying Bayes' theorem yields an updated posterior distribution over the weights and biases, so beliefs can be revised iteratively as new data arrives. The Bayesian model for incremental learning achieved an accuracy of 82%, and its execution time was shorter on a GPU (670 s) than on a CPU (1165 s).
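The paper's model is a Bayesian neural network, which the abstract does not specify in detail. As a minimal sketch of the core mechanism it describes — the posterior after one batch becomes the prior for the next, so earlier batches never need to be revisited — here is a conjugate Gaussian update on a scalar parameter (all function and variable names are hypothetical, chosen only for illustration):

```python
import numpy as np

def bayes_update(prior_mean, prior_var, batch, noise_var=1.0):
    """Conjugate Gaussian update of the belief over an unknown mean.

    Posterior precision adds the batch's precision to the prior's;
    the returned posterior serves as the prior for the next batch,
    so no previously seen data needs to be stored.
    """
    n = len(batch)
    post_var = 1.0 / (1.0 / prior_var + n / noise_var)
    post_mean = post_var * (prior_mean / prior_var + batch.sum() / noise_var)
    return post_mean, post_var

rng = np.random.default_rng(0)
data = rng.normal(3.0, 1.0, size=100)

# Incremental: fold in two batches sequentially, discarding each after use.
m, v = 0.0, 10.0  # broad initial prior
for batch in (data[:50], data[50:]):
    m, v = bayes_update(m, v, batch)

# A single all-at-once update from the same prior gives the same posterior,
# which is exactly why incremental Bayesian learning can drop old batches.
m_full, v_full = bayes_update(0.0, 10.0, data)
assert np.isclose(m, m_full) and np.isclose(v, v_full)
```

In a Bayesian neural network the same principle applies per weight, but the posterior is intractable and must be approximated (e.g. variationally); the conjugate case above is the one setting where the batch-by-batch update is exact.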