{"title":"Variance Guided Continual Learning in a Convolutional Neural Network Gaussian Process Single Classifier Approach for Multiple Tasks in Noisy Images","authors":"Mahed Javed, L. Mihaylova, N. Bouaynaya","doi":"10.23919/fusion49465.2021.9626907","DOIUrl":null,"url":null,"abstract":"This work provides a continual learning solution in a single-classifier to multiple classification tasks with various data sets. A Gaussian process (GP) is combined with a Convolutional Neural Network (CNN) feature extractor architecture (CNNGP). Post softmax samples are used to estimate the variance. The variance is characterising the impact of uncertainties and is part of the update process for the learning rate parameters. Within the proposed framework two learning approaches are adopted: 1) in the first, the weights of the CNN are deterministic and only the GP learning rate is updated, 2) in the second setting, prior distributions are adopted for the CNN weights. Both the learning rates of the CNN and the GP are updated. The algorithm is trained on two variants of the MNIST dataset, split-MNIST and permuted-MNIST. Results are compared with the Uncertainty Guided Continual Bayesian Networks (UCB) multi-classifier approach [1]. The validation shows that the proposed algorithm in the Bayesian setting outperforms the UCB in tasks subject to Gaussian noise image noises and shows robustness.","PeriodicalId":226850,"journal":{"name":"2021 IEEE 24th International Conference on Information Fusion (FUSION)","volume":"55 5","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 IEEE 24th International Conference on Information Fusion (FUSION)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.23919/fusion49465.2021.9626907","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
This work provides a single-classifier continual learning solution for multiple classification tasks over different data sets. A Gaussian process (GP) is combined with a Convolutional Neural Network (CNN) feature extractor architecture (CNNGP). Post-softmax samples are used to estimate the predictive variance. This variance characterises the impact of uncertainty and drives the update of the learning rate parameters. Within the proposed framework two learning approaches are adopted: 1) in the first, the CNN weights are deterministic and only the GP learning rate is updated; 2) in the second, prior distributions are placed over the CNN weights, and the learning rates of both the CNN and the GP are updated. The algorithm is trained on two variants of the MNIST dataset, split-MNIST and permuted-MNIST. Results are compared with the Uncertainty Guided Continual Bayesian Networks (UCB) multi-classifier approach [1]. The validation shows that, in the Bayesian setting, the proposed algorithm outperforms UCB on tasks subject to Gaussian image noise and demonstrates robustness to such noise.
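To make the variance-guided idea concrete, the following minimal PyTorch sketch illustrates the general mechanism the abstract describes: repeated stochastic post-softmax samples give a predictive-variance estimate, which then scales the optimiser's learning rate. This is not the authors' CNNGP implementation; the small CNN, the use of MC dropout in place of the GP posterior, and the scaling rule `variance_guided_lr` are illustrative assumptions.

```python
# Hypothetical sketch of variance-guided learning-rate scaling.
# Assumes MC dropout as a stand-in for drawing post-softmax samples;
# the paper's exact CNNGP update rule is not reproduced here.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SmallCNN(nn.Module):
    """Minimal CNN feature extractor plus linear head (stand-in for the CNNGP)."""
    def __init__(self, n_classes=10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Dropout(0.2),   # source of stochasticity for post-softmax sampling
            nn.Flatten(),
        )
        self.head = nn.Linear(16 * 14 * 14, n_classes)

    def forward(self, x):
        return self.head(self.features(x))

def posterior_variance(model, x, n_samples=10):
    """Estimate predictive variance from repeated stochastic softmax samples."""
    model.train()  # keep dropout active so each forward pass differs
    with torch.no_grad():
        probs = torch.stack(
            [F.softmax(model(x), dim=1) for _ in range(n_samples)]
        )
    return probs.var(dim=0).mean()  # scalar summary of predictive uncertainty

def variance_guided_lr(base_lr, variance, floor=0.1):
    """Scale the learning rate with predictive variance: low variance
    (confident, consolidated knowledge) yields a smaller step, protecting
    earlier tasks. An illustrative rule, not the paper's exact update."""
    return base_lr * max(float(variance), floor)

# Usage sketch on an MNIST-shaped batch
model = SmallCNN()
x = torch.randn(32, 1, 28, 28)
var = posterior_variance(model, x)
opt = torch.optim.SGD(model.parameters(), lr=variance_guided_lr(1e-2, var))
```

The design choice mirrors the abstract's two settings at a high level: with deterministic CNN weights only the classifier's learning rate would be modulated this way, whereas placing priors over the CNN weights would let the same variance signal modulate the CNN learning rate as well.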