Exploiting Symmetries of Distributions in CNNs and Folded Coding
Ehsan Emad Marvasti, Amir Emad Marvasti, H. Foroosh
2018 15th Conference on Computer and Robot Vision (CRV), May 2018. DOI: 10.1109/CRV.2018.00017
Abstract
We introduce the concept of "Folded Coding" for continuous univariate distributions, which estimates the distribution and codes the samples simultaneously. Folded Coding assumes symmetries in the distribution and, when the symmetry assumption is satisfied, requires significantly fewer parameters than conventional models. We incorporate the mechanics of Folded Coding into Convolutional Neural Networks (CNNs) in the form of layers referred to as Binary Expanded ReLU (BEReLU) Shared Convolutions and Instance Fully Connected (I-FC). BEReLU and I-FC force the network to behave symmetrically in the space of samples, so similar prediction patterns are applied to regions of the space where the model has no observed samples. We experimented with BEReLU on generic networks of different parameter sizes on CIFAR-10 and CIFAR-100. Our experiments show increased accuracy for models equipped with the BEReLU layer when the number of parameters is small, while their performance remains similar to the original network as the number of parameters grows. The experiments provide further evidence that estimating distribution symmetry is part of CNNs' functionality.
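The abstract does not spell out how BEReLU enforces symmetric behavior, so the following is only a minimal PyTorch sketch of one plausible reading: the activation is assumed to expand each response into its positive and negative rectified parts ("folds"), and a single set of convolution weights is shared across both folds. The class and parameter names (BEReLUSketch, FoldedSharedConv) are hypothetical and not taken from the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class BEReLUSketch(nn.Module):
    """Hypothetical binary expansion: emit relu(x) and relu(-x) as separate folds."""

    def forward(self, x):
        # Output has twice the input channels: one fold per sign of the input.
        return torch.cat([F.relu(x), F.relu(-x)], dim=1)


class FoldedSharedConv(nn.Module):
    """Apply one shared convolution to both folds (assumed parameter sharing)."""

    def __init__(self, in_channels, out_channels):
        super().__init__()
        self.expand = BEReLUSketch()
        self.conv = nn.Conv2d(in_channels, out_channels, kernel_size=3, padding=1)

    def forward(self, x):
        folds = self.expand(x)            # (N, 2*C, H, W)
        pos, neg = folds.chunk(2, dim=1)  # split the two sign folds
        # The same weights process each fold, so the block responds with the
        # same pattern to x and -x, up to the ordering of the output folds.
        return torch.cat([self.conv(pos), self.conv(neg)], dim=1)
```

Under this assumed reading, swapping the sign of the input only permutes the output folds, which is one concrete way a layer could impose the kind of symmetric functionality the abstract describes.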