{"title":"SE-RCN:经济型胶囊网络","authors":"Sami Naqvi, M. El-Sharkawy","doi":"10.1109/ICICT58900.2023.00017","DOIUrl":null,"url":null,"abstract":"As the Convolutional Neural Networks (CNNs) became more prominent in the field of Computer Vision (CV) their disadvantages gradually became apparent. By sharing transformation matrices between the different levels of a capsule, the Capsule Network (CapsNet) innovated the method of solving affine transformation problems. While the ResNet, it introduces skip connections, which makes deeper networks more powerful and solves the vanishing gradient problem. Fusing the advantageous ideas of CapsNet and ResNet with Squeeze and Excite (SE) block, this paper presents SE-Residual Capsule Network (SE-RCN), a neural network model. In the proposed model, skip connections and SE block take the place of the traditional convolutional layer of CapsNet, reducing the complexity of the network. Based on MNIST and CIFAR-10 datasets, the performance of the model is demonstrated with a substantial reduction in parameters when compared to similar neural networks.","PeriodicalId":425057,"journal":{"name":"2023 6th International Conference on Information and Computer Technologies (ICICT)","volume":"11 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"SE-RCN: An Economical Capsule Network\",\"authors\":\"Sami Naqvi, M. El-Sharkawy\",\"doi\":\"10.1109/ICICT58900.2023.00017\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"As the Convolutional Neural Networks (CNNs) became more prominent in the field of Computer Vision (CV) their disadvantages gradually became apparent. By sharing transformation matrices between the different levels of a capsule, the Capsule Network (CapsNet) innovated the method of solving affine transformation problems. 
While the ResNet, it introduces skip connections, which makes deeper networks more powerful and solves the vanishing gradient problem. Fusing the advantageous ideas of CapsNet and ResNet with Squeeze and Excite (SE) block, this paper presents SE-Residual Capsule Network (SE-RCN), a neural network model. In the proposed model, skip connections and SE block take the place of the traditional convolutional layer of CapsNet, reducing the complexity of the network. Based on MNIST and CIFAR-10 datasets, the performance of the model is demonstrated with a substantial reduction in parameters when compared to similar neural networks.\",\"PeriodicalId\":425057,\"journal\":{\"name\":\"2023 6th International Conference on Information and Computer Technologies (ICICT)\",\"volume\":\"11 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2023-03-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2023 6th International Conference on Information and Computer Technologies (ICICT)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ICICT58900.2023.00017\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2023 6th International Conference on Information and Computer Technologies (ICICT)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICICT58900.2023.00017","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract: As Convolutional Neural Networks (CNNs) became more prominent in the field of Computer Vision (CV), their disadvantages gradually became apparent. By sharing transformation matrices between the different levels of a capsule, the Capsule Network (CapsNet) introduced a new way of handling affine transformations. ResNet, in turn, introduced skip connections, which make deeper networks trainable and mitigate the vanishing gradient problem. Fusing the advantageous ideas of CapsNet and ResNet with the Squeeze-and-Excitation (SE) block, this paper presents the SE-Residual Capsule Network (SE-RCN), a neural network model in which skip connections and an SE block replace the traditional convolutional layers of CapsNet, reducing the complexity of the network. On the MNIST and CIFAR-10 datasets, the model demonstrates its performance with a substantial reduction in parameters compared to similar neural networks.
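For readers unfamiliar with the SE mechanism the abstract refers to: an SE block "squeezes" a feature map to per-channel statistics, passes them through a small bottleneck MLP, and uses the resulting sigmoid gates to rescale each channel. The NumPy sketch below is purely illustrative and is not the paper's implementation; all shapes, weights, and the reduction ratio are hypothetical.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def se_block(x, w1, w2):
    """Squeeze-and-Excitation over a feature map x of shape (C, H, W).

    w1: (C//r, C) reduction weights, w2: (C, C//r) expansion weights,
    where r is the (hypothetical) reduction ratio.
    """
    # Squeeze: global average pool over the spatial dimensions -> (C,)
    z = x.mean(axis=(1, 2))
    # Excitation: bottleneck MLP (ReLU, then sigmoid gate) -> (C,) in (0, 1)
    s = sigmoid(w2 @ np.maximum(w1 @ z, 0.0))
    # Scale: reweight each input channel by its gate
    return x * s[:, None, None]

# Toy example with random weights, reduction ratio r = 4
rng = np.random.default_rng(0)
C, H, W, r = 8, 4, 4, 4
x = rng.standard_normal((C, H, W))
y = se_block(x, rng.standard_normal((C // r, C)), rng.standard_normal((C, C // r)))
print(y.shape)  # (8, 4, 4)
```

Because the gates lie in (0, 1), the block can only attenuate channels, never amplify them; learned weights decide which channels are emphasized relative to the others.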