Title: Orthogonal Capsule Networks With Positional Information Preservation and Lightweight Feature Learning
Author: Yuerong Xue
Journal: IEEE Transactions on Neural Networks and Learning Systems (JCR Q1, Computer Science, Artificial Intelligence; Impact Factor 10.2)
DOI: https://doi.org/10.1109/TNNLS.2024.3443814
Published: 2024-10-02 (Journal Article)
Citations: 0
Abstract
Both transformer and convolutional neural network (CNN) models require supplementary elements to acquire positional information. To address this issue, we propose a novel orthogonal capsule network (OrthogonalCaps) that preserves positional information during lightweight feature learning. The proposed network simplifies complex training pipelines and enables end-to-end training for object detection tasks. Specifically, there is no need to solve the position-regression and object-classification problems separately, nor to encode positional information as an additional token, as in transformer models. We generate the next capsule layer via orthogonality-based dynamic routing, which reduces the number of parameters and preserves positional information through its voting mechanism. Moreover, we propose Capsule ReLU as an activation function to avoid vanishing gradients and to facilitate capsule normalization across various scales, empowering OrthogonalCaps to better adapt to objects of diverse scales. OrthogonalCaps demonstrates accuracy and run-time performance on a par with Faster R-CNN on the VOC dataset, and it outperforms the baseline approach in detecting small-scale samples. The simulation results suggest that the proposed network surpasses other capsule network (CapsNet) models in achieving a favorable balance between parameter count and accuracy. Furthermore, an ablation experiment indicates that both Capsule ReLU and orthogonality-based dynamic routing play essential roles in enhancing classification performance. The training code and pretrained models are available at https://github.com/l1ack/OrthogonalCaps.
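The abstract specifies neither Capsule ReLU's exact form nor the orthogonality criterion in the routing update, so the sketch below shows only the standard capsule baseline that such a design modifies: squash-style vector normalization and routing-by-agreement between capsule layers (Sabour et al.'s dynamic routing). The paper replaces the former with Capsule ReLU and augments the latter with an orthogonality constraint; function names and array shapes here are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def squash(s, eps=1e-8):
    """Standard capsule squashing: shrinks short vectors toward zero and
    long vectors toward unit length, while keeping each vector's
    direction -- the part that carries pose/position information."""
    norm2 = np.sum(s * s, axis=-1, keepdims=True)
    return (norm2 / (1.0 + norm2)) * s / np.sqrt(norm2 + eps)

def dynamic_routing(votes, n_iter=3):
    """Baseline routing-by-agreement between two capsule layers.

    votes: array of shape (n_in, n_out, dim) -- the vote each input
           capsule casts for each output capsule (already transformed
           by learned matrices).
    Returns output capsules of shape (n_out, dim).
    """
    n_in, n_out, _ = votes.shape
    logits = np.zeros((n_in, n_out))            # routing logits b_ij
    for _ in range(n_iter):
        # Coupling coefficients: each input capsule distributes its
        # vote across output capsules via a softmax over outputs.
        c = np.exp(logits)
        c = c / c.sum(axis=1, keepdims=True)
        s = np.einsum('io,iod->od', c, votes)   # weighted vote sum
        v = squash(s)                           # (n_out, dim)
        # Agreement update: votes aligned with the current output
        # reinforce that output's coupling on the next iteration.
        logits = logits + np.einsum('iod,od->io', votes, v)
    return v

# Two input capsules agree on output 0 and conflict on output 1,
# so routing concentrates activation on output 0.
votes = np.array([[[1.0, 0.0], [1.0, 0.0]],
                  [[1.0, 0.0], [-1.0, 0.0]]])
v = dynamic_routing(votes)
```

Because the agreement term rewards votes aligned with the current output, inputs that agree concentrate coupling on the same higher-level capsule, which is the voting mechanism the abstract credits with preserving positional information; the orthogonality-based variant alters this update, but its precise form is not given in the abstract.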
About the journal:
The focus of IEEE Transactions on Neural Networks and Learning Systems is to present scholarly articles discussing the theory, design, and applications of neural networks as well as other learning systems. The journal primarily highlights technical and scientific research in this domain.