{"title":"提高胶囊神经网络的训练时间","authors":"Onyeachonam Dominic-Mario Chiadika, Moazhen Li, Jaewoong Choi","doi":"10.1109/HORA52670.2021.9461340","DOIUrl":null,"url":null,"abstract":"Chiadika Electrical & Computer Engr Mathematics department Electrical & Computer Engr Brunel University, Seoul National University Brunel University, London, United Kingded the Attention Routing CapsuleNet (AR CapsNet) as proposed by Jaewoong Choi et al. in Attention Routing between Capsules. The AR-CapsNet is an enhanced version of CapsNet which, uses a new and different routing and activation function. The unique routing style is Attention routing which, is simply capsules been routed, with the help an attention module and a fast-forward pass but, what is most important is that the spatial information is kept, which is the primary reason behind Capsules. Primarily, the in-built interpretation of the dynamic routing is finding a focal point of the prediction capsules. As well known, emphasis on preserving a vector orientation is what activation functions and its variant deal majorly on; the activation function used is capsule activation because it focuses is on how a capsule-scale activation function performs. The model was trained on the MNIST and CIFAR-10 datasets and classification tasked against AR CapsNet and CapsuleNet. The model showed almost accuracy with less training parameters and less training time.","PeriodicalId":270469,"journal":{"name":"2021 3rd International Congress on Human-Computer Interaction, Optimization and Robotic Applications (HORA)","volume":"2 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-06-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"Improving Training time in Capsule Neural Network\",\"authors\":\"Onyeachonam Dominic-Mario Chiadika, Moazhen Li, Jaewoong Choi\",\"doi\":\"10.1109/HORA52670.2021.9461340\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Chiadika Electrical & Computer Engr Mathematics department Electrical & Computer Engr Brunel University, Seoul National University Brunel University, London, United Kingded the Attention Routing CapsuleNet (AR CapsNet) as proposed by Jaewoong Choi et al. in Attention Routing between Capsules. The AR-CapsNet is an enhanced version of CapsNet which, uses a new and different routing and activation function. The unique routing style is Attention routing which, is simply capsules been routed, with the help an attention module and a fast-forward pass but, what is most important is that the spatial information is kept, which is the primary reason behind Capsules. Primarily, the in-built interpretation of the dynamic routing is finding a focal point of the prediction capsules. As well known, emphasis on preserving a vector orientation is what activation functions and its variant deal majorly on; the activation function used is capsule activation because it focuses is on how a capsule-scale activation function performs. The model was trained on the MNIST and CIFAR-10 datasets and classification tasked against AR CapsNet and CapsuleNet. 
The model showed almost accuracy with less training parameters and less training time.\",\"PeriodicalId\":270469,\"journal\":{\"name\":\"2021 3rd International Congress on Human-Computer Interaction, Optimization and Robotic Applications (HORA)\",\"volume\":\"2 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2021-06-11\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2021 3rd International Congress on Human-Computer Interaction, Optimization and Robotic Applications (HORA)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/HORA52670.2021.9461340\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 3rd International Congress on Human-Computer Interaction, Optimization and Robotic Applications (HORA)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/HORA52670.2021.9461340","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 1
Abstract
This work builds on the Attention Routing CapsuleNet (AR-CapsNet) proposed by Jaewoong Choi et al. in "Attention Routing Between Capsules". AR-CapsNet is an enhanced version of CapsNet that uses a new routing mechanism and a new activation function. The routing style is attention routing, in which capsules are routed with the help of an attention module in a fast feed-forward pass; most importantly, the spatial information is preserved, which is the primary motivation behind capsules. The built-in interpretation of dynamic routing is that it finds a focal point among the prediction capsules. Activation functions and their variants mainly emphasise preserving a vector's orientation; capsule activation is used here because the focus is on how a capsule-scale activation function performs. The model was trained on the MNIST and CIFAR-10 datasets and compared on classification tasks against AR-CapsNet and CapsuleNet. It achieved comparable accuracy with fewer training parameters and less training time.
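
To make the routing idea concrete, below is a minimal NumPy sketch of attention-style routing between two capsule layers, written from the description above rather than from the authors' code. The single feed-forward pass that scores each lower capsule's vote against the mean vote and normalises the scores with a softmax is an assumption made for illustration, as are the layer sizes; the capsule-scale activation function is not shown.

# Minimal NumPy sketch of attention-style routing between capsule layers.
# Illustrative approximation, not the paper's reference implementation:
# the mean-vote agreement score and the layer sizes are assumptions.
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_routing(lower_caps, W):
    """Route lower-level capsules to higher-level capsules in one pass.

    lower_caps: (num_lower, in_dim)                       poses of lower capsules
    W:          (num_lower, num_higher, in_dim, out_dim)  transformation matrices
    returns:    (num_higher, out_dim)                     poses of higher capsules
    """
    # Prediction ("vote") of each lower capsule for each higher capsule.
    votes = np.einsum('li,lhio->lho', lower_caps, W)       # (L, H, out_dim)

    # Attention logits: agreement of each vote with the mean vote per higher
    # capsule, computed in a single feed-forward pass (no iterative updates).
    mean_vote = votes.mean(axis=0, keepdims=True)          # (1, H, out_dim)
    logits = (votes * mean_vote).sum(axis=-1)              # (L, H)

    # Softmax over higher capsules gives the routing weights.
    attn = softmax(logits, axis=1)                         # (L, H)

    # Weighted sum of votes forms the higher-level capsule poses.
    return np.einsum('lh,lho->ho', attn, votes)            # (H, out_dim)

# Tiny usage example with random data.
rng = np.random.default_rng(0)
lower = rng.normal(size=(32, 8))            # 32 lower capsules with 8-D poses
W = rng.normal(size=(32, 10, 8, 16)) * 0.1  # votes for 10 higher 16-D capsules
print(attention_routing(lower, W).shape)    # (10, 16)

Because the routing weights come from one feed-forward pass rather than several iterative agreement updates, this style of routing trades some of dynamic routing's refinement for lower computational cost, which is consistent with the reduced training time reported in the abstract.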