{"title":"Amplitude Suppression and Direction Activation in Networks for 1-bit Faster R-CNN","authors":"Sheng Xu, Zhendong Liu, Xuan Gong, Chunlei Liu, Mingyuan Mao, Baochang Zhang","doi":"10.1145/3410338.3412340","DOIUrl":null,"url":null,"abstract":"Recent advances in object detection have been driven by the success of deep convolutional neural networks (DCNNs). Deploying a DCNN detector on resource-limited hardware such as embedded devices and smart phones, however, remains challenging due to the massive number of parameters a typical model contains. In this paper, we propose an amplitude suppression and direction activation for Faster R-CNN (ASDA-FRCNN) framework to significantly compress DCNNs for highly efficient performance. The shared amplitude between the full-precision and the binary kernels can be significantly suppressed through a simple but effective loss, which is then incorporated into the existing Faster R-CNN detector. Furthermore, the ASDA module is generic and flexible to be incorporated into existing DCNNs for different tasks. Experiments demonstrate the superiority of 1-bit ASDA-FRCNN which achieves superior performance on various datasets. 
Specifically, ASDA-FRCNN shows the best speed-accuracy trade off with 63.4% at estimated 711 FPS and 19.4% mAP at and estimated 362 FPS with ResNet-18 on the PASCAL VOC 2007 and MS COCO validation datasets respectively, which demonstrate the superior performance and strong generalization of our method.","PeriodicalId":401260,"journal":{"name":"Proceedings of the 4th International Workshop on Embedded and Mobile Deep Learning","volume":"43 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2020-09-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"8","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 4th International Workshop on Embedded and Mobile Deep Learning","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3410338.3412340","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 8
Abstract
Recent advances in object detection have been driven by the success of deep convolutional neural networks (DCNNs). Deploying a DCNN detector on resource-limited hardware such as embedded devices and smartphones, however, remains challenging due to the massive number of parameters a typical model contains. In this paper, we propose an amplitude suppression and direction activation framework for Faster R-CNN (ASDA-FRCNN) that significantly compresses DCNNs for highly efficient performance. The amplitude shared between the full-precision and binary kernels is suppressed through a simple but effective loss, which is then incorporated into the existing Faster R-CNN detector. Furthermore, the ASDA module is generic and flexible enough to be incorporated into existing DCNNs for different tasks. Experiments demonstrate the superiority of the 1-bit ASDA-FRCNN, which achieves strong performance on various datasets. Specifically, ASDA-FRCNN shows the best speed-accuracy trade-off, with 63.4% mAP at an estimated 711 FPS on PASCAL VOC 2007 and 19.4% mAP at an estimated 362 FPS on the MS COCO validation set, both with a ResNet-18 backbone, demonstrating the superior performance and strong generalization of our method.
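The abstract's notion of a binary kernel with a shared amplitude follows the common 1-bit convention of approximating a full-precision kernel w by alpha * sign(w), where alpha is a scalar amplitude shared across the kernel's weights and sign(w) carries only direction. The sketch below illustrates this decomposition and a toy amplitude-penalty loss; the exact form of the paper's suppression loss is not given in the abstract, so `amplitude_suppression_loss` and its `lam` weight are illustrative assumptions, not the authors' formulation.

```python
import numpy as np

def binarize_kernel(w):
    """Split a full-precision kernel into a shared amplitude and a 1-bit
    direction kernel (XNOR-Net-style convention): w ~= alpha * sign(w)."""
    alpha = np.abs(w).mean()  # scalar amplitude shared by all weights
    b = np.sign(w)            # 1-bit direction kernel in {-1, +1}
    return alpha, b

def amplitude_suppression_loss(w, lam=0.01):
    """Hypothetical loss: keep the binary approximation close to w while
    penalizing the shared amplitude (illustrative, not the paper's loss)."""
    alpha, b = binarize_kernel(w)
    recon = np.mean((w - alpha * b) ** 2)  # binarization error
    return recon + lam * alpha ** 2        # plus amplitude penalty

w = np.array([0.5, -0.3, 0.8, -0.6])
alpha, b = binarize_kernel(w)
print(alpha)  # 0.55
print(b)      # [ 1. -1.  1. -1.]
```

At inference time, a detector using such kernels stores only the 1-bit direction tensors plus one amplitude scalar per kernel, which is where the large compression and the reported FPS gains come from.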