Title: Lightweight Network for Vietnamese Landmark Recognition based on Knowledge Distillation
Authors: V. T. Tran, Nam Le, P. T. Nguyen, Thinh N. Doan
DOI: 10.1109/NICS51282.2020.9335881
Published in: 2020 7th NAFOSTED Conference on Information and Computer Science (NICS)
Publication date: 2020-11-26
Citations: 0
Abstract
In our modern world, smart devices such as mobile phones and IoT devices have become the norm, leading to a vast increase in demand for a smart ecosystem. Among the technologies being researched and applied, there is a trend of embedding Artificial Intelligence modules on these devices. One of the most challenging problems for deployment on smart devices is maintaining good accuracy while reducing computational cost and inference time. State-of-the-art Deep Convolutional Neural Networks cannot run on smart devices due to a lack of resources. The need for such a model motivates our proposal of a lightweight network for landmark recognition using knowledge distillation. Our purpose is not to create a network with higher accuracy; instead, we devise a fast and light neural network that keeps accuracy approximately similar to that of SOTA models by utilizing knowledge distillation. Our proposed student model achieves a decent result, with accuracy 7.33% lower than the teacher SOTA model (91.8%), while decreasing processing time by 73.04%. Our experimental results show promising potential for further exploration and research in knowledge distillation. We have also collected a dataset of Vietnamese landmarks for our experiments. This data can be used to train similar networks for Vietnamese landmark recognition or other related purposes.
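The abstract does not specify the exact distillation objective used to train the student; a common formulation is the Hinton-style knowledge-distillation loss, which mixes a temperature-softened KL-divergence term against the teacher's outputs with a standard cross-entropy term on the hard label. The sketch below illustrates that generic loss in NumPy; the function names, temperature, and mixing weight are hypothetical choices, not values taken from the paper.

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-scaled softmax; higher T softens the distribution."""
    z = np.asarray(z, dtype=float) / T
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, true_label,
                      T=4.0, alpha=0.7):
    """Generic KD objective (illustrative, not the paper's exact setup):
    alpha * T^2 * KL(teacher_soft || student_soft)
    + (1 - alpha) * cross-entropy with the hard label.
    The T^2 factor keeps the soft-target gradient magnitude comparable
    across temperatures."""
    p_t = softmax(teacher_logits, T)                      # soft teacher targets
    p_s = softmax(student_logits, T)                      # soft student predictions
    kl = float(np.sum(p_t * (np.log(p_t) - np.log(p_s))))  # KL divergence
    hard = float(-np.log(softmax(student_logits)[true_label]))  # cross-entropy
    return alpha * (T ** 2) * kl + (1 - alpha) * hard
```

When the student's logits match the teacher's, the KL term vanishes and only the hard-label term remains, so the loss grows as the student's distribution drifts from the teacher's.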