{"title":"Knowledge Distillation based Lightweight Adaptive Graph Convolutional Network for Skeleton-based action recognition","authors":"Zhongwei Qiu, Hongbo Zhang, Qing Lei, Jixiang Du","doi":"10.1109/ITME53901.2021.00045","DOIUrl":null,"url":null,"abstract":"Skeleton-based human action recognition has received extensive attention due to its easy access to human skeleton data. However, the current mainstream skeleton-based action recognition methods have more or less the problem of overlarge parameters, which makes it difficult for these methods to meet the requirements of timeliness and accuracy. To solve this problem, we improve attention-enhanced adaptive graph convolutional neural network (AAGCN) to obtain a high-precision improved AAGCN (IAAGCN), and use it as teacher model to conduct knowledge distillation of our lightweight IAAGCN (LIAAGCN). The results of the tests on the NTU-RGBD dataset are validated by knowledge distillation to allow LIAAGCN to maintain good accuracy while keeping the parameters small.","PeriodicalId":6774,"journal":{"name":"2021 11th International Conference on Information Technology in Medicine and Education (ITME)","volume":"16 1","pages":"180-184"},"PeriodicalIF":0.0000,"publicationDate":"2021-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 11th International Conference on Information Technology in Medicine and Education (ITME)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ITME53901.2021.00045","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
Skeleton-based human action recognition has received extensive attention because human skeleton data are easy to acquire. However, current mainstream skeleton-based action recognition methods suffer, to varying degrees, from excessively large parameter counts, which makes it difficult for them to satisfy both speed and accuracy requirements. To address this problem, we improve the attention-enhanced adaptive graph convolutional network (AAGCN) to obtain a high-accuracy improved AAGCN (IAAGCN), and use it as the teacher model for knowledge distillation into our lightweight IAAGCN (LIAAGCN). Experiments on the NTU-RGBD dataset show that knowledge distillation allows LIAAGCN to maintain good accuracy while keeping its parameter count small.
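The abstract does not spell out the distillation objective. Below is a minimal sketch of the standard response-based knowledge-distillation loss (Hinton et al., 2015) that a teacher-student setup such as IAAGCN → LIAAGCN would typically use; the module names `teacher` and `student`, the temperature `T`, and the weight `alpha` are illustrative assumptions, not taken from the paper.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      T=4.0, alpha=0.7):
    """Weighted sum of soft (teacher) and hard (ground-truth) targets.

    T and alpha are assumed hyperparameters, not values from the paper.
    """
    # Soft targets: KL divergence between temperature-softened
    # class distributions of student and (frozen) teacher.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)  # rescale so gradients match the hard-label term
    # Hard targets: ordinary cross-entropy on ground-truth action labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

# Hypothetical usage with a batch of skeleton clips x and labels y
# (NTU-RGBD has 60 action classes):
# with torch.no_grad():
#     t_logits = teacher(x)   # frozen high-accuracy IAAGCN teacher
# s_logits = student(x)       # lightweight LIAAGCN student
# loss = distillation_loss(s_logits, t_logits, y)
```

The temperature softens both distributions so the student also learns the teacher's relative ranking of incorrect classes, which is what lets a small model recover most of the large model's accuracy.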