{"title":"Enhancing the Learning of Interval Type-2 Fuzzy Classifiers with Knowledge Distillation","authors":"Dorukhan Erdem, T. Kumbasar","doi":"10.1109/FUZZ45933.2021.9494471","DOIUrl":null,"url":null,"abstract":"Fuzzy Logic Systems (FLSs), especially Interval Type-2 (IT2) ones, are proven to achieve good results in various tasks, including classification problems. However, IT2-FLSs suffer from the curse of dimensionality problem, just like its Type-1 (T1) counterparts, and also training complexity since IT2-FLS have a large number of learnable parameters when compared to T1-FLSs. Deep learning (DL) architectures on the other hand can handle large learnable parameter sets for good generalizability but have their disadvantages. In this study, we present DL based approach with knowledge distillation for IT2-FLSs which transfers the generalizability features of deep models into IT2-FLS and increases its learning performance significantly by eliminating the problems that may arise from large input sizes and high rule counts. We present in detail the proposed approach with parameterization tricks so that the training of IT2-FLS can be accomplished straightforwardly within the widely employed DL frameworks without violating the definitions of IT2-FSs. 
We present comparative analysis to show the benefits of the inclusion knowledge distillation in the learning of IT2-FLSs with respect to rule number and input dimension size.","PeriodicalId":151289,"journal":{"name":"2021 IEEE International Conference on Fuzzy Systems (FUZZ-IEEE)","volume":"103 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-07-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"4","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 IEEE International Conference on Fuzzy Systems (FUZZ-IEEE)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/FUZZ45933.2021.9494471","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 4
Abstract
Fuzzy Logic Systems (FLSs), especially Interval Type-2 (IT2) ones, have been proven to achieve good results in various tasks, including classification problems. However, IT2-FLSs suffer from the curse of dimensionality, just like their Type-1 (T1) counterparts, and also from training complexity, since IT2-FLSs have a larger number of learnable parameters than T1-FLSs. Deep learning (DL) architectures, on the other hand, can handle large learnable parameter sets with good generalizability, but have their own disadvantages. In this study, we present a DL-based approach with knowledge distillation for IT2-FLSs that transfers the generalization capability of deep models into the IT2-FLS and significantly increases its learning performance by eliminating the problems that may arise from large input sizes and high rule counts. We present the proposed approach in detail, together with parameterization tricks, so that the training of IT2-FLSs can be accomplished straightforwardly within widely employed DL frameworks without violating the definitions of IT2-FSs. We present a comparative analysis to show the benefits of including knowledge distillation in the learning of IT2-FLSs with respect to rule number and input dimension size.
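The abstract does not give the exact distillation objective used for the IT2-FLS student, but the standard knowledge-distillation loss (soft targets from a teacher at a temperature T, combined with the hard-label cross-entropy) can be sketched as follows. This is a generic illustration of the technique, not the paper's implementation; the temperature, weighting `alpha`, and NumPy formulation are assumptions for the sketch:

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-scaled softmax over the last axis."""
    z = np.asarray(z, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Generic KD loss: alpha * soft-target KL term + (1 - alpha) * hard-label CE.

    student_logits: logits of the (fuzzy) student classifier, shape (N, C)
    teacher_logits: logits of the deep teacher model, shape (N, C)
    labels: integer class labels, shape (N,)
    """
    # Soft-target term: KL(teacher || student) at temperature T
    p_t = softmax(teacher_logits, T)
    p_s = softmax(student_logits, T)
    kl = np.sum(p_t * (np.log(p_t + 1e-12) - np.log(p_s + 1e-12)), axis=-1)
    # Hard-target term: cross-entropy with the true labels (T = 1)
    p = softmax(student_logits)
    ce = -np.log(p[np.arange(len(labels)), labels] + 1e-12)
    # T**2 rescales the soft-target gradients, as is conventional in KD
    return float(np.mean(alpha * (T ** 2) * kl + (1 - alpha) * ce))
```

In this setup, the deep teacher's softened class probabilities carry its generalization behavior, and minimizing the combined loss pulls the student's decision surface toward the teacher's while still fitting the ground-truth labels.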