{"title":"Extensive Huffman-tree-based Neural Network for the Imbalanced Dataset and Its Application in Accent Recognition","authors":"Jeremy Merrill, Yu Liang, Dalei Wu","doi":"10.1109/ICAIIC51459.2021.9415243","DOIUrl":null,"url":null,"abstract":"To classify the data-set featured with a large number of heavily imbalanced classes, this paper proposed an Extensive Huffman-Tree Neural Network (EHTNN), which fabricates multiple component neural network-enabled classifiers (e.g., CNN or SVM) using an extensive Huffman tree. Any given node in EHTNN can have arbitrary number of children. Compared with the Binary Huffman-Tree Neural Network (BHTNN), EHTNN may have smaller tree height, involve fewer neural networks, and demonstrate more flexibility on handling data imbalance. Using a 16-class exponentially imbalanced audio data-set as the benchmark, the proposed EHTNN was strictly assessed based on the comparisons with alternative methods such as BHTNN and single-layer CNN. The experimental results demonstrated promising results about EHTNN in terms of Gini index, Entropy value, and the accuracy derived from hierarchical multiclass confusion matrix.","PeriodicalId":432977,"journal":{"name":"2021 International Conference on Artificial Intelligence in Information and Communication (ICAIIC)","volume":"5 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-04-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 International Conference on Artificial Intelligence in Information and Communication (ICAIIC)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICAIIC51459.2021.9415243","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
To classify datasets with a large number of heavily imbalanced classes, this paper proposes an Extensive Huffman-Tree Neural Network (EHTNN), which assembles multiple component classifiers (e.g., CNNs or SVMs) organized by an extensive Huffman tree. Any given node in the EHTNN can have an arbitrary number of children. Compared with the Binary Huffman-Tree Neural Network (BHTNN), the EHTNN may have a smaller tree height, involve fewer neural networks, and offer more flexibility in handling data imbalance. Using a 16-class exponentially imbalanced audio dataset as the benchmark, the proposed EHTNN was rigorously assessed through comparisons with alternative methods such as the BHTNN and a single-layer CNN. The experimental results demonstrate the promise of the EHTNN in terms of Gini index, entropy value, and the accuracy derived from a hierarchical multiclass confusion matrix.
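The abstract does not include code, so the following is only a rough Python sketch of the core idea: an n-ary (extensive) Huffman tree built over class frequencies, where rare classes sit deeper in the tree and each internal node would host its own component classifier routing a sample to one of its children. The function and class names, the branching factor, and the example class counts are all hypothetical; the paper's exact construction and classifier wiring may differ.

```python
import heapq
import itertools
from dataclasses import dataclass, field

@dataclass
class Node:
    weight: int                                    # total sample count under this node
    classes: list = field(default_factory=list)    # leaf class labels covered by this subtree
    children: list = field(default_factory=list)   # empty for leaf nodes

def build_extensive_huffman_tree(class_counts, branching=4):
    """Greedy n-ary Huffman construction over class frequencies (illustrative only).

    Each internal node could be paired with a small component classifier
    (e.g., a CNN) that decides which of its `branching` children a sample
    belongs to; leaves correspond to the original classes.
    """
    counter = itertools.count()  # tie-breaker so equal-weight nodes never compare Node objects
    heap = [(cnt, next(counter), Node(cnt, classes=[label]))
            for label, cnt in class_counts.items()]
    heapq.heapify(heap)

    # Standard n-ary Huffman padding: add zero-weight dummies so every
    # merge (including the last) combines exactly `branching` nodes.
    while (len(heap) - 1) % (branching - 1) != 0:
        heapq.heappush(heap, (0, next(counter), Node(0)))

    while len(heap) > 1:
        merged = [heapq.heappop(heap) for _ in range(branching)]
        children = [node for _, _, node in merged]
        parent = Node(
            weight=sum(w for w, _, _ in merged),
            classes=[c for node in children for c in node.classes],
            children=children,
        )
        heapq.heappush(heap, (parent.weight, next(counter), parent))

    return heap[0][2]

# Hypothetical example: 16 classes with exponentially decaying sample counts,
# mirroring the exponentially imbalanced benchmark described in the abstract.
counts = {f"class_{i}": 2 ** (16 - i) for i in range(16)}
root = build_extensive_huffman_tree(counts, branching=4)
print(len(root.children), root.weight)
```

With a branching factor of 4, the 16 leaves collapse in four merges (16 → 13 → 10 → 7 → 4 → 1 heap entries), giving a shallower tree than a binary Huffman construction over the same classes, which is the height advantage the abstract attributes to the EHTNN over the BHTNN.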