Title: The Role of Activation Function in Neural NER for a Large Semantically Annotated Corpus
Authors: Muhammad Saad Amin, Luca Anselma, A. Mazzei
DOI: 10.1109/ETECTE55893.2022.10007317
Published: 2022-12-02, in 2022 International Conference on Emerging Trends in Electrical, Control, and Telecommunication Engineering (ETECTE)
Citations: 0
Abstract
Information extraction is one of the core tasks of natural language processing. Various recurrent neural network-based models have been applied to text classification tasks such as named entity recognition (NER). Among the factors that influence the performance of recurrent networks, the choice of activation function plays a vital role. Yet, no study has systematically analyzed the effect of the activation function on NER-based classification of textual data. In this paper, we implement a Bi-LSTM-based CRF model for named entity recognition on a semantically annotated corpus, the Groningen Meaning Bank (GMB), and analyze the impact of all non-linear activation functions on the performance of the neural network. Our analysis shows that only the Sigmoid, Exponential, SoftPlus, and SoftMax activation functions performed efficiently on the NER task, achieving average accuracies of 95.17%, 95.14%, 94.38%, and 94.76%, respectively.
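As a point of reference for the comparison described above, here is a minimal sketch of the four activation functions the abstract singles out (Sigmoid, Exponential, SoftPlus, and SoftMax); these are standard textbook definitions, not code from the paper itself.

```python
import math

def sigmoid(x):
    # Logistic sigmoid: squashes any real input into (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

def exponential(x):
    # Exponential activation: simply e^x
    return math.exp(x)

def softplus(x):
    # Smooth approximation of ReLU: log(1 + e^x)
    return math.log1p(math.exp(x))

def softmax(xs):
    # Normalizes a vector of scores into a probability distribution;
    # subtracting the max first keeps the exponentials numerically stable
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]
```

In a Bi-LSTM-CRF tagger these would typically appear as the activation of the dense layer that maps LSTM states to per-tag scores; the paper's experiment amounts to swapping that activation and measuring NER accuracy.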