Title: A Biomedical Question Answering System for the Malayalam Language Using Word Embedding and Bidirectional Encoder Representations from Transformers
Journal: International Journal of Security and Privacy in Pervasive Computing, Vol. 5, No. 1
DOI: 10.4018/ijsppc.302009
Publication date: 2022-01-01
Publication type: Journal Article
Citations: 0
Abstract
Conversational search is the dominant intent of question answering, which is achieved through various NLP techniques and deep-learning models. The advent of Transformer models has been a breakthrough in Natural Language Processing, attaining state-of-the-art results on tasks such as question answering. Here we propose a semantic Malayalam question answering system that automatically answers queries related to health issues. Biomedical question answering, especially in the Malayalam language, is a tedious and challenging task. The proposed model uses the neural-network-based Bidirectional Encoder Representations from Transformers (BERT) architecture to implement the question answering system. In this study, we investigate how to train and fine-tune a BERT model for question answering. The system has been tested with our own annotated Malayalam SQuAD-format health dataset. Comparing the results with our previous work on word-embedding- and RNN-based models, we find that the BERT model is more accurate than the previous models and achieves an F1 score of 86%.
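The abstract reports performance as an F1 score over a SQuAD-format dataset. In SQuAD-style extractive QA evaluation, F1 is typically computed as the token-level overlap between the predicted answer span and the gold answer span. A minimal self-contained sketch of that metric (the function name and example strings are illustrative, not from the paper):

```python
from collections import Counter

def token_f1(prediction: str, ground_truth: str) -> float:
    """SQuAD-style token-overlap F1 between a predicted and a gold answer span."""
    pred_tokens = prediction.split()
    gold_tokens = ground_truth.split()
    # Multiset intersection counts each shared token at most as often
    # as it appears in both strings.
    common = Counter(pred_tokens) & Counter(gold_tokens)
    num_same = sum(common.values())
    if num_same == 0:
        return 0.0
    precision = num_same / len(pred_tokens)
    recall = num_same / len(gold_tokens)
    return 2 * precision * recall / (precision + recall)

# Partial overlap: 3 shared tokens, precision 3/3, recall 3/4 -> F1 ≈ 0.86
print(round(token_f1("fever and headache", "severe fever and headache"), 2))
```

In full SQuAD evaluation this per-example score is averaged over the dataset, usually taking the maximum F1 across multiple gold answers per question.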