W. Suwarningsih, Raka Aditya Pratama, Fadhil Yusuf Rahadika, Mochamad Havid Albar Purnomo
2021 4th International Conference of Computer and Informatics Engineering (IC2IE), published 14 September 2021. DOI: 10.1109/ic2ie53219.2021.9649363
Self-Attention Mechanism of RoBERTa to Improve QAS for e-health Education
This paper discusses the establishment of a health education system in the form of a Question Answering System (QAS) for the current COVID-19 pandemic. A QAS allows users to state their information needs as natural language questions, and the system returns short text quotes or sentence phrases as answers. This design reflects the tendency of information recipients to understand news more easily when the questions that arise in their minds are answered. The approach used a self-attention mechanism, the Robustly Optimized BERT Pretraining Approach (RoBERTa), for question answering with span-based training that predicts the start and end indices of the answer span. On 835 non-description questions, the best evaluation on the training data showed an exact match of 91.7% and an F1 score of 93.3%. RoBERTa tends to show better results on non-description questions, i.e., questions with short answers, than on description questions with complex answers.
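The span-based training described in the abstract can be illustrated with a minimal sketch: an extractive QA head outputs a start logit and an end logit per token, and the predicted answer is the span maximizing the sum of the two, subject to the end not preceding the start. The function below is a generic illustration of that decoding step, not the authors' implementation; the tokens and logit values are invented for the example, not model output.

```python
# Minimal sketch of span-based answer extraction as used by RoBERTa-style
# extractive QA heads: pick the span (i, j), j >= i, that maximizes
# start_logits[i] + end_logits[j]. Illustrative values, not real model output.

def best_span(start_logits, end_logits, max_len=30):
    """Return (start_index, end_index) of the highest-scoring answer span."""
    best = (0, 0)
    best_score = float("-inf")
    for i, s in enumerate(start_logits):
        # Only consider spans that end at or after the start, capped in length.
        for j in range(i, min(i + max_len, len(end_logits))):
            score = s + end_logits[j]
            if score > best_score:
                best_score = score
                best = (i, j)
    return best

# Hypothetical tokenized context and per-token logits.
tokens = ["COVID-19", "is", "caused", "by", "SARS-CoV-2", "."]
start_logits = [0.1, -1.0, -0.5, -0.2, 2.5, -2.0]
end_logits = [-0.3, -1.2, -0.8, -0.4, 3.0, -1.5]

i, j = best_span(start_logits, end_logits)
print(" ".join(tokens[i : j + 1]))  # -> SARS-CoV-2
```

In practice the search is vectorized and the candidate spans are also filtered to exclude special tokens and the question segment, but the scoring rule is the same.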