Self-Attention Mechanism of RoBERTa to Improve QAS for e-health Education

W. Suwarningsih, Raka Aditya Pratama, Fadhil Yusuf Rahadika, Mochamad Havid Albar Purnomo
{"title":"RoBERTa自关注机制改进电子健康教育QAS","authors":"W. Suwarningsih, Raka Aditya Pratama, Fadhil Yusuf Rahadika, Mochamad Havid Albar Purnomo","doi":"10.1109/ic2ie53219.2021.9649363","DOIUrl":null,"url":null,"abstract":"This paper aims to discuss about the establishment of a health education system in the form of a Question Answer System (QAS) related to the current COVID-19 pandemic. QAS allows users to state information needs in the form of natural language questions, and then this system will return short text quotes or sentence phrases as answers. This is due to the tendency for recipients of information to more easily understand news/information when they can answer questions that may arise in their minds. The approach used was self-attention mechanism such as a Robustly Optimized BERT Pretraining Approach (RoBERTa), a method for question answering with span-based training that predicting the starting limit for answer start and the end limit for the answer index. The final results using 835 non-description questions, the best evaluation value on the training data showed the exact match of 91.7% and F1 value of 93.3%. RoBERTa tends to show the better results on non- description questions or questions with short answers compared to the description questions with complex answers.","PeriodicalId":178443,"journal":{"name":"2021 4th International Conference of Computer and Informatics Engineering (IC2IE)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-09-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"Self-Attention Mechanism of RoBERTa to Improve QAS for e-health Education\",\"authors\":\"W. 
Suwarningsih, Raka Aditya Pratama, Fadhil Yusuf Rahadika, Mochamad Havid Albar Purnomo\",\"doi\":\"10.1109/ic2ie53219.2021.9649363\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"This paper aims to discuss about the establishment of a health education system in the form of a Question Answer System (QAS) related to the current COVID-19 pandemic. QAS allows users to state information needs in the form of natural language questions, and then this system will return short text quotes or sentence phrases as answers. This is due to the tendency for recipients of information to more easily understand news/information when they can answer questions that may arise in their minds. The approach used was self-attention mechanism such as a Robustly Optimized BERT Pretraining Approach (RoBERTa), a method for question answering with span-based training that predicting the starting limit for answer start and the end limit for the answer index. The final results using 835 non-description questions, the best evaluation value on the training data showed the exact match of 91.7% and F1 value of 93.3%. 
RoBERTa tends to show the better results on non- description questions or questions with short answers compared to the description questions with complex answers.\",\"PeriodicalId\":178443,\"journal\":{\"name\":\"2021 4th International Conference of Computer and Informatics Engineering (IC2IE)\",\"volume\":\"1 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2021-09-14\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2021 4th International Conference of Computer and Informatics Engineering (IC2IE)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ic2ie53219.2021.9649363\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 4th International Conference of Computer and Informatics Engineering (IC2IE)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ic2ie53219.2021.9649363","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Cited by: 1

Abstract

This paper discusses the development of a health-education system in the form of a Question Answering System (QAS) for the current COVID-19 pandemic. A QAS lets users state their information needs as natural-language questions, and the system returns short text excerpts or sentence fragments as answers. This design reflects the tendency of readers to understand news and information more easily when the questions arising in their minds are answered directly. The approach uses a self-attention mechanism, specifically RoBERTa (a Robustly Optimized BERT Pretraining Approach), trained for span-based question answering: the model predicts the start index and the end index of the answer span within the context. On a dataset of 835 non-description questions, the best evaluation on the training data reached an exact match of 91.7% and an F1 score of 93.3%. RoBERTa tends to perform better on non-description questions, i.e., questions with short answers, than on description questions with complex answers.
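The span-based prediction described above can be illustrated with a minimal sketch (not the authors' code): an extractive QA head like RoBERTa's emits a start score and an end score for every context token, and the answer is the highest-scoring valid span (start before end, length bounded). The toy scores and tokens below are invented for illustration; in practice they would be model logits over a tokenized passage.

```python
# Illustrative sketch of span-based answer extraction, as in RoBERTa-style
# extractive QA: pick the valid span maximizing start_score + end_score.

def best_span(start_scores, end_scores, max_len=15):
    """Return (start, end, score) of the best answer span,
    requiring start <= end and span length <= max_len."""
    best = (0, 0, float("-inf"))
    for s, s_score in enumerate(start_scores):
        for e in range(s, min(s + max_len, len(end_scores))):
            score = s_score + end_scores[e]
            if score > best[2]:
                best = (s, e, score)
    return best

# Toy per-token scores over a 6-token context (hypothetical values,
# standing in for the model's start/end logits).
start = [0.1, 0.3, 0.2, 0.1, 2.8, 0.0]
end   = [0.0, 0.2, 0.1, 0.3, 3.0, 0.4]
tokens = ["COVID-19", "is", "caused", "by", "SARS-CoV-2", "."]

s, e, _ = best_span(start, end)
answer = " ".join(tokens[s:e + 1])  # → "SARS-CoV-2"
```

The exact-match and F1 figures reported in the abstract are the standard extractive-QA metrics: exact match scores 1 only when the predicted span string equals the gold answer, while F1 gives partial credit for token overlap, which is why F1 (93.3%) exceeds exact match (91.7%).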