K-CSRL: Knowledge Enhanced Conversational Semantic Role Labeling

Boyu He, Han Wu, Congduan Li, Linqi Song, Weigang Chen
{"title":"K-CSRL:知识增强会话语义角色标注","authors":"Boyu He, Han Wu, Congduan Li, Linqi Song, Weigang Chen","doi":"10.1145/3457682.3457763","DOIUrl":null,"url":null,"abstract":"Semantic role labeling (SRL) is widely used to extract predicate-argument pairs from sentences. Traditional SRL methods can perform well on the single sentence but fail to work in dialogue scenario where ellipsis and anaphora frequently occurs. Some research work has been proposed to solve this problem, i.e. Conversational Semantic Role Labeling (CSRL), but there are still huge room for improvements. The error case study of BERT-based CSRL model has shown that the majority of the errors are observed in boundary matching, especially in entity mention detection. We think the premier cause of this kind of error is the deficiency of external knowledge such that the ill-informed model cannot correctly capture and correlate the entities. To this end, we propose to incorporate external knowledge into BERT using visible masking strategy. We evaluate our proposed model on DuConv dataset. Experimental results show that our model with knowledge enhancement outperforms the benchmarks. Further analysis also demonstrates that dialogue SRL can benefit from external knowledge.","PeriodicalId":142045,"journal":{"name":"2021 13th International Conference on Machine Learning and Computing","volume":"78 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-02-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"3","resultStr":"{\"title\":\"K-CSRL: Knowledge Enhanced Conversational Semantic Role Labeling\",\"authors\":\"Boyu He, Han Wu, Congduan Li, Linqi Song, Weigang Chen\",\"doi\":\"10.1145/3457682.3457763\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Semantic role labeling (SRL) is widely used to extract predicate-argument pairs from sentences. Traditional SRL methods can perform well on the single sentence but fail to work in dialogue scenario where ellipsis and anaphora frequently occurs. Some research work has been proposed to solve this problem, i.e. Conversational Semantic Role Labeling (CSRL), but there are still huge room for improvements. The error case study of BERT-based CSRL model has shown that the majority of the errors are observed in boundary matching, especially in entity mention detection. We think the premier cause of this kind of error is the deficiency of external knowledge such that the ill-informed model cannot correctly capture and correlate the entities. To this end, we propose to incorporate external knowledge into BERT using visible masking strategy. We evaluate our proposed model on DuConv dataset. Experimental results show that our model with knowledge enhancement outperforms the benchmarks. 
Further analysis also demonstrates that dialogue SRL can benefit from external knowledge.\",\"PeriodicalId\":142045,\"journal\":{\"name\":\"2021 13th International Conference on Machine Learning and Computing\",\"volume\":\"78 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2021-02-26\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"3\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2021 13th International Conference on Machine Learning and Computing\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/3457682.3457763\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 13th International Conference on Machine Learning and Computing","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3457682.3457763","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 3

Abstract

Semantic role labeling (SRL) is widely used to extract predicate-argument pairs from sentences. Traditional SRL methods perform well on single sentences but fail in dialogue scenarios, where ellipsis and anaphora occur frequently. Some research, namely Conversational Semantic Role Labeling (CSRL), has been proposed to address this problem, but there is still considerable room for improvement. An error case study of a BERT-based CSRL model shows that the majority of errors occur in boundary matching, especially in entity mention detection. We believe the primary cause of this kind of error is a lack of external knowledge, such that the ill-informed model cannot correctly capture and correlate entities. To this end, we propose to incorporate external knowledge into BERT using a visible masking strategy. We evaluate the proposed model on the DuConv dataset. Experimental results show that our knowledge-enhanced model outperforms the benchmarks. Further analysis also demonstrates that dialogue SRL can benefit from external knowledge.
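The abstract does not spell out how the visible masking strategy is implemented. A minimal sketch of one common interpretation, a K-BERT-style visibility matrix that restricts self-attention so injected knowledge tokens only influence the entity mentions they are attached to, is given below; the function name, shapes, and the toy masking pattern are illustrative assumptions, not the authors' code.

```python
# Sketch only: visibility-masked self-attention, assuming knowledge tokens are
# appended to the input and should be visible only to their entity mention.
import torch
import torch.nn.functional as F


def visible_masked_attention(query, key, value, visible_matrix):
    """Scaled dot-product attention restricted by a visibility matrix.

    query, key, value: (batch, seq_len, d_model)
    visible_matrix:    (batch, seq_len, seq_len); 1 where token i may attend
                       to token j, 0 where it may not.
    """
    d_model = query.size(-1)
    scores = torch.matmul(query, key.transpose(-2, -1)) / d_model ** 0.5
    # Invisible positions get -inf so softmax assigns them zero weight.
    scores = scores.masked_fill(visible_matrix == 0, float("-inf"))
    weights = F.softmax(scores, dim=-1)
    return torch.matmul(weights, value)


if __name__ == "__main__":
    # Toy example: 6 tokens; the last two are knowledge tokens attached to token 2.
    batch, seq_len, d_model = 1, 6, 8
    q = k = v = torch.randn(batch, seq_len, d_model)
    visible = torch.ones(batch, seq_len, seq_len)
    visible[:, :4, 4:] = 0             # ordinary tokens cannot see knowledge tokens
    visible[:, 2, 4:] = 1              # ...except the entity mention they describe
    visible[:, 4:, :] = 0
    visible[:, 4:, 2] = 1              # knowledge tokens see their entity
    visible[:, 4:, 4:] = torch.eye(2)  # and themselves
    out = visible_masked_attention(q, k, v, visible)
    print(out.shape)  # torch.Size([1, 6, 8])
```

Under this reading, knowledge triples enrich the representation of the entity mention (helping boundary matching and mention detection) without letting unrelated dialogue tokens attend to the injected knowledge.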