A Transformer based Multi-task Model for Domain Classification, Intent Detection and Slot-Filling

Tulika Saha, N. Priya, S. Saha, P. Bhattacharyya
{"title":"A Transformer based Multi-task Model for Domain Classification, Intent Detection and Slot-Filling","authors":"Tulika Saha, N. Priya, S. Saha, P. Bhattacharyya","doi":"10.1109/IJCNN52387.2021.9533525","DOIUrl":null,"url":null,"abstract":"With the ever increasing complexity of the user queries in a multi-domain based task-oriented dialogue system, it is imperative to facilitate robust Spoken Language Understanding (SLU) modules that perform multiple tasks in an unified way. In this paper, we present a novel multi-task approach for the joint modelling of three tasks together, namely, Domain Classification, Intent Detection and Slot-Filling. We hypothesize with the intuition that the cross dependencies of all these three tasks mutually help each other towards their representations and classifications which further simplify the SLU module in a multi-domain scenario. Towards this end, we propose a BERT language model based multi-task framework utilizing capsule networks and conditional random fields for addressing the classification and sequence labeling problems, respectively, for different tasks. Experimental results indicate that the proposed multi-task model outperformed several strong baselines and its single task counterparts on three benchmark datasets of different domains and attained state-of-the-art results on different tasks.","PeriodicalId":396583,"journal":{"name":"2021 International Joint Conference on Neural Networks (IJCNN)","volume":"7 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-07-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 International Joint Conference on Neural Networks (IJCNN)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/IJCNN52387.2021.9533525","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 2

Abstract

With the ever-increasing complexity of user queries in multi-domain, task-oriented dialogue systems, it is imperative to build robust Spoken Language Understanding (SLU) modules that perform multiple tasks in a unified way. In this paper, we present a novel multi-task approach for the joint modelling of three tasks, namely Domain Classification, Intent Detection and Slot-Filling. We hypothesize that the cross-dependencies among these three tasks mutually benefit each task's representation and classification, which further simplifies the SLU module in a multi-domain scenario. Towards this end, we propose a BERT language-model-based multi-task framework that utilizes capsule networks and conditional random fields to address the classification and sequence-labelling problems, respectively, for the different tasks. Experimental results indicate that the proposed multi-task model outperforms several strong baselines and its single-task counterparts on three benchmark datasets from different domains, attaining state-of-the-art results across tasks.
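
Below is a minimal sketch of the joint architecture the abstract describes, assuming PyTorch and the Hugging Face `transformers` library: a shared BERT encoder feeding two utterance-level heads (domain, intent) and one token-level head (slots). It is not the authors' code; the capsule-network classification heads and the CRF slot-tagging layer from the paper are replaced by plain linear layers for brevity, and all names and class counts (`JointSLUModel`, `num_domains`, etc.) are illustrative placeholders.

```python
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizerFast

class JointSLUModel(nn.Module):
    """Shared BERT encoder with three task-specific heads (simplified sketch)."""

    def __init__(self, num_domains, num_intents, num_slot_tags,
                 pretrained="bert-base-uncased"):
        super().__init__()
        self.bert = BertModel.from_pretrained(pretrained)
        hidden = self.bert.config.hidden_size
        # Utterance-level heads (the paper uses capsule networks here).
        self.domain_head = nn.Linear(hidden, num_domains)
        self.intent_head = nn.Linear(hidden, num_intents)
        # Token-level head; the paper applies a CRF on top of these emissions.
        self.slot_head = nn.Linear(hidden, num_slot_tags)

    def forward(self, input_ids, attention_mask):
        out = self.bert(input_ids=input_ids, attention_mask=attention_mask)
        pooled = out.last_hidden_state[:, 0]                  # [CLS] representation
        domain_logits = self.domain_head(pooled)              # (batch, num_domains)
        intent_logits = self.intent_head(pooled)              # (batch, num_intents)
        slot_logits = self.slot_head(out.last_hidden_state)   # (batch, seq_len, num_slot_tags)
        return domain_logits, intent_logits, slot_logits

if __name__ == "__main__":
    # Toy forward pass; the label-set sizes are made up for illustration.
    tok = BertTokenizerFast.from_pretrained("bert-base-uncased")
    enc = tok("book a flight to boston", return_tensors="pt")
    model = JointSLUModel(num_domains=7, num_intents=64, num_slot_tags=120)
    d, i, s = model(enc["input_ids"], enc["attention_mask"])
    print(d.shape, i.shape, s.shape)
```

Joint training would then sum one loss per head, e.g. cross-entropy for the domain and intent logits and a token-level loss for the slot emissions (the paper uses the CRF negative log-likelihood for the latter).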