{"title":"SASGBC","authors":"Congrui Wang, Zhen Huang, Minghao Hu","doi":"10.1145/3379247.3379266","DOIUrl":null,"url":null,"abstract":"Spoken language understanding (SLU) involves two tasks, namely slot filling and intent prediction. While the goal of slot filling is to predict correlated slot sequences, intent prediction aims to capture the intent for the whole utterance. Recent work shows that jointly learning both tasks can provide additional benefits. In this paper, we follow this line of work and explore the pre-trained BERT model for SLU. We find that BERT suffers from problems in learning logical dependency relation for slot filling. To address this issue, we propose a new joint learning model called SASGBC that combines sequence labeling with deep bidirectional Transformer. Experiments on two datasets demonstrate that SASGBC achieves state-of-the-art performance compared to several competitive baseline models.","PeriodicalId":410860,"journal":{"name":"Proceedings of 2020 the 6th International Conference on Computing and Data Engineering","volume":"32 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2020-01-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of 2020 the 6th International Conference on Computing and Data Engineering","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3379247.3379266","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 1
Abstract
Spoken language understanding (SLU) involves two tasks, namely slot filling and intent prediction. While the goal of slot filling is to predict correlated slot sequences, intent prediction aims to capture the intent of the whole utterance. Recent work shows that jointly learning both tasks can provide additional benefits. In this paper, we follow this line of work and explore the pre-trained BERT model for SLU. We find that BERT suffers from problems in learning the logical dependency relations needed for slot filling. To address this issue, we propose a new joint learning model called SASGBC that combines sequence labeling with a deep bidirectional Transformer. Experiments on two datasets demonstrate that SASGBC achieves state-of-the-art performance compared to several competitive baseline models.
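To make the joint-learning setup concrete, below is a minimal sketch of a BERT-based joint SLU model in the spirit the abstract describes: a shared pre-trained encoder feeding a token-level slot-filling head and an utterance-level intent head, trained with the sum of both losses. This is not the authors' SASGBC implementation; the plain linear heads, label counts, and class names are assumptions for illustration only.

```python
# Illustrative sketch only -- not the SASGBC architecture from the paper.
# A shared BERT encoder with two heads: per-token slot tags and a single
# intent label per utterance, jointly trained by summing both losses.
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizerFast


class JointBertSLU(nn.Module):
    def __init__(self, num_slot_labels: int, num_intents: int,
                 model_name: str = "bert-base-uncased"):
        super().__init__()
        self.bert = BertModel.from_pretrained(model_name)
        hidden = self.bert.config.hidden_size
        self.slot_head = nn.Linear(hidden, num_slot_labels)   # token-level tags
        self.intent_head = nn.Linear(hidden, num_intents)     # whole utterance

    def forward(self, input_ids, attention_mask):
        out = self.bert(input_ids=input_ids, attention_mask=attention_mask)
        slot_logits = self.slot_head(out.last_hidden_state)   # (B, T, num_slots)
        intent_logits = self.intent_head(out.pooler_output)   # (B, num_intents)
        return slot_logits, intent_logits


if __name__ == "__main__":
    # Toy forward pass; label counts are made up for the example.
    tok = BertTokenizerFast.from_pretrained("bert-base-uncased")
    model = JointBertSLU(num_slot_labels=10, num_intents=5)
    batch = tok(["book a flight from boston to denver"],
                return_tensors="pt", padding=True)
    slot_logits, intent_logits = model(batch["input_ids"],
                                       batch["attention_mask"])
    print(slot_logits.shape, intent_logits.shape)
```

In this kind of joint setup the training objective is typically the sum of a token-level cross-entropy loss over slot tags and an utterance-level cross-entropy loss over intents; the paper's contribution concerns how sequence labeling is combined with the deep bidirectional Transformer to better capture slot dependencies, which this sketch does not reproduce.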