{"title":"一个以任务为导向的对话机器人,使用长短期记忆和注意力学习泰语","authors":"Ramon Robloke, B. Kijsirikul","doi":"10.1145/3373477.3373704","DOIUrl":null,"url":null,"abstract":"A task-oriented dialogue bot helps users achieve a predefined goal within a closed domain. A neural-network based dialogue bot tracks the user intention in each action, which can reach promising performance compared to a hand-crafted baseline [1] and has a more flexible conversational flow. One such end-to-end architecture is the Hybrid Code Networks (HCNs) [2]. It uses the simulated conversation of human-bot in the domain of restaurant booking to train an LSTM to track dialogue states and predict the next bot response. This research proposes a similar architecture to HCNs with the addition of attention to LSTM [3]. The best results are obtained by our model on both original and Thai translated versions of bAbI task 5.","PeriodicalId":300431,"journal":{"name":"Proceedings of the 1st International Conference on Advanced Information Science and System","volume":"2 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2019-11-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"A task-oriented dialogue bot using long short-term memory with attention for Thai language\",\"authors\":\"Ramon Robloke, B. Kijsirikul\",\"doi\":\"10.1145/3373477.3373704\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"A task-oriented dialogue bot helps users achieve a predefined goal within a closed domain. A neural-network based dialogue bot tracks the user intention in each action, which can reach promising performance compared to a hand-crafted baseline [1] and has a more flexible conversational flow. One such end-to-end architecture is the Hybrid Code Networks (HCNs) [2]. It uses the simulated conversation of human-bot in the domain of restaurant booking to train an LSTM to track dialogue states and predict the next bot response. This research proposes a similar architecture to HCNs with the addition of attention to LSTM [3]. The best results are obtained by our model on both original and Thai translated versions of bAbI task 5.\",\"PeriodicalId\":300431,\"journal\":{\"name\":\"Proceedings of the 1st International Conference on Advanced Information Science and System\",\"volume\":\"2 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2019-11-15\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Proceedings of the 1st International Conference on Advanced Information Science and System\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/3373477.3373704\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 1st International Conference on Advanced Information Science and System","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3373477.3373704","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
A task-oriented dialogue bot using long short-term memory with attention for Thai language
A task-oriented dialogue bot helps users achieve a predefined goal within a closed domain. A neural-network-based dialogue bot tracks the user's intention at each turn; it can reach promising performance compared to a hand-crafted baseline [1] and supports a more flexible conversational flow. One such end-to-end architecture is Hybrid Code Networks (HCNs) [2], which uses simulated human-bot conversations in the restaurant-booking domain to train an LSTM to track dialogue states and predict the next bot response. This research proposes an architecture similar to HCNs, with the addition of an attention mechanism over the LSTM [3]. Our model obtains the best results on both the original and the Thai-translated versions of bAbI Task 5.
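The sketch below illustrates the general idea of an LSTM dialogue policy with attention, i.e., an LSTM consumes per-turn features and an attention layer weights the hidden states before predicting the next bot action template. It is a minimal, hypothetical reconstruction, not the authors' implementation: the feature dimension, hidden size, number of action templates, and the use of additive attention in PyTorch are all illustrative assumptions.

```python
# Minimal sketch (assumed, not the paper's exact model): LSTM state tracker
# with attention over past hidden states, classifying the next action template.
import torch
import torch.nn as nn


class AttentiveDialoguePolicy(nn.Module):
    def __init__(self, feature_dim=300, hidden_dim=128, num_actions=16):
        super().__init__()
        # One LSTM step per dialogue turn; the input is that turn's feature
        # vector (e.g., utterance embedding plus entity/context flags).
        self.lstm = nn.LSTM(feature_dim, hidden_dim, batch_first=True)
        # Scores each past hidden state for the attention weights.
        self.attn = nn.Linear(hidden_dim, 1)
        # Classifier over a fixed set of bot response templates.
        self.out = nn.Linear(hidden_dim * 2, num_actions)

    def forward(self, turn_features):
        # turn_features: (batch, turns, feature_dim)
        hidden, _ = self.lstm(turn_features)                    # (batch, turns, hidden)
        weights = torch.softmax(self.attn(hidden).squeeze(-1), dim=-1)
        context = (weights.unsqueeze(-1) * hidden).sum(dim=1)   # attention context
        last = hidden[:, -1, :]                                 # current dialogue state
        return self.out(torch.cat([context, last], dim=-1))     # action logits


# Usage: next-action logits for a batch of 2 dialogues with 5 turns each.
logits = AttentiveDialoguePolicy()(torch.randn(2, 5, 300))
print(logits.shape)  # torch.Size([2, 16])
```

In this reading, the attention context lets the policy look back at earlier turns (for example, the turn where the user stated a constraint) rather than relying only on the final LSTM state, which is the motivation for adding attention to the HCN-style tracker.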