Attention Based LSTM with Multi Tasks Learning for Predictive Process Monitoring

T. Hnin, Khine Khine Oo
{"title":"Attention Based LSTM with Multi Tasks Learning for Predictive Process Monitoring","authors":"T. Hnin, Khine Khine Oo","doi":"10.18178/wcse.2019.03.028","DOIUrl":null,"url":null,"abstract":"Today, in the artificial intelligence research field, Deep Learning (DL) is one of the fastestgrowing techniques because of the power of learn ing features, that gives a higher level of abstraction of the raw attributes; and related research in Recurrent Neural Networks (RNN) and Long Short -Term Memory Networks (LSTM) have shown exemplary results in neural machine translation, neural image caption generation, NLP and so on. For our research, in order to detect potential problems and to facilitate proactive management, we focus on predictive process monitoring (PPM) as domain area, by predict ing business behaviour from h istorical event logs. Recent research works, LSTM networks have gained attention in PPM and have been proved that they can highly improve prediction accuracy in PPM. According to the literature, we have learned that PPM resembles to an early sequence classificat ion problem in NLP. And, recent trends in DL based NLP, attention mechanism is mostly embedded in neural networks. Inspired by these results, this paper proposes to firstly use Attention Based LSTM with Multi Tasks Learning for PPM.","PeriodicalId":342228,"journal":{"name":"Proceedings of 2019 the 9th International Workshop on Computer Science and Engineering","volume":"38 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of 2019 the 9th International Workshop on Computer Science and Engineering","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.18178/wcse.2019.03.028","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

Today, Deep Learning (DL) is one of the fastest-growing techniques in artificial intelligence research because of its power to learn features that provide a higher level of abstraction over the raw attributes; related research on Recurrent Neural Networks (RNN) and Long Short-Term Memory (LSTM) networks has shown exemplary results in neural machine translation, neural image caption generation, NLP, and so on. Our research focuses on predictive process monitoring (PPM) as the domain area, predicting business behaviour from historical event logs in order to detect potential problems and facilitate proactive management. In recent research, LSTM networks have gained attention in PPM and have been shown to substantially improve prediction accuracy. From the literature, we have learned that PPM resembles an early sequence classification problem in NLP, and in recent DL-based NLP, attention mechanisms are commonly embedded in neural networks. Inspired by these results, this paper proposes the first use of an attention-based LSTM with multi-task learning for PPM.
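The abstract does not spell out the architecture, so the following is only a minimal illustrative sketch of what an attention-based LSTM with multi-task learning for PPM can look like: an LSTM encodes a running case's activity prefix, an additive attention layer pools the hidden states, and two task heads share that representation. All concrete choices here (PyTorch, the layer sizes, and the two tasks of next-activity classification and remaining-time regression) are assumptions for illustration, not the paper's confirmed configuration.

```python
# Hypothetical sketch of an attention-based LSTM with two task heads for PPM.
# Hidden sizes and the choice of tasks (next activity, remaining time) are
# illustrative assumptions, not taken from the paper.
import torch
import torch.nn as nn

class AttnLSTMMultiTask(nn.Module):
    def __init__(self, num_activities: int, embed_dim: int = 32, hidden_dim: int = 64):
        super().__init__()
        self.embed = nn.Embedding(num_activities, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.attn = nn.Linear(hidden_dim, 1)                    # attention score per time step
        self.next_act = nn.Linear(hidden_dim, num_activities)   # task 1: next-activity logits
        self.remaining = nn.Linear(hidden_dim, 1)               # task 2: remaining-time estimate

    def forward(self, prefixes: torch.Tensor):
        # prefixes: (batch, seq_len) of activity ids from each running case's event log prefix
        h, _ = self.lstm(self.embed(prefixes))                  # (batch, seq_len, hidden)
        weights = torch.softmax(self.attn(h), dim=1)            # attention weights over time steps
        context = (weights * h).sum(dim=1)                      # weighted sum -> (batch, hidden)
        return self.next_act(context), self.remaining(context).squeeze(-1)

# Multi-task training combines both losses on dummy data:
model = AttnLSTMMultiTask(num_activities=10)
x = torch.randint(0, 10, (4, 5))                                # 4 prefixes of length 5
act_logits, rem_time = model(x)
loss = nn.CrossEntropyLoss()(act_logits, torch.randint(0, 10, (4,))) \
     + nn.MSELoss()(rem_time, torch.rand(4))
loss.backward()
```

Sharing one attention-pooled encoding across both heads is the standard hard-parameter-sharing form of multi-task learning; the joint loss is simply the sum of the per-task losses here, though a weighted sum is also common.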