Attention Mechanism in Predictive Business Process Monitoring

Abdulrahman Jalayer, M. Kahani, A. Beheshti, Asef Pourmasoumi, H. M. Nezhad
{"title":"预测性业务流程监控中的注意机制","authors":"Abdulrahman Jalayer, M. Kahani, A. Beheshti, Asef Pourmasoumi, H. M. Nezhad","doi":"10.1109/EDOC49727.2020.00030","DOIUrl":null,"url":null,"abstract":"Business process monitoring techniques have been investigated in depth over the last decade to enable organizations to deliver process insight. Recently, a new stream of work in predictive business process monitoring leveraged deep learning techniques to unlock the potential business value locked in process execution event logs. These works use Recurrent Neural Networks, such as Long Short-Term Memory (LSTM) and Gated Recurrent Units (GRU), and suffer from misinformation and accuracy as they use the last hidden state (as the context vector) for the purpose of predicting the next event. On the other hand, in operational processes, traces may be very long, which makes the above methods inappropriate for analyzing them. In addition, in predicting the next events in a running case, some of the previous events should be given a higher priority. To address these shortcomings, in this paper, we present a novel approach inspired by the notion of attention mechanism, utilized in Natural Language Processing and, particularly, in Neural Machine Translation. Our proposed approach uses all hidden states to accurately predict future behavior and the outcome of individual activities. Experimental evaluation of real-world event logs revealed that the use of attention mechanisms in the proposed approach leads to a more accurate prediction.","PeriodicalId":409420,"journal":{"name":"2020 IEEE 24th International Enterprise Distributed Object Computing Conference (EDOC)","volume":"43 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2020-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"11","resultStr":"{\"title\":\"Attention Mechanism in Predictive Business Process Monitoring\",\"authors\":\"Abdulrahman Jalayer, M. Kahani, A. Beheshti, Asef Pourmasoumi, H. M. Nezhad\",\"doi\":\"10.1109/EDOC49727.2020.00030\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Business process monitoring techniques have been investigated in depth over the last decade to enable organizations to deliver process insight. Recently, a new stream of work in predictive business process monitoring leveraged deep learning techniques to unlock the potential business value locked in process execution event logs. These works use Recurrent Neural Networks, such as Long Short-Term Memory (LSTM) and Gated Recurrent Units (GRU), and suffer from misinformation and accuracy as they use the last hidden state (as the context vector) for the purpose of predicting the next event. On the other hand, in operational processes, traces may be very long, which makes the above methods inappropriate for analyzing them. In addition, in predicting the next events in a running case, some of the previous events should be given a higher priority. To address these shortcomings, in this paper, we present a novel approach inspired by the notion of attention mechanism, utilized in Natural Language Processing and, particularly, in Neural Machine Translation. Our proposed approach uses all hidden states to accurately predict future behavior and the outcome of individual activities. 
Experimental evaluation of real-world event logs revealed that the use of attention mechanisms in the proposed approach leads to a more accurate prediction.\",\"PeriodicalId\":409420,\"journal\":{\"name\":\"2020 IEEE 24th International Enterprise Distributed Object Computing Conference (EDOC)\",\"volume\":\"43 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2020-10-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"11\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2020 IEEE 24th International Enterprise Distributed Object Computing Conference (EDOC)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/EDOC49727.2020.00030\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2020 IEEE 24th International Enterprise Distributed Object Computing Conference (EDOC)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/EDOC49727.2020.00030","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 11

Abstract

Business process monitoring techniques have been investigated in depth over the last decade to enable organizations to deliver process insight. Recently, a new stream of work in predictive business process monitoring has leveraged deep learning techniques to unlock the potential business value locked in process execution event logs. These works use Recurrent Neural Networks, such as Long Short-Term Memory (LSTM) and Gated Recurrent Units (GRU), and suffer from information loss and reduced accuracy because they use only the last hidden state (as the context vector) to predict the next event. Moreover, traces in operational processes may be very long, which makes these methods ill-suited for analyzing them. In addition, when predicting the next events in a running case, some of the previous events should be given a higher priority. To address these shortcomings, we present a novel approach inspired by the attention mechanism used in Natural Language Processing and, in particular, in Neural Machine Translation. The proposed approach uses all hidden states to accurately predict future behavior and the outcome of individual activities. Experimental evaluation on real-world event logs shows that the use of attention mechanisms in the proposed approach leads to more accurate predictions.
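To make the core idea concrete, the sketch below shows how an attention layer can combine all LSTM hidden states of a trace prefix into a context vector for next-activity prediction, instead of relying only on the final hidden state. This is a minimal illustration in PyTorch under stated assumptions, not the authors' exact architecture; the class name AttentionNextEventPredictor, the additive scoring layer, and all dimensions are illustrative.

# Minimal sketch (not the paper's exact model): an LSTM encoder whose hidden
# states are combined by a learned attention weighting into a context vector,
# which is then used to predict the next activity of a running case.
# All names and layer sizes are illustrative assumptions.
import torch
import torch.nn as nn

class AttentionNextEventPredictor(nn.Module):
    def __init__(self, num_activities: int, emb_dim: int = 32, hidden_dim: int = 64):
        super().__init__()
        self.embed = nn.Embedding(num_activities, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hidden_dim, batch_first=True)
        # Scores each hidden state; softmax over time steps gives attention weights.
        self.attn_score = nn.Linear(hidden_dim, 1)
        self.out = nn.Linear(hidden_dim, num_activities)

    def forward(self, prefix: torch.Tensor) -> torch.Tensor:
        # prefix: (batch, seq_len) activity indices of the events observed so far.
        states, _ = self.lstm(self.embed(prefix))             # (batch, seq_len, hidden)
        weights = torch.softmax(self.attn_score(states), 1)   # (batch, seq_len, 1)
        context = (weights * states).sum(dim=1)               # weighted sum of ALL states
        return self.out(context)                              # logits over next activities

# Usage: predict the next activity for two prefixes of length 5.
model = AttentionNextEventPredictor(num_activities=10)
logits = model(torch.randint(0, 10, (2, 5)))
print(logits.argmax(dim=-1))

In contrast to using only the final hidden state, the softmax weights let the model assign higher priority to earlier events in a running case whenever they are more informative for the prediction, which also mitigates the difficulty of encoding very long traces into a single state.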