Hawkes Process with Flexible Triggering Kernels

Yamac Isik, Paidamoyo Chapfuwa, Connor Davis, Ricardo Henao
Proceedings of Machine Learning Research, vol. 219, pp. 308-320. Published 2023-08-01.
Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC12443382/pdf/
Citation count: 0

Abstract

Recently proposed encoder-decoder structures for modeling Hawkes processes use transformer-inspired architectures, which encode the history of events via embeddings and self-attention mechanisms. These models deliver better prediction and goodness-of-fit than their RNN-based counterparts. However, they often incur high computational and memory complexity and fail to adequately capture the triggering function of the underlying process. So motivated, we introduce an efficient and general encoding of the historical event sequence by replacing the complex (multilayered) attention structures with triggering kernels of the observed data. Noting the similarity between the triggering kernels of a point process and attention scores, we use a triggering kernel to replace the weights used to build history representations. Our estimator for the triggering function is equipped with a sigmoid gating mechanism that captures local-in-time triggering effects, which are otherwise challenging to model with standard decaying-over-time kernels. Further, taking both event-type representations and temporal embeddings as inputs, the model learns the underlying type-time triggering kernel parameters for pairs of event types. We present experiments on synthetic and real datasets widely used by competing models, and further include a COVID-19 dataset to illustrate the use of longitudinal covariates. Our results show the proposed model outperforms existing approaches, is more efficient in terms of computational complexity, and yields interpretable results via direct application of the newly introduced kernel.
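To make the sigmoid-gating idea concrete, the following is a minimal sketch of a Hawkes conditional intensity whose triggering kernel is an exponential decay modulated by a sigmoid gate, so the excitation can peak some time after an event rather than decaying monotonically from it. The specific functional form and parameter names (`alpha`, `beta`, `gate_scale`, `gate_shift`) are illustrative assumptions, not the paper's exact parameterization.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def gated_kernel(tau, alpha=1.0, beta=1.0, gate_scale=5.0, gate_shift=0.5):
    # Exponential decay modulated by a sigmoid gate. The gate suppresses
    # the kernel near tau = 0 and lets the triggering effect peak around
    # tau = gate_shift (a "local-in-time" effect), which a plain
    # decaying-over-time kernel alpha * exp(-beta * tau) cannot express.
    return alpha * math.exp(-beta * tau) * sigmoid(gate_scale * (tau - gate_shift))

def intensity(t, history, mu=0.1):
    # Hawkes conditional intensity: lambda(t) = mu + sum over past events
    # t_i < t of kernel(t - t_i). The same kernel values can double as the
    # weights of a weighted sum over event embeddings, playing the role of
    # attention scores when building a history representation.
    return mu + sum(gated_kernel(t - ti) for ti in history if ti < t)

events = [0.5, 1.2, 2.0]
rate = intensity(2.5, events)
```

With these illustrative parameters, `gated_kernel(0.5)` exceeds `gated_kernel(0.0)`: the gate delays the peak of the excitation, which is the qualitative behavior the sigmoid gating mechanism is meant to capture.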
