Marzieh Ajirak, Immanuel Elbau, Nili Solomonov, Logan Grosenick
{"title":"Discrete Representation Learning for Multivariate Time Series.","authors":"Marzieh Ajirak, Immanuel Elbau, Nili Solomonov, Logan Grosenick","doi":"10.23919/eusipco63174.2024.10715138","DOIUrl":null,"url":null,"abstract":"<p><p>This paper focuses on discrete representation learning for multivariate time series with Gaussian processes. To overcome the challenges inherent in incorporating discrete latent variables into deep learning models, our approach uses a Gumbel-softmax reparameterization trick to address non-differentiability, enabling joint clustering and embedding through learnable discretization of the latent space. The proposed architecture thus enhances interpretability both by estimating a low-dimensional embedding for high dimensional time series and by simultaneously discovering discrete latent states. Empirical assessments on synthetic and real-world fMRI data validate the model's efficacy, showing improved classification results using our representation.</p>","PeriodicalId":87340,"journal":{"name":"Proceedings of the ... European Signal Processing Conference (EUSIPCO). EUSIPCO (Conference)","volume":"2024 ","pages":"1132-1136"},"PeriodicalIF":0.0000,"publicationDate":"2024-08-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC12162130/pdf/","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the ... European Signal Processing Conference (EUSIPCO). EUSIPCO (Conference)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.23919/eusipco63174.2024.10715138","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"2024/10/23 0:00:00","PubModel":"Epub","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0
Abstract
This paper focuses on discrete representation learning for multivariate time series with Gaussian processes. To overcome the challenges inherent in incorporating discrete latent variables into deep learning models, our approach uses the Gumbel-softmax reparameterization trick to address non-differentiability, enabling joint clustering and embedding through learnable discretization of the latent space. The proposed architecture thus enhances interpretability both by estimating a low-dimensional embedding for high-dimensional time series and by simultaneously discovering discrete latent states. Empirical assessments on synthetic and real-world fMRI data validate the model's efficacy, showing improved classification results using our representation.
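The Gumbel-softmax reparameterization mentioned above can be illustrated with a minimal sketch (a generic, standalone illustration of the trick, not the paper's implementation; the function name and temperature value are our own choices):

```python
import numpy as np

def gumbel_softmax(logits, tau=1.0, rng=None):
    """Draw one differentiable sample from a relaxed categorical distribution.

    As tau -> 0 the sample approaches a one-hot vector (hard assignment);
    larger tau yields smoother, more uniform samples.
    """
    rng = np.random.default_rng() if rng is None else rng
    # Gumbel(0, 1) noise via the inverse-CDF: -log(-log(U)), U ~ Uniform(0, 1)
    g = -np.log(-np.log(rng.uniform(size=np.shape(logits))))
    y = (np.asarray(logits) + g) / tau
    # Numerically stable softmax over the perturbed logits
    e = np.exp(y - y.max())
    return e / e.sum()

# Unnormalized scores over, say, 3 discrete latent states
logits = np.array([2.0, 0.5, -1.0])
sample = gumbel_softmax(logits, tau=0.5)
print(sample)  # a point on the probability simplex, usually peaked at index 0
```

Because the sample is a smooth function of the logits, gradients flow through it, which is what lets the discrete latent states be trained jointly with the rest of a deep model.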