AttDCT: Attention-Based Deep Learning Approach for Time Series Classification in the DCT Domain

Amine Haboub, Hamza Baali, Abdesselam Bouzerdoum
{"title":"AttDCT: Attention-Based Deep Learning Approach for Time Series Classification in the DCT Domain","authors":"Amine Haboub;Hamza Baali;Abdesselam Bouzerdoum","doi":"10.1109/TAI.2025.3534141","DOIUrl":null,"url":null,"abstract":"This article proposes a new deep learning framework for time series classification in the discrete cosine transform (DCT) domain with spectral enhancement and self-attention mechanisms. The time series signal is first partitioned into discrete segments. Each segment is rearranged into a matrix using a sliding window. The signal matrix is then transformed to spectral coefficients using a two-dimensional (2-D) DCT. This is followed by logarithmic contrast enhancement and spectral normalization to enhance the DCT coefficients. The resulting enhanced coefficient matrix serves as input to a deep neural network architecture comprising a self-attention layer, a multilayer convolutional neural network (CNN), and a fully connected multilayer perceptron (MLP) for classification. The AttDCT CNN model is evaluated and benchmarked on 13 different time series classification problems. The experimental results show that the proposed model outperforms state-of-the-art deep learning methods by an average of 2.1% in classification accuracy. It achieves higher classification accuracy on ten of the problems and similar results on the remaining three.","PeriodicalId":73305,"journal":{"name":"IEEE transactions on artificial intelligence","volume":"6 6","pages":"1626-1638"},"PeriodicalIF":0.0000,"publicationDate":"2025-01-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=10855682","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE transactions on artificial intelligence","FirstCategoryId":"1085","ListUrlMain":"https://ieeexplore.ieee.org/document/10855682/","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

This article proposes a new deep learning framework for time series classification in the discrete cosine transform (DCT) domain with spectral enhancement and self-attention mechanisms. The time series signal is first partitioned into discrete segments. Each segment is rearranged into a matrix using a sliding window. The signal matrix is then transformed to spectral coefficients using a two-dimensional (2-D) DCT. This is followed by logarithmic contrast enhancement and spectral normalization to enhance the DCT coefficients. The resulting enhanced coefficient matrix serves as input to a deep neural network architecture comprising a self-attention layer, a multilayer convolutional neural network (CNN), and a fully connected multilayer perceptron (MLP) for classification. The AttDCT CNN model is evaluated and benchmarked on 13 different time series classification problems. The experimental results show that the proposed model outperforms state-of-the-art deep learning methods by an average of 2.1% in classification accuracy. It achieves higher classification accuracy on ten of the problems and similar results on the remaining three.
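
The preprocessing pipeline described in the abstract (segmentation, sliding-window rearrangement, 2-D DCT, logarithmic enhancement, normalization) can be illustrated with a short sketch. The code below is a minimal rendering under stated assumptions: the window length, stride, and the exact logarithmic enhancement and normalization formulas are not specified in the abstract, so plausible choices (a sign-preserving log1p and z-score normalization) are used purely for illustration.

```python
# Minimal sketch of the DCT-domain preprocessing described in the abstract.
# Window length, stride, and the enhancement/normalization formulas are
# assumptions, since the abstract does not specify them.
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view
from scipy.fft import dctn


def segment_to_matrix(segment: np.ndarray, window: int = 32, stride: int = 4) -> np.ndarray:
    """Rearrange a 1-D segment into a matrix whose rows are overlapping windows."""
    return sliding_window_view(segment, window)[::stride]


def enhance_dct(segment: np.ndarray, window: int = 32, stride: int = 4) -> np.ndarray:
    """2-D DCT of the window matrix, followed by (assumed) logarithmic
    contrast enhancement and (assumed) z-score spectral normalization."""
    mat = segment_to_matrix(segment, window, stride)
    coeffs = dctn(mat, type=2, norm="ortho")                 # 2-D DCT-II
    enhanced = np.sign(coeffs) * np.log1p(np.abs(coeffs))    # assumed log enhancement
    return (enhanced - enhanced.mean()) / (enhanced.std() + 1e-8)  # assumed normalization


if __name__ == "__main__":
    x = np.random.randn(256)       # one time-series segment
    features = enhance_dct(x)
    print(features.shape)          # (n_windows, window)
```

A similarly rough PyTorch sketch of the classification network follows, in the spirit of the abstract's "self-attention layer, multilayer CNN, fully connected MLP" description. The embedding size, number of attention heads, layer widths, and pooling size are illustrative assumptions, not the authors' configuration.

```python
# Rough sketch of an AttDCT-style classifier: self-attention over the rows of
# the enhanced DCT coefficient matrix, then a small CNN and an MLP classifier.
# All layer sizes are illustrative assumptions.
import torch
import torch.nn as nn


class AttDCTSketch(nn.Module):
    def __init__(self, n_cols: int, n_classes: int, embed_dim: int = 64):
        super().__init__()
        self.proj = nn.Linear(n_cols, embed_dim)                  # row-wise embedding
        self.attn = nn.MultiheadAttention(embed_dim, num_heads=4,
                                          batch_first=True)       # self-attention layer
        self.cnn = nn.Sequential(                                  # multilayer CNN
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d((4, 4)),
        )
        self.mlp = nn.Sequential(                                  # classifier MLP
            nn.Flatten(), nn.Linear(32 * 4 * 4, 128), nn.ReLU(),
            nn.Linear(128, n_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, n_rows, n_cols) enhanced DCT coefficient matrices
        h = self.proj(x)
        h, _ = self.attn(h, h, h)        # attend across matrix rows
        h = self.cnn(h.unsqueeze(1))     # add a channel dimension for Conv2d
        return self.mlp(h)               # class logits


logits = AttDCTSketch(n_cols=32, n_classes=5)(torch.randn(8, 57, 32))
print(logits.shape)  # torch.Size([8, 5])
```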