Bo Wu, Zhenjie Yao, Yanhui Tu, Yixin Chen
2022 IEEE 34th International Conference on Tools with Artificial Intelligence (ICTAI), October 2022
DOI: 10.1109/ICTAI56018.2022.00016
A Dilated Transformer Network for Time Series Anomaly Detection
Unsupervised anomaly detection for time series has been an active research area due to its enormous potential for wireless network management. Existing works have made substantial progress in time series representation, reconstruction, and forecasting. However, long-term temporal patterns prevent such models from learning reliable dependencies. To this end, we propose a novel approach for time series anomaly detection based on a Transformer with dilated convolution. Specifically, we provide a dilated convolution module to extract long-term dependency features. Extensive experiments on various public benchmarks demonstrate that our method achieves state-of-the-art performance.
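The abstract's key idea is that dilated convolutions space their kernel taps `dilation` steps apart, so a small kernel covers a long temporal window. A minimal sketch of a causal 1D dilated convolution (hypothetical illustration, not the paper's exact module) might look like:

```python
import numpy as np

def dilated_conv1d(x, kernel, dilation=1):
    """Causal 1D dilated convolution (illustrative sketch, not the paper's module).

    Kernel taps are spaced `dilation` steps apart, so a kernel of size k
    covers a receptive field of (k - 1) * dilation + 1 time steps. Stacking
    layers with dilations 1, 2, 4, ... grows the receptive field
    exponentially, which is how such modules capture long-term dependencies.
    """
    k = len(kernel)
    pad = (k - 1) * dilation
    xp = np.concatenate([np.zeros(pad), np.asarray(x, dtype=float)])  # left-pad: causal
    out = np.zeros(len(x))
    for t in range(len(x)):
        for i in range(k):
            # tap i looks back i * dilation steps from time t
            out[t] += kernel[i] * xp[t + pad - i * dilation]
    return out

# With kernel [1, 1] and dilation 2, each output sums x[t] and x[t-2]
y = dilated_conv1d([1.0, 2.0, 3.0, 4.0, 5.0], [1.0, 1.0], dilation=2)
```

In a full model along the lines the abstract describes, features from such a module would feed a Transformer encoder, whose reconstruction error on each time step then serves as the anomaly score.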