Lei Gao;Liyuan Cui;Shuwen Chen;Lizhen Deng;Xiaokang Wang;Xiaohong Yan;Hu Zhu
{"title":"红外光谱反褶积自相关多头注意转换器","authors":"Lei Gao;Liyuan Cui;Shuwen Chen;Lizhen Deng;Xiaokang Wang;Xiaohong Yan;Hu Zhu","doi":"10.26599/TST.2024.9010131","DOIUrl":null,"url":null,"abstract":"Infrared spectroscopy analysis has found widespread applications in various fields due to advancements in technology and industry convergence. To improve the quality and reliability of infrared spectroscopy signals, deconvolution is a crucial preprocessing step. Inspired by the transformer model, we propose an Auto-correlation Multi-head attention Transformer (AMTrans) for infrared spectrum sequence deconvolution. The auto-correlation attention model improves the scaled dot-product attention in the transformer. It utilizes attention mechanism for feature extraction and implements attention computation using the auto-correlation function. The auto-correlation attention model is used to exploit the inherent sequence nature of spectral data and to effectively recovery spectra by capturing auto-correlation patterns in the sequence. The proposed model is trained using supervised learning and demonstrates promising results in infrared spectroscopic restoration. 
By comparing the experiments with other deconvolution techniques, the experimental results show that the method has excellent deconvolution performance and can effectively recover the texture details of the infrared spectrum.","PeriodicalId":48690,"journal":{"name":"Tsinghua Science and Technology","volume":"30 3","pages":"1329-1341"},"PeriodicalIF":6.6000,"publicationDate":"2024-12-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=10817762","citationCount":"0","resultStr":"{\"title\":\"AMTrans: Auto-Correlation Multi-Head Attention Transformer for Infrared Spectral Deconvolution\",\"authors\":\"Lei Gao;Liyuan Cui;Shuwen Chen;Lizhen Deng;Xiaokang Wang;Xiaohong Yan;Hu Zhu\",\"doi\":\"10.26599/TST.2024.9010131\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Infrared spectroscopy analysis has found widespread applications in various fields due to advancements in technology and industry convergence. To improve the quality and reliability of infrared spectroscopy signals, deconvolution is a crucial preprocessing step. Inspired by the transformer model, we propose an Auto-correlation Multi-head attention Transformer (AMTrans) for infrared spectrum sequence deconvolution. The auto-correlation attention model improves the scaled dot-product attention in the transformer. It utilizes attention mechanism for feature extraction and implements attention computation using the auto-correlation function. The auto-correlation attention model is used to exploit the inherent sequence nature of spectral data and to effectively recovery spectra by capturing auto-correlation patterns in the sequence. The proposed model is trained using supervised learning and demonstrates promising results in infrared spectroscopic restoration. 
By comparing the experiments with other deconvolution techniques, the experimental results show that the method has excellent deconvolution performance and can effectively recover the texture details of the infrared spectrum.\",\"PeriodicalId\":48690,\"journal\":{\"name\":\"Tsinghua Science and Technology\",\"volume\":\"30 3\",\"pages\":\"1329-1341\"},\"PeriodicalIF\":6.6000,\"publicationDate\":\"2024-12-30\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=10817762\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Tsinghua Science and Technology\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://ieeexplore.ieee.org/document/10817762/\",\"RegionNum\":1,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"Multidisciplinary\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Tsinghua Science and Technology","FirstCategoryId":"94","ListUrlMain":"https://ieeexplore.ieee.org/document/10817762/","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"Multidisciplinary","Score":null,"Total":0}
AMTrans: Auto-Correlation Multi-Head Attention Transformer for Infrared Spectral Deconvolution
Infrared spectroscopy analysis has found widespread applications in various fields due to advancements in technology and industry convergence. To improve the quality and reliability of infrared spectroscopy signals, deconvolution is a crucial preprocessing step. Inspired by the transformer model, we propose an Auto-correlation Multi-head attention Transformer (AMTrans) for infrared spectrum sequence deconvolution. The auto-correlation attention model improves on the scaled dot-product attention in the transformer: it uses an attention mechanism for feature extraction and implements the attention computation with the auto-correlation function. The auto-correlation attention model exploits the inherent sequential nature of spectral data and effectively recovers spectra by capturing auto-correlation patterns in the sequence. The proposed model is trained with supervised learning and demonstrates promising results in infrared spectroscopic restoration. Comparative experiments with other deconvolution techniques show that the method achieves excellent deconvolution performance and effectively recovers the texture details of the infrared spectrum.
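The abstract states that attention scores are computed with the auto-correlation function rather than a scaled dot product. The paper's exact formulation is not reproduced here; as a rough, hypothetical sketch of the idea (in the spirit of FFT-based auto-correlation attention, where the strongest time-delay correlations weight shifted copies of the value sequence), one might write:

```python
import numpy as np

def autocorrelation_scores(q, k):
    # Cross/auto-correlation via FFT (Wiener-Khinchin theorem):
    # one O(L log L) pass instead of the O(L^2) dot-product matrix.
    L = q.shape[-1]
    fq = np.fft.rfft(q, n=2 * L)            # zero-pad to avoid circular wrap-around
    fk = np.fft.rfft(k, n=2 * L)
    corr = np.fft.irfft(fq * np.conj(fk), n=2 * L)[..., :L]
    return corr                              # corr[tau] = similarity at lag tau

def autocorr_attention(q, k, v, top_k=4):
    # Aggregate time-delayed copies of v, weighted by a softmax over
    # the top-k strongest auto-correlation lags (illustrative only;
    # the actual AMTrans aggregation may differ).
    corr = autocorrelation_scores(q, k)
    lags = np.argsort(corr)[-top_k:]         # the most correlated delays
    w = np.exp(corr[lags] - corr[lags].max())
    w = w / w.sum()                          # softmax weights over chosen lags
    return sum(wi * np.roll(v, -int(lag)) for wi, lag in zip(w, lags))

# Toy 1-D "spectrum": a periodic signal plus noise.
rng = np.random.default_rng(0)
x = np.sin(np.linspace(0, 8 * np.pi, 128)) + 0.1 * rng.standard_normal(128)
out = autocorr_attention(x, x, x)
print(out.shape)  # (128,)
```

Because the FFT computes all lags at once, this keeps the sequence-length cost quasi-linear, which is one common motivation for replacing dot-product attention with auto-correlation on long 1-D signals such as spectra.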
Journal Introduction:
Tsinghua Science and Technology (Tsinghua Sci Technol) started publication in 1996. It is an international academic journal sponsored by Tsinghua University and published bimonthly. The journal aims to present up-to-date scientific achievements in computer science, electronic engineering, and other IT fields. Contributions from all over the world are welcome.