Authors: Jinchao Liu, Margarita Osadchy, Yan Wang, Yingying Wu, Enyi Li, Xiaolin Hu, Yongchun Fang
DOI: 10.1021/acs.analchem.4c02380
Journal: Analytical Chemistry (JCR Q1, Chemistry, Analytical; Impact Factor 6.7)
Published: 2024-10-11 (Journal Article)
Vibrational Spectroscopy Can Be Vulnerable to Adversarial Attacks.
Nondestructive detection methods based on vibrational spectroscopy are widely used in critical applications across fields such as the chemical industry, pharmaceuticals, national defense, and security. Because these methods rely on machine learning models for data analysis, it is important to study the threats that adversarial examples pose to vibrational spectroscopy and the defenses against them. In this paper, we propose a novel adversarial attack on vibrational spectroscopy, named SynPat, in which synthetic peaks produced by a physical model are placed at key locations to form adversarial perturbations. Our attack generates perturbations that successfully deceive machine learning models for Raman and infrared spectrum analysis while blending into the spectra far better than perturbations produced by existing state-of-the-art attacks designed for images and audio, and hence remaining unnoticeable to human operators. We verified the superiority of the proposed SynPat through an imperceptibility test conducted by human experts and through defense experiments with an AI detector. To the best of our knowledge, this is the first thorough study of the robustness of vibrational spectroscopic techniques against adversarial samples, together with defense mechanisms. Our extensive experiments show that machine learning models for vibrational spectroscopy, including conventional and deep models for Raman and infrared classification and regression, are all vulnerable to adversarial perturbations, which may pose serious security threats to society.
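The core idea sketched in the abstract — forming a perturbation by adding physically plausible synthetic peaks at chosen spectral locations — can be illustrated in a few lines. This is a minimal sketch, not the authors' implementation: the Lorentzian line shape (a common physical model for vibrational bands), the peak parameters, the toy Raman-like spectrum, and the function names are all assumptions for illustration.

```python
import numpy as np

def lorentzian(x, center, width, amplitude):
    """Lorentzian line shape, a common physical model for vibrational bands."""
    half = width / 2.0
    return amplitude * half**2 / ((x - center) ** 2 + half**2)

def synpat_perturb(wavenumbers, spectrum, peak_positions, amplitude=0.05, width=10.0):
    """Hypothetical SynPat-style perturbation: add small synthetic peaks
    at chosen 'key locations' so the change resembles a genuine band."""
    perturbed = spectrum.copy()
    for center in peak_positions:
        perturbed += lorentzian(wavenumbers, center, width, amplitude)
    return perturbed

# Toy Raman-like spectrum: two genuine bands plus measurement noise (assumed).
rng = np.random.default_rng(0)
x = np.linspace(400, 1800, 1400)  # wavenumber axis, cm^-1
clean = lorentzian(x, 1000, 30, 1.0) + lorentzian(x, 1450, 25, 0.6)
spectrum = clean + 0.01 * rng.standard_normal(x.size)

# Place one low-amplitude synthetic peak at an attack-chosen location.
adv = synpat_perturb(x, spectrum, peak_positions=[1200], amplitude=0.05)

# The perturbation is bounded by the peak amplitude, so it stays subtle
# relative to the genuine bands (which reach ~1.0 here).
max_change = float(np.max(np.abs(adv - spectrum)))
print(max_change)
```

In the actual attack, the peak locations and amplitudes would be optimized against the target model rather than fixed by hand; the point of the sketch is only that the perturbation is built from a physical line-shape model, which is why it blends into real spectra better than unconstrained pixel-style noise.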
About the journal:
Analytical Chemistry, a peer-reviewed research journal, focuses on disseminating new and original knowledge across all branches of analytical chemistry. Fundamental articles may explore general principles of chemical measurement science and need not directly address existing or potential analytical methodology. They can be entirely theoretical or report experimental results. Contributions may cover various phases of analytical operations, including sampling, bioanalysis, electrochemistry, mass spectrometry, microscale and nanoscale systems, environmental analysis, separations, spectroscopy, chemical reactions and selectivity, instrumentation, imaging, surface analysis, and data processing. Papers discussing known analytical methods should present a significant, original application of the method, a notable improvement, or results on an important analyte.