Guoming Zhang;Heqiang Fu;Zhijie Xiang;Xinyan Zhou;Pengfei Hu;Xiuzhen Cheng;Yanni Yang
{"title":"利用 cGAN 增强基于环境光反射的窃听功能","authors":"Guoming Zhang;Heqiang Fu;Zhijie Xiang;Xinyan Zhou;Pengfei Hu;Xiuzhen Cheng;Yanni Yang","doi":"10.1109/TMC.2024.3460392","DOIUrl":null,"url":null,"abstract":"Sound eavesdropping using light has been an area of considerable interest and concern, as it can be achieved over long distances. However, previous work has often lacked stealth (e.g., active emission of laser beams) or been limited in the range of realistic applications (e.g., using direct light from a device’s indicator LED or a hanging light bulb). In this paper, we present \n<monospace>EchoLight</monospace>\n, a non-intrusive, passive and long-range sound eavesdropping method that utilizes the extensive reflection of ambient light from vibrating objects to reconstruct sound. We analyze the relationship between reflection light signals and sound signals, particularly in situations where the frequency response of reflective objects and the efficiency of diffuse reflection are suboptimal. Based on this analysis, we have introduced an algorithm based on cGAN to address the issues of nonlinear distortion and spectral absence in the frequency domain of sound. We extensively evaluate \n<monospace>EchoLight</monospace>\n’s performance in a variety of real-world scenarios. It demonstrates the ability to accurately reconstruct audio from a variety of source distances, attack distances, sound levels, light sources, and reflective materials. Our results reveal that the reconstructed audio exhibits a high degree of similarity to the original audio over 40 meters of attack distance.","PeriodicalId":50389,"journal":{"name":"IEEE Transactions on Mobile Computing","volume":"24 1","pages":"72-85"},"PeriodicalIF":7.7000,"publicationDate":"2024-09-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Ambient Light Reflection-Based Eavesdropping Enhanced With cGAN\",\"authors\":\"Guoming Zhang;Heqiang Fu;Zhijie Xiang;Xinyan Zhou;Pengfei Hu;Xiuzhen Cheng;Yanni Yang\",\"doi\":\"10.1109/TMC.2024.3460392\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Sound eavesdropping using light has been an area of considerable interest and concern, as it can be achieved over long distances. However, previous work has often lacked stealth (e.g., active emission of laser beams) or been limited in the range of realistic applications (e.g., using direct light from a device’s indicator LED or a hanging light bulb). In this paper, we present \\n<monospace>EchoLight</monospace>\\n, a non-intrusive, passive and long-range sound eavesdropping method that utilizes the extensive reflection of ambient light from vibrating objects to reconstruct sound. We analyze the relationship between reflection light signals and sound signals, particularly in situations where the frequency response of reflective objects and the efficiency of diffuse reflection are suboptimal. Based on this analysis, we have introduced an algorithm based on cGAN to address the issues of nonlinear distortion and spectral absence in the frequency domain of sound. We extensively evaluate \\n<monospace>EchoLight</monospace>\\n’s performance in a variety of real-world scenarios. It demonstrates the ability to accurately reconstruct audio from a variety of source distances, attack distances, sound levels, light sources, and reflective materials. 
Our results reveal that the reconstructed audio exhibits a high degree of similarity to the original audio over 40 meters of attack distance.\",\"PeriodicalId\":50389,\"journal\":{\"name\":\"IEEE Transactions on Mobile Computing\",\"volume\":\"24 1\",\"pages\":\"72-85\"},\"PeriodicalIF\":7.7000,\"publicationDate\":\"2024-09-13\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE Transactions on Mobile Computing\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://ieeexplore.ieee.org/document/10679898/\",\"RegionNum\":2,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"COMPUTER SCIENCE, INFORMATION SYSTEMS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Transactions on Mobile Computing","FirstCategoryId":"94","ListUrlMain":"https://ieeexplore.ieee.org/document/10679898/","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, INFORMATION SYSTEMS","Score":null,"Total":0}
Ambient Light Reflection-Based Eavesdropping Enhanced With cGAN
Sound eavesdropping using light has long attracted interest and concern because it can be carried out over long distances. However, previous work has often lacked stealth (e.g., requiring the active emission of laser beams) or been limited to narrow, less realistic settings (e.g., relying on direct light from a device's indicator LED or a hanging light bulb). In this paper, we present EchoLight, a non-intrusive, passive, and long-range sound eavesdropping method that reconstructs sound from ambient light reflected off vibrating objects. We analyze the relationship between reflected light signals and sound signals, particularly when the frequency response of the reflective object and the efficiency of diffuse reflection are suboptimal. Based on this analysis, we introduce a cGAN-based algorithm that addresses nonlinear distortion and missing spectral components in the frequency domain of the recovered sound. We extensively evaluate EchoLight's performance in a variety of real-world scenarios. It accurately reconstructs audio across a range of source distances, attack distances, sound levels, light sources, and reflective materials. Our results show that the reconstructed audio exhibits a high degree of similarity to the original audio at attack distances of over 40 meters.
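The abstract describes a cGAN-based algorithm that repairs nonlinear distortion and missing spectral content in the recovered audio, but the listing gives no implementation details. The following is a minimal illustrative sketch of how such a conditional GAN could be set up for spectrogram enhancement in PyTorch; the architecture, tensor shapes, and the pix2pix-style L1 weight are assumptions made for illustration and are not the authors' actual EchoLight pipeline.

```python
# Illustrative cGAN sketch for spectrogram enhancement (PyTorch).
# NOT the authors' implementation: architecture, shapes, and loss weights
# are assumptions chosen only to demonstrate the general technique.
import torch
import torch.nn as nn

class Generator(nn.Module):
    """Maps a distorted log-magnitude spectrogram (1 x F x T) to an enhanced one."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 1, 3, padding=1),
        )

    def forward(self, x):
        return self.net(x)

class Discriminator(nn.Module):
    """Scores (condition, candidate) spectrogram pairs, concatenated on the channel axis."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(2, 32, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(64, 1, 4, stride=2, padding=1),  # patch-level real/fake logits
        )

    def forward(self, cond, candidate):
        return self.net(torch.cat([cond, candidate], dim=1))

def train_step(gen, disc, opt_g, opt_d, distorted, clean, l1_weight=100.0):
    """One adversarial + L1 reconstruction step (pix2pix-style; weight is an assumption)."""
    bce = nn.BCEWithLogitsLoss()
    # Update discriminator on real and generated pairs.
    fake = gen(distorted).detach()
    d_real = disc(distorted, clean)
    d_fake = disc(distorted, fake)
    d_loss = bce(d_real, torch.ones_like(d_real)) + bce(d_fake, torch.zeros_like(d_fake))
    opt_d.zero_grad()
    d_loss.backward()
    opt_d.step()
    # Update generator: fool the discriminator while staying close to the clean target.
    fake = gen(distorted)
    d_fake = disc(distorted, fake)
    g_loss = bce(d_fake, torch.ones_like(d_fake)) + l1_weight * nn.functional.l1_loss(fake, clean)
    opt_g.zero_grad()
    g_loss.backward()
    opt_g.step()
    return d_loss.item(), g_loss.item()

if __name__ == "__main__":
    gen, disc = Generator(), Discriminator()
    opt_g = torch.optim.Adam(gen.parameters(), lr=2e-4, betas=(0.5, 0.999))
    opt_d = torch.optim.Adam(disc.parameters(), lr=2e-4, betas=(0.5, 0.999))
    # Toy batch: 4 spectrograms, 128 frequency bins x 64 frames (shapes are arbitrary).
    distorted = torch.randn(4, 1, 128, 64)
    clean = torch.randn(4, 1, 128, 64)
    print(train_step(gen, disc, opt_g, opt_d, distorted, clean))
```

In this kind of setup the discriminator sees the distorted input alongside the candidate output, which is what makes the GAN conditional; the L1 term keeps the enhanced spectrogram close to the reference while the adversarial term encourages realistic fine structure in regions where spectral content is missing.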
Journal Introduction:
IEEE Transactions on Mobile Computing addresses key technical issues related to various aspects of mobile computing. This includes (a) architectures, (b) support services, (c) algorithm/protocol design and analysis, (d) mobile environments, (e) mobile communication systems, (f) applications, and (g) emerging technologies. Topics of interest span a wide range, covering aspects like mobile networks and hosts, mobility management, multimedia, operating system support, power management, online and mobile environments, security, scalability, reliability, and emerging technologies such as wearable computers, body area networks, and wireless sensor networks. The journal serves as a comprehensive platform for advancements in mobile computing research.