RNN-Based Room Scale Hand Motion Tracking

Wenguang Mao, Mei Wang, Wei Sun, L. Qiu, S. Pradhan, Yi-Chao Chen
{"title":"基于rnn的房间尺度手部运动跟踪","authors":"Wenguang Mao, Mei Wang, Wei Sun, L. Qiu, S. Pradhan, Yi-Chao Chen","doi":"10.1145/3300061.3345439","DOIUrl":null,"url":null,"abstract":"Smart speakers allow users to interact with home appliances using voice commands and are becoming increasingly popular. While voice-based interface is intuitive, it is insufficient in many scenarios, such as in noisy or quiet environments, for users with language barriers, or in applications that require continuous motion tracking. Motion-based control is attractive and complementary to existing voice-based control. However, accurate and reliable room-scale motion tracking poses a significant challenge due to low SNR, interference, and varying mobility. To this end, we develop a novel recurrent neural network (RNN) based system that uses speakers and microphones to realize accurate room-scale tracking. Our system jointly estimates the propagation distance and angle-of-arrival (AoA) of signals reflected by the hand, based on AoA-distance profiles generated by 2D MUSIC. We design a series of techniques to significantly enhance the profile quality under low SNR. We feed the profiles in a recent history to our RNN to estimate the distance and AoA. In this way, we can exploit the temporal structure among consecutive profiles to remove the impact of noise, interference and mobility. Using extensive evaluation, we show our system achieves 1.2--3.7~cm error within 4.5~m range, supports tracking multiple users, and is robust against ambient sound. To our knowledge, this is the first acoustic device-free room-scale tracking system.","PeriodicalId":223523,"journal":{"name":"The 25th Annual International Conference on Mobile Computing and Networking","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2019-10-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"59","resultStr":"{\"title\":\"RNN-Based Room Scale Hand Motion Tracking\",\"authors\":\"Wenguang Mao, Mei Wang, Wei Sun, L. Qiu, S. Pradhan, Yi-Chao Chen\",\"doi\":\"10.1145/3300061.3345439\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Smart speakers allow users to interact with home appliances using voice commands and are becoming increasingly popular. While voice-based interface is intuitive, it is insufficient in many scenarios, such as in noisy or quiet environments, for users with language barriers, or in applications that require continuous motion tracking. Motion-based control is attractive and complementary to existing voice-based control. However, accurate and reliable room-scale motion tracking poses a significant challenge due to low SNR, interference, and varying mobility. To this end, we develop a novel recurrent neural network (RNN) based system that uses speakers and microphones to realize accurate room-scale tracking. Our system jointly estimates the propagation distance and angle-of-arrival (AoA) of signals reflected by the hand, based on AoA-distance profiles generated by 2D MUSIC. We design a series of techniques to significantly enhance the profile quality under low SNR. We feed the profiles in a recent history to our RNN to estimate the distance and AoA. In this way, we can exploit the temporal structure among consecutive profiles to remove the impact of noise, interference and mobility. Using extensive evaluation, we show our system achieves 1.2--3.7~cm error within 4.5~m range, supports tracking multiple users, and is robust against ambient sound. 
To our knowledge, this is the first acoustic device-free room-scale tracking system.\",\"PeriodicalId\":223523,\"journal\":{\"name\":\"The 25th Annual International Conference on Mobile Computing and Networking\",\"volume\":\"1 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2019-10-11\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"59\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"The 25th Annual International Conference on Mobile Computing and Networking\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/3300061.3345439\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"The 25th Annual International Conference on Mobile Computing and Networking","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3300061.3345439","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 59

Abstract

Smart speakers allow users to interact with home appliances using voice commands and are becoming increasingly popular. While voice-based interface is intuitive, it is insufficient in many scenarios, such as in noisy or quiet environments, for users with language barriers, or in applications that require continuous motion tracking. Motion-based control is attractive and complementary to existing voice-based control. However, accurate and reliable room-scale motion tracking poses a significant challenge due to low SNR, interference, and varying mobility. To this end, we develop a novel recurrent neural network (RNN) based system that uses speakers and microphones to realize accurate room-scale tracking. Our system jointly estimates the propagation distance and angle-of-arrival (AoA) of signals reflected by the hand, based on AoA-distance profiles generated by 2D MUSIC. We design a series of techniques to significantly enhance the profile quality under low SNR. We feed the profiles in a recent history to our RNN to estimate the distance and AoA. In this way, we can exploit the temporal structure among consecutive profiles to remove the impact of noise, interference and mobility. Using extensive evaluation, we show our system achieves 1.2--3.7~cm error within 4.5~m range, supports tracking multiple users, and is robust against ambient sound. To our knowledge, this is the first acoustic device-free room-scale tracking system.
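The abstract describes feeding a short history of 2D MUSIC AoA-distance profiles into an RNN that jointly regresses the hand's propagation distance and AoA. Below is a minimal sketch of that idea, not the authors' implementation: the profile dimensions, layer sizes, and the ProfileRNN name are all illustrative assumptions.

```python
# Sketch: an RNN that consumes a history of 2D AoA-distance profiles and
# jointly estimates (distance, AoA) for the most recent frame.
# Shapes and hyperparameters are assumptions, not values from the paper.
import torch
import torch.nn as nn

class ProfileRNN(nn.Module):
    def __init__(self, n_aoa_bins=64, n_dist_bins=64, hidden=128):
        super().__init__()
        # Each 2D MUSIC profile (AoA x distance) is flattened into one
        # feature vector per time step before entering the recurrent layer.
        self.encoder = nn.Sequential(
            nn.Linear(n_aoa_bins * n_dist_bins, 256),
            nn.ReLU(),
        )
        self.rnn = nn.GRU(input_size=256, hidden_size=hidden, batch_first=True)
        # Two regression targets: propagation distance and AoA.
        self.head = nn.Linear(hidden, 2)

    def forward(self, profiles):
        # profiles: (batch, time, n_aoa_bins, n_dist_bins)
        b, t = profiles.shape[:2]
        x = self.encoder(profiles.reshape(b, t, -1))
        out, _ = self.rnn(x)
        # Use the last time step: the estimate for the most recent profile,
        # informed by the temporal structure of the preceding frames.
        return self.head(out[:, -1])

# Example: a history of 10 profiles, each 64 AoA bins x 64 distance bins.
model = ProfileRNN()
history = torch.randn(1, 10, 64, 64)
dist_aoa = model(history)   # shape (1, 2): [distance, AoA]
print(dist_aoa.shape)
```

Consuming the profile history as a sequence, rather than estimating each frame independently, is what lets the network smooth out noise, interference, and mobility effects across consecutive profiles, as the abstract argues.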