MagicWrite: One-Dimensional Acoustic Tracking-Based Air Writing System

IF 7.7 · Region 2 (Computer Science) · Q1 (Computer Science, Information Systems)
Hao Pan; Yongjian Fu; Ye Qi; Yi-Chao Chen; Ju Ren
{"title":"MagicWrite:基于一维声学跟踪的空气书写系统","authors":"Hao Pan;Yongjian Fu;Ye Qi;Yi-Chao Chen;Ju Ren","doi":"10.1109/TMC.2025.3526185","DOIUrl":null,"url":null,"abstract":"Air writing technology enhances text input for IoT, VR, and AR devices, offering a spatially flexible alternative to physical keyboards. Addressing the demand for such innovation, this paper presents MagicWrite, a novel system utilizing acoustic-based 1D tracking, which is suitable for mobile devices with existing speaker and microphone infrastructure. Compared to 2D or 3D tracking of the finger, 1D tracking eliminates the need for multiple microphones and/or speakers and is more universally applicable. However, challenges emerge when using 1D tracking for recognizing handwritten letters due to trajectory loss and inter-user writing variability. To address this, we develop a general conversion technique that transforms image-based text datasets (<italic>e.g.</i>, MNIST) into 1D tracking trajectory data, generating artificial datasets of tracking traces (referred to as <italic>Track</i>MNISTs) to bolster system robustness and scalability. These tracking datasets facilitate the creation of personalized user databases that align with individual writing habits. Combined with a kNN classifier, our proposed MagicWrite ensures high accuracy and robustness in text input recognition while simultaneously reducing computational load and energy consumption. Extensive experiments validate that our proposed MagicWrite achieves exceptional classification accuracy for unseen users and inputs in five languages, marking it as a robust solution for air writing.","PeriodicalId":50389,"journal":{"name":"IEEE Transactions on Mobile Computing","volume":"24 5","pages":"4403-4418"},"PeriodicalIF":7.7000,"publicationDate":"2025-01-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"MagicWrite: One-Dimensional Acoustic Tracking-Based Air Writing System\",\"authors\":\"Hao Pan;Yongjian Fu;Ye Qi;Yi-Chao Chen;Ju Ren\",\"doi\":\"10.1109/TMC.2025.3526185\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Air writing technology enhances text input for IoT, VR, and AR devices, offering a spatially flexible alternative to physical keyboards. Addressing the demand for such innovation, this paper presents MagicWrite, a novel system utilizing acoustic-based 1D tracking, which is suitable for mobile devices with existing speaker and microphone infrastructure. Compared to 2D or 3D tracking of the finger, 1D tracking eliminates the need for multiple microphones and/or speakers and is more universally applicable. However, challenges emerge when using 1D tracking for recognizing handwritten letters due to trajectory loss and inter-user writing variability. To address this, we develop a general conversion technique that transforms image-based text datasets (<italic>e.g.</i>, MNIST) into 1D tracking trajectory data, generating artificial datasets of tracking traces (referred to as <italic>Track</i>MNISTs) to bolster system robustness and scalability. These tracking datasets facilitate the creation of personalized user databases that align with individual writing habits. Combined with a kNN classifier, our proposed MagicWrite ensures high accuracy and robustness in text input recognition while simultaneously reducing computational load and energy consumption. 
Extensive experiments validate that our proposed MagicWrite achieves exceptional classification accuracy for unseen users and inputs in five languages, marking it as a robust solution for air writing.\",\"PeriodicalId\":50389,\"journal\":{\"name\":\"IEEE Transactions on Mobile Computing\",\"volume\":\"24 5\",\"pages\":\"4403-4418\"},\"PeriodicalIF\":7.7000,\"publicationDate\":\"2025-01-07\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE Transactions on Mobile Computing\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://ieeexplore.ieee.org/document/10829791/\",\"RegionNum\":2,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"COMPUTER SCIENCE, INFORMATION SYSTEMS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Transactions on Mobile Computing","FirstCategoryId":"94","ListUrlMain":"https://ieeexplore.ieee.org/document/10829791/","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, INFORMATION SYSTEMS","Score":null,"Total":0}
引用次数: 0

Abstract

Air writing technology enhances text input for IoT, VR, and AR devices, offering a spatially flexible alternative to physical keyboards. Addressing the demand for such innovation, this paper presents MagicWrite, a novel system utilizing acoustic-based 1D tracking, which is suitable for mobile devices with existing speaker and microphone infrastructure. Compared to 2D or 3D tracking of the finger, 1D tracking eliminates the need for multiple microphones and/or speakers and is more universally applicable. However, challenges emerge when using 1D tracking for recognizing handwritten letters due to trajectory loss and inter-user writing variability. To address this, we develop a general conversion technique that transforms image-based text datasets (e.g., MNIST) into 1D tracking trajectory data, generating artificial datasets of tracking traces (referred to as TrackMNISTs) to bolster system robustness and scalability. These tracking datasets facilitate the creation of personalized user databases that align with individual writing habits. Combined with a kNN classifier, our proposed MagicWrite ensures high accuracy and robustness in text input recognition while simultaneously reducing computational load and energy consumption. Extensive experiments validate that our proposed MagicWrite achieves exceptional classification accuracy for unseen users and inputs in five languages, marking it as a robust solution for air writing.
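The abstract outlines a pipeline that converts 2-D glyph images (e.g., MNIST) into 1-D tracking traces and then classifies the traces with a kNN model. As a rough illustration of that idea only, the Python sketch below reduces a glyph image to a fixed-length 1-D trace and fits a scikit-learn kNN classifier on such traces. The column-scan pixel ordering, the normalisation, the trace length of 64, and the helper names image_to_1d_trace and build_knn are all assumptions made for illustration; they are not the authors' actual TrackMNIST conversion or classifier.

# Hypothetical sketch (not from the paper): turn an MNIST-style glyph image
# into a 1-D "tracking trace" and classify such traces with kNN.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier


def image_to_1d_trace(img, trace_len=64):
    """Project a glyph image onto one axis to mimic a 1-D distance trace."""
    mask = np.asarray(img) > 127                 # crude binarisation (assumes 0-255 intensities)
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return np.zeros(trace_len)
    # Naive left-to-right, top-to-bottom "stroke order" over the glyph pixels (assumption).
    order = np.argsort(xs * mask.shape[0] + ys)
    trace = ys[order].astype(float)              # keep only one coordinate: the 1-D signal
    span = trace.max() - trace.min()
    trace = (trace - trace.min()) / (span if span > 0 else 1.0)   # normalise to [0, 1]
    # Resample to a fixed length so every trace has the same dimensionality.
    return np.interp(np.linspace(0, 1, trace_len),
                     np.linspace(0, 1, trace.size), trace)


def build_knn(images, labels, k=5):
    """Fit a kNN classifier on 1-D traces derived from glyph images."""
    traces = np.stack([image_to_1d_trace(im) for im in images])
    return KNeighborsClassifier(n_neighbors=k).fit(traces, labels)


# Usage with any MNIST-like arrays train_x (N x 28 x 28), train_y, test_x:
#   model = build_knn(train_x, train_y)
#   preds = model.predict(np.stack([image_to_1d_trace(im) for im in test_x]))

A plain Euclidean kNN over resampled traces keeps the inference cost low, which matches the paper's stated goal of reducing computational load; a sequence-aware distance such as DTW could be substituted where writing-speed variation matters, but that choice is likewise an assumption here.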
Source Journal
IEEE Transactions on Mobile Computing (Engineering & Technology - Telecommunications)
CiteScore: 12.90
Self-citation rate: 2.50%
Annual publications: 403
Review time: 6.6 months
Journal description: IEEE Transactions on Mobile Computing addresses key technical issues related to various aspects of mobile computing. This includes (a) architectures, (b) support services, (c) algorithm/protocol design and analysis, (d) mobile environments, (e) mobile communication systems, (f) applications, and (g) emerging technologies. Topics of interest span a wide range, covering aspects like mobile networks and hosts, mobility management, multimedia, operating system support, power management, online and mobile environments, security, scalability, reliability, and emerging technologies such as wearable computers, body area networks, and wireless sensor networks. The journal serves as a comprehensive platform for advancements in mobile computing research.