Proceedings of the 2021 ACM International Symposium on Wearable Computers: Latest Publications

Where Should I Look? Comparing Reference Frames for Spatial Tactile Cues
Pub Date: 2021-09-21 | DOI: 10.1145/3460421.3478822
Authors: Erik Pescara, Anton Stubenbord, Tobias Röddiger, Likun Fang, M. Beigl
Abstract: When designing tactile displays on the wrist for spatial cues, it is crucial to keep the natural movement of the body in mind. Depending on the movement of the wrist, different reference frames can influence the output of the wristband. In this paper, we compared two possible reference frames: one where spatial cues are fixed in a wrist-centered frame of reference, and an allocentric frame of reference that fixes spatial cues in the global coordinate system. We compared both conditions in terms of reaction time, achievable accuracy, and cognitive load. Our study with 20 participants shows that utilizing the allocentric reference frame reduces cognitive load (avg. 38% reduction) and reaction time (avg. 240 ms reduction), with no statistically significant difference in accuracy.
Citations: 0
VIDENS: Vision-based User Identification from Inertial Sensors
Pub Date: 2021-09-21 | DOI: 10.1145/3460421.3480426
Authors: Alejandro Sánchez Guinea, Simon Heinrich, Max Mühlhäuser
Abstract: In this paper we propose VIDENS (vision-based user identification from inertial sensors), an approach that transforms inertial-sensor time-series data into images representing temporal patterns in pixel form, allowing even a simple CNN to outperform complex ad-hoc deep learning models that combine RNNs and CNNs for user identification. Our evaluation shows promising results when comparing our approach to relevant existing methods.
Citations: 2
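The VIDENS abstract describes encoding inertial time-series as images so that a plain CNN can consume them, but it does not specify the encoding. The sketch below is an illustrative stand-in, not the paper's method: it min-max normalizes a multi-channel IMU window and quantizes it into an 8-bit grayscale pixel grid.

```python
import numpy as np

def window_to_image(window):
    """Encode a (timesteps, channels) IMU window as an 8-bit grayscale image.

    Illustrative assumption only: the actual VIDENS encoding is not given
    in the abstract. Here values are min-max scaled to [0, 255], with
    channels as image rows and time along the columns.
    """
    w = np.asarray(window, dtype=np.float64)
    lo, hi = w.min(), w.max()
    # Guard against a constant window (division by zero).
    scaled = np.zeros_like(w) if hi == lo else (w - lo) / (hi - lo)
    img = np.round(scaled * 255).astype(np.uint8)
    return img.T  # shape: (channels, timesteps)
```

A CNN can then be trained on these images exactly as it would be on ordinary grayscale inputs.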
Typing on Tap: Estimating a Finger-Worn One-Handed Chording Keyboard's Text Entry Rate
Pub Date: 2021-09-21 | DOI: 10.1145/3460421.3480428
Authors: Jason Tu, Angeline Vidhula Jeyachandra, Deepthi Nagesh, Naresh Prabhu, Thad Starner
Abstract: The Tap Strap™ enables eyes-free mobile typing using a one-handed device worn on the user's fingers. Using the standard MacKenzie-Soukoreff text entry phrase set, 12 participants completed 480 minutes of practice with the keyboard on either their dominant or non-dominant hand. The average final typing rate was 22.11 words per minute (WPM), and letter accuracy increased from 85.11% to 91.02%. Non-dominant-hand users typed more accurately than dominant-hand users after 420 minutes of practice.
Citations: 3
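The abstract reports WPM without stating its formula. Text-entry studies using the MacKenzie-Soukoreff phrase set conventionally define a "word" as five characters and discount the first character, since no entry time precedes it; the sketch below assumes that standard convention rather than anything specific to this paper.

```python
def words_per_minute(transcribed: str, seconds: float) -> float:
    """Standard text-entry rate: ((|T| - 1) / 5) words over the trial time.

    Assumes the conventional 5-characters-per-word definition; the paper's
    exact computation is not given in the abstract.
    """
    return ((len(transcribed) - 1) / 5.0) * (60.0 / seconds)
```

For example, transcribing an 11-character phrase in 6 seconds yields (10 / 5) * 10 = 20 WPM.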
MOTUS: Rendering Emotions with a Wrist-worn Tactile Display
Pub Date: 2021-09-21 | DOI: 10.1145/3460421.3480390
Authors: Verindi Vekemans, Ward Leenders, Sijie Zhu, Rong-Hao Liang
Abstract: This paper investigates whether tactile texture patterns on the wrist can be interpreted as particular emotions. A prototype watch-back tactile display, MOTUS, was implemented to press different texture patterns at various frequencies onto the wrist to convey emotions. We conducted a preliminary guessability study with the prototype. The results reveal the wearers' agreement in interpreting emotional states from the tactile texture patterns.
Citations: 2
VäriWig: Interactive Coloring Wig Module
Pub Date: 2021-09-21 | DOI: 10.1145/3460421.3478832
Authors: D. Brun, Jonna Häkkilä
Abstract: We present the design and prototype of VäriWig, an interactive coloring wig module that allows traditional synthetic wigs to change colors in response to several factors, such as head movements, music rhythms, and other external synchronizations. A large number of customized optical fibers, coupled with individually controlled digital tricolor light-emitting diodes, instantly change the colors of specific locks of hair placed around the head. VäriWig is designed to blend seamlessly with the traditional wig, even while inactive, and is envisioned for use in art shows and other social entertainment events.
Citations: 2
HeatSight: Wearable Low-power Omni Thermal Sensing
Pub Date: 2021-09-21 | DOI: 10.1145/3460421.3478811
Authors: Rawan Alharbi, Chunlin Feng, Sougata Sen, J. Jain, Josiah D. Hester, N. Alshurafa
Abstract: The thermal information surrounding a person is a rich source for understanding and identifying personal activities. Different daily activities naturally emit distinct thermal signatures from both the human body and surrounding objects; these signatures exhibit both spatial and temporal components as objects move and thermal energy dissipates, for example, when drinking a cold beverage or smoking a cigarette. We present HeatSight, a wearable system that captures the thermal environment of the wearer and uses machine learning to infer human activity from the thermal, spatial, and temporal information in that environment. We achieve this by embedding five low-power thermal sensors in a pentahedron configuration, which captures a wide view of the wearer's body and the objects they interact with. We also design a battery-saving mechanism that selectively powers only those sensors necessary for detection. With HeatSight, we unlock thermal sensing as an egocentric modality for future interaction research.
Citations: 3
Development of an Aesthetic for a Stroke Rehabilitation System
Pub Date: 2021-09-21 | DOI: 10.1145/3460421.3478828
Authors: Galina Mihaleva, F. Kow
Abstract: Work on stroke rehabilitation devices has largely focused on improving their performance as aids to physical training. Better physical training allows patients to more quickly achieve adequate autonomy in daily life. However, this leaves the devices' aesthetics underdeveloped, potentially failing to address the impact on patients' self-integrity. This paper proposes the aesthetic development of an immersive multi-sensorial stroke rehabilitation system, MIDAS (Multisensorial Immersive Dynamic Autonomous System), based on the self-affirmation that aesthetic products can provide. MIDAS consists of three subsystems: a hand exoskeleton, a Virtual Reality (VR) subsystem, and an olfactory subsystem. The functional requirements of the system and the design language of the VR subsystem form the basis of the aesthetic framework for the remaining subsystems. The outcomes are a hand exoskeleton with a minimalist linkage system paired with an organic casing, and an olfactory device with a rounded form attached to a frameless face shield.
Citations: 1
RFInsole: Batteryless Gait-Monitoring Smart Insole Based on Passive RFID Tags
Pub Date: 2021-09-21 | DOI: 10.1145/3460421.3478810
Authors: J. Jo, Huiju Park
Abstract: There is growing demand for daily gait-monitoring smart insoles that are light, soft, and comfortable, both among the general population interested in sports and among people with disabilities, such as children with cerebral palsy or Autism Spectrum Disorder. Currently available technologies are battery-powered systems that are bulky, heavy, and not ideal for in-home and everyday use. This study introduces RFInsole, a batteryless smart insole based on RFID (Radio Frequency Identification) that tracks the sequence of foot pressures within a stance. RFInsole uses passive RFID tags and push-button switches so that foot pressure activates the tag at each location. The device's soft, thin, affordable, and simple structure opens broad possibilities for implementing the same system in lighter and softer applications such as socks or pressure-sensing garments.
Citations: 7
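The RFInsole abstract says foot pressure activates a passive tag at each location and that the insole tracks the pressure sequence within a stance. As a rough illustration of what a reader-side decoder might look like, the sketch below maps tag reads to foot regions and orders them by first activation; the tag IDs, region names, and data format are hypothetical, not from the paper.

```python
# Hypothetical tag-ID -> foot-region mapping; names are illustrative only.
REGIONS = {"tag01": "heel", "tag02": "midfoot", "tag03": "forefoot", "tag04": "toe"}

def stance_sequence(reads):
    """reads: iterable of (timestamp_ms, tag_id) pairs from an RFID reader.

    Returns foot regions in order of first activation within one stance,
    ignoring repeat reads of the same tag and unknown tag IDs.
    """
    first_seen = {}
    for timestamp, tag in reads:
        if tag in REGIONS and tag not in first_seen:
            first_seen[tag] = timestamp
    ordered = sorted(first_seen.items(), key=lambda kv: kv[1])
    return [REGIONS[tag] for tag, _ in ordered]
```

A heel-to-toe stance would thus appear as its regions in rollover order, which is the gait information the insole is designed to capture.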
MoCapaci: Posture and Gesture Detection in Loose Garments Using Textile Cables as Capacitive Antennas
Pub Date: 2021-09-21 | DOI: 10.1145/3460421.3480418
Authors: Hymalai Bello, Bo Zhou, Sungho Suh, P. Lukowicz
Abstract: We present a wearable system that detects body postures and gestures without requiring sensors to be firmly fixed to the body or integrated into a tight-fitting garment. The sensing system can be used in a loose piece of clothing such as a coat or blazer. It is based on the well-known theremin musical instrument, which we unobtrusively integrated into a standard men's blazer using conductive textile antennas and OpenTheremin hardware, yielding the "MoCaBlazer" prototype. Fourteen participants with diverse body sizes and balanced gender distribution mimicked 20 arm/torso movements while wearing the unbuttoned, single-sized blazer. State-of-the-art deep learning approaches achieved average recognition accuracies of 97.18% for leave-one-recording-out and 86.25% for user-independent recognition.
Citations: 12
A Pilot Study using Covert Visuospatial Attention as an EEG-based Brain Computer Interface to Enhance AR Interaction
Pub Date: 2021-09-21 | DOI: 10.1145/3460421.3480420
Authors: Nataliya Kosmyna, Chi-Yun Hu, Yujie Wang, Qiuxuan Wu, C. Scheirer, P. Maes
Abstract: In this work we propose a prototype that combines an existing augmented reality (AR) headset, the Microsoft HoloLens 2, with an electroencephalogram (EEG) Brain-Computer Interface (BCI) system based on covert visuospatial attention (CVSA), a process of focusing attention on different regions of the visual field without overt eye movements. The system does not rely on any stimulus-driven responses. Fourteen participants tested the system over the course of two days. To the best of our knowledge, this is the first integrated AR EEG-BCI prototype to explore the complementary features of an AR headset such as the HoloLens 2 and the CVSA paradigm.
Citations: 3