A Computer Vision Approach for Pedestrian Walking Direction Estimation with Wearable Inertial Sensors: PatternNet

Hanyuan Fu, Thomas Bonis, V. Renaudin, Ni Zhu
{"title":"A Computer Vision Approach for Pedestrian Walking Direction Estimation with Wearable Inertial Sensors: PatternNet","authors":"Hanyuan Fu, Thomas Bonis, V. Renaudin, Ni Zhu","doi":"10.1109/PLANS53410.2023.10140028","DOIUrl":null,"url":null,"abstract":"In this paper, we propose an image-based neural network approach (PatternNet) for walking direction estimation with wearable inertial sensors. Gait event segmentation and projection are used to convert the inertial signals to image-like tabular samples, from which a Convolutional neural network (CNN) extracts geometrical features for walking direction inference. To embrace the diversity of individual walking characteristics and different ways to carry the device, tailor-made models are constructed based on individual users' gait characteristics and the device-carrying mode. Experimental assessments of the proposed method and a competing method (RoNIN) are carried out in real-life situations and over 3 km total walking distance, covering indoor and outdoor environments, involving both sighted and visually impaired volunteers carrying the device in three different ways: texting, swinging and in a jacket pocket. PatternNet estimates the walking directions with a mean accuracy between 7 to 10 degrees for the three test persons and is 1.5 times better than RONIN estimates.","PeriodicalId":344794,"journal":{"name":"2023 IEEE/ION Position, Location and Navigation Symposium (PLANS)","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2023-04-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2023 IEEE/ION Position, Location and Navigation Symposium (PLANS)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/PLANS53410.2023.10140028","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

In this paper, we propose an image-based neural network approach (PatternNet) for walking direction estimation with wearable inertial sensors. Gait event segmentation and projection are used to convert the inertial signals into image-like tabular samples, from which a convolutional neural network (CNN) extracts geometrical features for walking direction inference. To accommodate the diversity of individual walking characteristics and the different ways of carrying the device, tailor-made models are constructed for each user's gait characteristics and device-carrying mode. Experimental assessments of the proposed method and a competing method (RoNIN) are carried out in real-life situations over more than 3 km of total walking distance, covering indoor and outdoor environments and involving both sighted and visually impaired volunteers carrying the device in three different ways: texting, swinging, and in a jacket pocket. PatternNet estimates the walking direction with a mean accuracy between 7 and 10 degrees for the three test persons, which is 1.5 times better than the RoNIN estimates.
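The abstract describes the pipeline only at a high level, so the sketch below illustrates one plausible reading of it rather than the authors' implementation: inertial signals are segmented at gait events, each gait cycle is projected into a fixed-size, image-like sample, and a small CNN regresses the walking direction. The 6-channel IMU layout, the 64-step resampling length, the (sin, cos) heading encoding, and all function and class names are assumptions introduced for illustration and are not taken from the paper.

```python
# Minimal sketch (not the authors' code) of the PatternNet idea as summarized
# in the abstract. Assumptions: 3-axis accelerometer + 3-axis gyroscope streams,
# gait cycles resampled to 64 time steps, heading regressed as (sin, cos) to
# avoid the 0/360 degree wrap-around.

import numpy as np
import torch
import torch.nn as nn


def segment_and_project(acc, gyr, step_indices, length=64):
    """Cut gait cycles out of the IMU streams and resample each one to a
    fixed-size, image-like (channels x time) sample. `step_indices` would come
    from a gait event detector (e.g. heel-strike detection), not shown here."""
    samples = []
    for start, end in zip(step_indices[:-1], step_indices[1:]):
        cycle = np.concatenate([acc[start:end], gyr[start:end]], axis=1)  # (T, 6)
        t_old = np.linspace(0.0, 1.0, cycle.shape[0])
        t_new = np.linspace(0.0, 1.0, length)
        resampled = np.stack([np.interp(t_new, t_old, cycle[:, c]) for c in range(6)])
        samples.append(resampled.astype(np.float32))  # (6, length)
    return np.stack(samples)  # (N, 6, length)


class DirectionCNN(nn.Module):
    """Small 1-D CNN regressing (sin, cos) of the walking direction from one sample."""

    def __init__(self, channels=6):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(channels, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        self.head = nn.Linear(64, 2)  # (sin, cos) of the heading angle

    def forward(self, x):
        z = self.features(x).squeeze(-1)          # (N, 64)
        out = self.head(z)                         # (N, 2)
        return out / out.norm(dim=-1, keepdim=True).clamp(min=1e-8)


if __name__ == "__main__":
    # Synthetic example: 10 s of fake IMU data with a step event every ~1 s.
    acc = np.random.randn(1000, 3)
    gyr = np.random.randn(1000, 3)
    steps = np.arange(0, 1000, 100)
    samples = segment_and_project(acc, gyr, steps)          # (9, 6, 64)
    model = DirectionCNN()
    sincos = model(torch.from_numpy(samples))                # (9, 2)
    heading_deg = torch.rad2deg(torch.atan2(sincos[:, 0], sincos[:, 1]))
    print(heading_deg.shape)
```

Following the paper's idea of tailor-made models, one such network would be trained per user and per carrying mode (texting, swinging, jacket pocket); the synthetic example above only verifies that tensor shapes flow through the sketch.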