IRIS: Tapping wearable sensing to capture in-store retail insights on shoppers

Meera Radhakrishnan, S. Eswaran, Archan Misra, D. Chander, K. Dasgupta
{"title":"IRIS:利用可穿戴传感技术,捕捉店内消费者的零售信息","authors":"Meera Radhakrishnan, S. Eswaran, Archan Misra, D. Chander, K. Dasgupta","doi":"10.1109/PERCOM.2016.7456526","DOIUrl":null,"url":null,"abstract":"We investigate the possibility of using a combination of a smartphone and a smartwatch, carried by a shopper, to get insights into the shopper's behavior inside a retail store. The proposed IRIS framework uses standard locomotive and gestural micro-activities as building blocks to define novel composite features that help classify different facets of a shopper's interaction/experience with individual items, as well as attributes of the overall shopping episode or the store. Besides defining such novel features, IRIS builds a novel segmentation algorithm, which partitions the duration of an entire shopping episode into atomic item-level interactions, by using a combination of feature-based landmarking, change point detection and variable-order HMM-based sequence prediction. Experiments with 50 real-life grocery shopping episodes, collected from 25 shoppers, we show that IRIS can demarcate item-level interactions with an accuracy of approx. 91%, and subsequently characterize item-and-episode level shopper behavior with accuracies of over 90%.","PeriodicalId":275797,"journal":{"name":"2016 IEEE International Conference on Pervasive Computing and Communications (PerCom)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2016-03-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"29","resultStr":"{\"title\":\"IRIS: Tapping wearable sensing to capture in-store retail insights on shoppers\",\"authors\":\"Meera Radhakrishnan, S. Eswaran, Archan Misra, D. Chander, K. Dasgupta\",\"doi\":\"10.1109/PERCOM.2016.7456526\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"We investigate the possibility of using a combination of a smartphone and a smartwatch, carried by a shopper, to get insights into the shopper's behavior inside a retail store. The proposed IRIS framework uses standard locomotive and gestural micro-activities as building blocks to define novel composite features that help classify different facets of a shopper's interaction/experience with individual items, as well as attributes of the overall shopping episode or the store. Besides defining such novel features, IRIS builds a novel segmentation algorithm, which partitions the duration of an entire shopping episode into atomic item-level interactions, by using a combination of feature-based landmarking, change point detection and variable-order HMM-based sequence prediction. Experiments with 50 real-life grocery shopping episodes, collected from 25 shoppers, we show that IRIS can demarcate item-level interactions with an accuracy of approx. 
91%, and subsequently characterize item-and-episode level shopper behavior with accuracies of over 90%.\",\"PeriodicalId\":275797,\"journal\":{\"name\":\"2016 IEEE International Conference on Pervasive Computing and Communications (PerCom)\",\"volume\":\"1 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2016-03-14\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"29\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2016 IEEE International Conference on Pervasive Computing and Communications (PerCom)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/PERCOM.2016.7456526\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2016 IEEE International Conference on Pervasive Computing and Communications (PerCom)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/PERCOM.2016.7456526","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 29

Abstract

We investigate the possibility of using a combination of a smartphone and a smartwatch, carried by a shopper, to get insights into the shopper's behavior inside a retail store. The proposed IRIS framework uses standard locomotive and gestural micro-activities as building blocks to define novel composite features that help classify different facets of a shopper's interaction/experience with individual items, as well as attributes of the overall shopping episode or the store. Besides defining such features, IRIS introduces a segmentation algorithm that partitions the duration of an entire shopping episode into atomic item-level interactions, using a combination of feature-based landmarking, change point detection and variable-order HMM-based sequence prediction. Through experiments with 50 real-life grocery shopping episodes, collected from 25 shoppers, we show that IRIS can demarcate item-level interactions with an accuracy of approximately 91%, and subsequently characterize item- and episode-level shopper behavior with accuracies of over 90%.
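The segmentation step described in the abstract relies in part on change point detection to carve a shopping episode into candidate item-level interactions. The sketch below is a minimal illustration of that one idea, not the authors' implementation: a simple mean-shift detector run over a synthetic wrist-accelerometer magnitude signal. All function names, parameters, and the synthetic data are illustrative assumptions; in the paper, such boundary candidates are further combined with feature-based landmarking and variable-order HMM sequence prediction.

```python
# Minimal sketch (illustrative only) of change-point-based segmentation of a
# wearable sensor stream, one component of the IRIS segmentation pipeline.
# Synthetic data stands in for real smartwatch accelerometer magnitudes.
import numpy as np

def accel_magnitude(xyz):
    """Collapse 3-axis accelerometer samples (N x 3) into a 1-D magnitude signal."""
    return np.linalg.norm(xyz, axis=1)

def change_points(signal, window=50, threshold=1.0):
    """Flag indices where the local mean of the signal shifts noticeably.

    Compares the mean of the preceding and following windows at each sample;
    a large absolute difference suggests a transition between micro-activities
    (e.g., walking down an aisle vs. dwelling and reaching for an item).
    """
    points = []
    for t in range(window, len(signal) - window):
        before = signal[t - window:t].mean()
        after = signal[t:t + window].mean()
        if abs(after - before) > threshold:
            # Keep only the first index of a run of consecutive detections.
            if not points or t - points[-1] > window:
                points.append(t)
    return points

def segments_from_change_points(points, length):
    """Turn boundary indices into (start, end) segments covering the episode."""
    bounds = [0] + points + [length]
    return list(zip(bounds[:-1], bounds[1:]))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic episode: walking (higher, noisier magnitude), dwelling at a shelf, walking again.
    walking = rng.normal(loc=11.0, scale=1.5, size=400)
    dwelling = rng.normal(loc=9.8, scale=0.3, size=300)
    signal = np.concatenate([walking, dwelling, walking])
    cps = change_points(signal, window=50, threshold=0.8)
    print("Candidate boundaries:", cps)
    print("Segments:", segments_from_change_points(cps, len(signal)))
```

A detector this simple will over- or under-segment on real data; it is only meant to show where change point detection fits among the abstract's three segmentation ingredients.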