IRIS: Tapping wearable sensing to capture in-store retail insights on shoppers

Meera Radhakrishnan, S. Eswaran, Archan Misra, D. Chander, K. Dasgupta

2016 IEEE International Conference on Pervasive Computing and Communications (PerCom), 14 March 2016. DOI: 10.1109/PERCOM.2016.7456526
We investigate the possibility of using a combination of a smartphone and a smartwatch, carried by a shopper, to gain insights into the shopper's behavior inside a retail store. The proposed IRIS framework uses standard locomotive and gestural micro-activities as building blocks to define novel composite features that help classify different facets of a shopper's interaction/experience with individual items, as well as attributes of the overall shopping episode or the store. Besides defining such novel features, IRIS builds a novel segmentation algorithm, which partitions the duration of an entire shopping episode into atomic item-level interactions by combining feature-based landmarking, change point detection, and variable-order HMM-based sequence prediction. Through experiments with 50 real-life grocery shopping episodes collected from 25 shoppers, we show that IRIS can demarcate item-level interactions with an accuracy of approximately 91%, and subsequently characterize item- and episode-level shopper behavior with accuracies of over 90%.
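The paper itself does not include code; purely as a rough illustration of one ingredient of the kind of pipeline the abstract describes, the sketch below shows a simple mean-shift change point test over a windowed wrist-motion feature stream, used to split a shopping episode into candidate item-level interaction segments. All function names, parameters, and thresholds here are hypothetical and are not taken from IRIS.

```python
# Minimal sketch (not from the paper): segmenting a 1-D feature stream, e.g. the
# per-window energy of wrist accelerometer motion, into candidate interaction
# segments via a simple mean-shift change point test. Thresholds are illustrative.
import numpy as np


def change_points(feature, win=10, threshold=1.5):
    """Return indices where the windowed mean of `feature` shifts sharply.

    feature   : 1-D array of per-window feature values (e.g. gesture energy)
    win       : number of windows compared on each side of a candidate point
    threshold : minimum absolute difference of means to declare a change
    """
    cps = []
    for t in range(win, len(feature) - win):
        left = feature[t - win:t].mean()
        right = feature[t:t + win].mean()
        if abs(right - left) > threshold:
            # Keep only the first point of a run of consecutive detections.
            if not cps or t - cps[-1] > win:
                cps.append(t)
    return cps


def segments(feature, win=10, threshold=1.5):
    """Split the episode into (start, end) index pairs between change points."""
    cps = change_points(feature, win, threshold)
    bounds = [0] + cps + [len(feature)]
    return list(zip(bounds[:-1], bounds[1:]))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic episode: quiet walking, an item-interaction burst, quiet again.
    stream = np.concatenate([rng.normal(0.5, 0.2, 60),
                             rng.normal(3.0, 0.4, 40),
                             rng.normal(0.5, 0.2, 60)])
    print(segments(stream))
```

In a full system along the lines of IRIS, segment boundaries like these would be refined with feature-based landmarks and a variable-order HMM over micro-activity sequences, rather than relying on a single threshold.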