Novel Application of an RGB-D Camera for Face-Direction Measurements and Object Detection: Towards Understanding Museum Visitors' Experiences

Nanami Saito, F. Kusunoki, S. Inagaki, H. Mizoguchi
{"title":"Novel Application of an RGB-D Camera for Face-Direction Measurements and Object Detection: Towards Understanding Museum Visitors' Experiences","authors":"Nanami Saito, F. Kusunoki, S. Inagaki, H. Mizoguchi","doi":"10.1109/ICST46873.2019.9047675","DOIUrl":null,"url":null,"abstract":"Gaze measurement techniques can be used to understand the experiences of visitors at museums. In previous research, researchers used a wearable sensor such as an eye tracker to estimate what a visitor was looking at. Although it is possible to measure this experimentally by using a wearable sensor, it is difficult to measure the experience of all museum visitors with such an approach. Therefore, in this study, we propose a method for measuring visitors' experiences with non-wearable type sensors. In this method, we measure the face direction, which is said to be related to the gaze direction of the visitors, by using non-wearable Kinect sensors. We also have to detect objects because we need information about what people are looking at to measure the visitors' experiences. In this paper, we describe the method to estimate the gaze target of the visitor, which is referred to as the visitor's experience, based on the face-direction measurements and object detection data collected simultaneously, and an experiment is conducted to evaluate its effectiveness.","PeriodicalId":344937,"journal":{"name":"2019 13th International Conference on Sensing Technology (ICST)","volume":"100 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2019-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2019 13th International Conference on Sensing Technology (ICST)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICST46873.2019.9047675","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Gaze-measurement techniques can be used to understand the experiences of museum visitors. In previous research, wearable sensors such as eye trackers were used to estimate what a visitor was looking at. Although such wearable sensors can capture this information in experimental settings, it is difficult to measure the experience of every museum visitor with that approach. In this study, we therefore propose a method for measuring visitors' experiences with non-wearable sensors. In this method, we use non-wearable Kinect sensors to measure the face direction, which is reported to be related to the gaze direction of the visitor. We also detect objects, because estimating a visitor's experience requires information about what the visitor is looking at. In this paper, we describe a method that estimates a visitor's gaze target, which we treat as the visitor's experience, from face-direction measurements and object-detection data collected simultaneously, and we conduct an experiment to evaluate its effectiveness.
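The abstract outlines the core estimation step: combine a visitor's measured face direction with the positions of detected objects to decide which exhibit the visitor is facing. The following Python sketch illustrates one plausible way to do this; it is not the authors' implementation, and the function name, coordinate conventions, angular threshold, and exhibit data are illustrative assumptions only.

```python
# Illustrative sketch (not the authors' method): pick the detected object
# closest to the ray defined by a visitor's head position and face direction,
# both measured in the RGB-D sensor's coordinate frame.
import numpy as np

def estimate_gaze_target(head_pos, face_dir, objects, max_angle_deg=20.0):
    """Return the label of the object nearest to the face-direction ray,
    or None if no object lies within max_angle_deg of that ray.

    head_pos : (3,) head position in sensor coordinates [m]
    face_dir : (3,) face-direction vector (need not be unit length)
    objects  : dict mapping label -> (3,) object centre position [m]
    """
    face_dir = np.asarray(face_dir, dtype=float)
    face_dir /= np.linalg.norm(face_dir)
    best_label, best_angle = None, np.inf
    for label, obj_pos in objects.items():
        to_obj = np.asarray(obj_pos, dtype=float) - np.asarray(head_pos, dtype=float)
        to_obj /= np.linalg.norm(to_obj)
        # Angle between the face-direction ray and the direction to the object.
        angle = np.degrees(np.arccos(np.clip(np.dot(face_dir, to_obj), -1.0, 1.0)))
        if angle < best_angle:
            best_label, best_angle = label, angle
    return best_label if best_angle <= max_angle_deg else None

# Example: two hypothetical exhibits detected in the same sensor frame.
exhibits = {"dinosaur skeleton": np.array([1.5, 0.2, 3.0]),
            "mineral case": np.array([-1.0, 0.0, 2.5])}
target = estimate_gaze_target(np.array([0.0, 0.0, 0.5]),
                              np.array([0.4, 0.05, 0.9]), exhibits)
print(target)  # -> "dinosaur skeleton"
```

A simple angular-offset criterion like this is one way to attribute a face direction to one of several candidate objects; the paper itself evaluates its approach experimentally rather than prescribing a particular threshold, so the 20-degree cutoff here is purely an assumption for the example.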