When2Trigger: Evaluation Trade-offs in Vision-based Real-Time Eating Detection Systems

Soroush Shahi, Glenn Fernandes, Chris Romano, Nabil Alshurafa
{"title":"When2Trigger: Evaluation Trade-offs in Vision-based Real-Time Eating Detection Systems.","authors":"Soroush Shahi, Glenn Fernandes, Chris Romano, Nabil Alshurafa","doi":"10.1109/bsn63547.2024.10780481","DOIUrl":null,"url":null,"abstract":"<p><p>Wearable camera and thermal sensing systems are increasingly used for real-time eating detection and timely notifications to remind users to log their meals. However, confounding gestures such as irrelevant hand movements can cause false device confirmations of eating in real-time. Delaying the device confirmation of an eating episode, until the system is certain, can improve accuracy of eating detection, but prevents the capture of shorter bouts of eating. Balancing the trade-off between errors and detection delay is key to developing effective methods that provide immediate user feedback. This paper presents a real-time, hand-object-based method for automated detection of eating and drinking gestures and identifies the minimum number of gestures needed to reliably detect an eating episode. Unlike prior work, our method considers both hand motion and the object-in-hand and uses a low-power thermal sensor to reduce false positives. We evaluated our method on 36 participants, 28 of whom wore a wearable camera for up to 14 days in free-living environments. The results show that eating episodes can be accurately detected using 10 gestures or within the first 1.5 minutes of the eating episode, achieving an F1-score of 89.0%. Our findings provide evaluation guidelines for designing real-time intervention systems to address problematic eating behaviors.</p>","PeriodicalId":72028,"journal":{"name":"... International Conference on Wearable and Implantable Body Sensor Networks. International Conference on Wearable and Implantable Body Sensor Networks","volume":"2024 ","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11864366/pdf/","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"... International Conference on Wearable and Implantable Body Sensor Networks. International Conference on Wearable and Implantable Body Sensor Networks","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/bsn63547.2024.10780481","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"2024/12/11 0:00:00","PubModel":"Epub","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

Wearable camera and thermal sensing systems are increasingly used for real-time eating detection and timely notifications that remind users to log their meals. However, confounding gestures, such as irrelevant hand movements, can cause false device confirmations of eating in real time. Delaying the device's confirmation of an eating episode until the system is certain can improve the accuracy of eating detection, but it prevents the capture of shorter bouts of eating. Balancing the trade-off between errors and detection delay is key to developing effective methods that provide immediate user feedback. This paper presents a real-time, hand-object-based method for automated detection of eating and drinking gestures and identifies the minimum number of gestures needed to reliably detect an eating episode. Unlike prior work, our method considers both hand motion and the object in hand, and it uses a low-power thermal sensor to reduce false positives. We evaluated our method on 36 participants, 28 of whom wore a wearable camera for up to 14 days in free-living environments. The results show that eating episodes can be accurately detected using 10 gestures, or within the first 1.5 minutes of the eating episode, achieving an F1-score of 89.0%. Our findings provide evaluation guidelines for designing real-time intervention systems that address problematic eating behaviors.
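To make the delay-versus-error trade-off concrete, the sketch below shows a minimal gesture-count / elapsed-time confirmation trigger of the kind the abstract describes. It is an illustrative assumption, not the authors' implementation: the class name, thresholds, and streaming interface are hypothetical, and the paper's actual pipeline (hand-object detection combined with a low-power thermal-sensor check) is not reproduced here.

```python
# Hypothetical sketch of a confirmation trigger: confirm an eating episode once
# enough eating gestures have accumulated, or once a maximum delay has elapsed.
# Thresholds (10 gestures, ~1.5 minutes) are taken from the abstract; everything
# else is an assumption for illustration only.
from dataclasses import dataclass


@dataclass
class EatingEpisodeTrigger:
    min_gestures: int = 10        # gesture threshold reported in the abstract
    max_delay_s: float = 90.0     # ~1.5 minutes, also from the abstract
    _count: int = 0               # eating gestures seen so far in this episode
    _start_time: float | None = None  # timestamp of the first eating gesture

    def update(self, timestamp_s: float, is_eating_gesture: bool) -> bool:
        """Feed one detected gesture event; return True once the episode is confirmed."""
        if not is_eating_gesture:
            # Confounding gestures (irrelevant hand movements) are ignored here;
            # a real system might also decay the count or reset after long gaps.
            return False
        if self._start_time is None:
            self._start_time = timestamp_s
        self._count += 1
        elapsed = timestamp_s - self._start_time
        return self._count >= self.min_gestures or elapsed >= self.max_delay_s

    def reset(self) -> None:
        """Clear state after an episode is confirmed or abandoned."""
        self._count = 0
        self._start_time = None


# Example: a gesture roughly every 6 seconds confirms on the 10th gesture (~54 s in).
trigger = EatingEpisodeTrigger()
for ts in (t * 6.0 for t in range(12)):
    if trigger.update(ts, is_eating_gesture=True):
        print(f"Episode confirmed at t={ts:.0f}s")
        break
```

The design choice the sketch illustrates is the one the paper evaluates: raising the gesture threshold suppresses false confirmations caused by confounding gestures, while lowering it (or the time cap) lets the system capture shorter bouts of eating and notify the user sooner.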
