Non-contact temporalis muscle monitoring to detect eating in free-living using smart eyeglasses
Addythia Saphala, Rui Zhang, Trinh Nam Thái, O. Amft
2022 IEEE-EMBS International Conference on Wearable and Implantable Body Sensor Networks (BSN), 27 September 2022
DOI: 10.1109/BSN56160.2022.9928447
Abstract
We investigate non-contact sensing of temporalis muscle contraction in smart eyeglasses frames to detect eating activity. Our approach is based on infrared proximity sensors integrated into sleek eyeglasses frame temples. The proximity sensors capture distance variations between the frame temple and the skin at the frontal, hair-free section of the temporal head region. To analyse distance variations during chewing and other activities, we initially performed an in-lab study in which proximity signals and electromyography (EMG) readings were recorded simultaneously while participants ate foods of varying texture and hardness. Subsequently, we performed a free-living study with 15 participants wearing integrated, fully functional 3D-printed eyeglasses frames, including proximity sensors, processing, storage, and battery, for an average recording duration of 8.3 hours per participant. We propose a new chewing sequence and eating event detection method to process the proximity signals. Free-living retrieval performance reached a precision of 0.83 and 0.68 and a recall of 0.93 and 0.90 for personalised and general detection models, respectively. We conclude that non-contact, proximity-based estimation of chewing sequences and eating events integrated into eyeglasses frames is a highly promising tool for automated dietary monitoring. While personalised models can improve performance, even general models can be practically useful to minimise manual food journalling.
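The abstract does not detail the proposed chewing sequence and eating event detection method. The sketch below is only a generic illustration of how a proximity signal could be processed for this task, assuming a band-pass filter around typical chewing frequencies, peak detection on the filtered signal, and temporal grouping of chew peaks into chewing sequences and eating events. The sampling rate, thresholds, and function names are hypothetical and are not taken from the paper.

```python
# Illustrative sketch only: NOT the authors' method. Assumes a generic pipeline of
# band-pass filtering, peak detection, and temporal grouping of chew peaks.
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

FS = 50.0  # assumed proximity-sensor sampling rate in Hz (hypothetical)


def bandpass(signal, low=0.5, high=3.0, fs=FS, order=4):
    """Band-pass filter around typical chewing frequencies (~1-2 Hz)."""
    nyq = fs / 2.0
    b, a = butter(order, [low / nyq, high / nyq], btype="band")
    return filtfilt(b, a, signal)


def detect_chewing_sequences(proximity, fs=FS, min_prominence=0.05,
                             max_gap_s=2.0, min_chews=4):
    """Group chewing-like peaks in the proximity signal into chewing sequences.

    Returns a list of (start_time_s, end_time_s, n_chews) tuples.
    """
    filtered = bandpass(np.asarray(proximity, dtype=float), fs=fs)
    # Require at least ~0.3 s between successive chew peaks.
    peaks, _ = find_peaks(filtered, prominence=min_prominence,
                          distance=int(0.3 * fs))
    sequences, current = [], []
    for p in peaks:
        if current and (p - current[-1]) / fs > max_gap_s:
            if len(current) >= min_chews:
                sequences.append((current[0] / fs, current[-1] / fs, len(current)))
            current = []
        current.append(p)
    if len(current) >= min_chews:
        sequences.append((current[0] / fs, current[-1] / fs, len(current)))
    return sequences


def group_eating_events(sequences, max_event_gap_s=60.0):
    """Merge chewing sequences separated by short pauses into eating events."""
    events = []
    for start, end, _ in sequences:
        if events and start - events[-1][1] <= max_event_gap_s:
            events[-1] = (events[-1][0], end)
        else:
            events.append((start, end))
    return events


if __name__ == "__main__":
    # Synthetic example: 60 s of noise with a 1.5 Hz "chewing" burst in the middle.
    t = np.arange(0, 60, 1 / FS)
    sig = 0.01 * np.random.randn(t.size)
    burst = (t > 20) & (t < 35)
    sig[burst] += 0.2 * np.sin(2 * np.pi * 1.5 * t[burst])
    seqs = detect_chewing_sequences(sig)
    print(group_eating_events(seqs))
```

In such a pipeline, per-participant tuning of the prominence and gap thresholds would correspond to a personalised model, while fixed thresholds across participants would correspond to a general model, mirroring the two evaluation settings reported in the abstract.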