An Intelligent Food-Intake Monitoring System Using Wearable Sensors
Jindong Liu, Edward Johns, L. Atallah, C. Pettitt, Benny P. L. Lo, G. Frost, Guang-Zhong Yang
2012 Ninth International Conference on Wearable and Implantable Body Sensor Networks, 9 May 2012. DOI: 10.1109/BSN.2012.11 (https://doi.org/10.1109/BSN.2012.11). Cited by 109.
The prevalence of obesity worldwide presents a great challenge to existing healthcare systems. There is a general need for pervasive monitoring of the dietary behaviour of those who are at risk of co-morbidities. Currently, however, there is no accurate method of assessing the nutritional intake of people in their home environment. Traditional methods require subjects to manually respond to questionnaires for analysis, which is subjective, prone to errors, and makes consistency and compliance difficult to ensure. In this paper, we present a wearable sensor platform that autonomously provides detailed information regarding a subject's dietary habits. The sensor consists of a microphone and a camera and is worn discreetly on the ear. Sound features are extracted in real time, and if a chewing activity is classified, the camera captures a video sequence for further analysis. From this sequence, a number of key frames are extracted to represent important episodes during the course of a meal. Results show a high classification rate for chewing activities, and the visual log provides a detailed overview of the subject's food intake that is difficult to quantify from manually acquired food records.
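The abstract describes a two-stage, audio-triggered pipeline: sound features are classified continuously, and only confirmed chewing activity switches on the camera, whose clip is then reduced to key frames. The sketch below illustrates that control flow only; the toy feature set, the threshold-based chewing check, and the capture_video / extract_key_frames callables are hypothetical stand-ins, not the authors' implementation.

```python
"""Minimal sketch of an audio-triggered capture loop, assuming simple
energy/zero-crossing features and a threshold rule in place of the paper's
trained chewing-sound classifier."""

from collections import deque
import numpy as np


def frame_features(frame: np.ndarray) -> np.ndarray:
    """Toy sound features for one audio frame: log energy and zero-crossing rate.
    (Stand-ins for the paper's real feature set.)"""
    energy = np.log(np.sum(frame ** 2) + 1e-12)
    zcr = np.mean(np.abs(np.diff(np.sign(frame)))) / 2.0
    return np.array([energy, zcr])


def is_chewing(features: np.ndarray,
               energy_thresh: float = -6.0,
               zcr_thresh: float = 0.25) -> bool:
    """Placeholder classifier: a fixed threshold rule used only to show the
    control flow, not the trained classifier reported in the paper."""
    energy, zcr = features
    return energy > energy_thresh and zcr < zcr_thresh


def monitor(audio_frames, capture_video, extract_key_frames,
            confirm_window: int = 5):
    """Stream audio frames; once several consecutive frames are classified as
    chewing, trigger the camera and keep only key frames of the clip."""
    recent = deque(maxlen=confirm_window)
    visual_log = []
    for frame in audio_frames:
        recent.append(is_chewing(frame_features(frame)))
        if len(recent) == confirm_window and all(recent):
            clip = capture_video()              # record a short video sequence
            visual_log.extend(extract_key_frames(clip))
            recent.clear()                      # wait for the next chewing bout
    return visual_log
```

Gating the camera on the always-on audio classifier means video is recorded only during eating episodes, which is one plausible reason for the trigger-based design the abstract outlines: it limits both power consumption and the amount of footage that must be reviewed.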