Unobtrusive intake actions monitoring through RGB and depth information fusion
Enea Cippitelli, Samuele Gasparrini, E. Gambi, S. Spinsante
2016 IEEE 12th International Conference on Intelligent Computer Communication and Processing (ICCP), September 2016
DOI: 10.1109/ICCP.2016.7737116
This paper presents a solution, based on a data fusion approach, to monitor the food and drink intake actions of elderly people during their activities of daily living. The system is non-intrusive and completely transparent to the user. The developed monitoring technique overcomes the need to rely on direct assistance or diary-based self-monitoring. The proposed solution exploits a depth and RGB camera placed on the ceiling, in a top-down view. Starting from the depth information, an adapted version of the Self-Organized Map algorithm is applied to a defined skeleton model to track the person's movements. The RGB stream is used to recognize specific elements located on the table during eating-related activities, such as glasses. The fusion of these processed data leads to the identification of specific intake behaviours. The system's performance has been successfully tested with healthy volunteers of different ages and heights; the results are promising and confirm the system's capacity to recognize intake activity.
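The abstract mentions fitting a defined skeleton model to the depth data with an adapted Self-Organized Map (SOM). The paper's specific adaptation, skeleton topology, and parameters are not given here, so the sketch below only illustrates the general idea with a standard SOM update that pulls a small set of skeleton-node prototypes toward 3D points segmented from a depth frame; the function name, node layout, and learning rates are assumptions chosen for clarity.

```python
# Illustrative sketch only: a minimal SOM-style update that fits skeleton-node
# prototypes to 3D points sampled from a depth frame. Not the authors' adapted
# algorithm; neighbourhood handling and learning rates are assumed values.
import numpy as np

def som_fit_skeleton(points, nodes, neighbors, epochs=10, lr=0.1, nbr_lr=0.05):
    """Pull skeleton nodes toward the person's point cloud, SOM-style.

    points    : (N, 3) array of 3D points segmented from the depth map
    nodes     : (K, 3) array of initial skeleton-node positions
    neighbors : list of K lists; neighbors[k] holds the indices of the nodes
                connected to node k in the skeleton graph
    """
    nodes = nodes.copy()
    for _ in range(epochs):
        for p in np.random.permutation(points):
            # Best-matching unit: the skeleton node closest to the sample.
            k = np.argmin(np.linalg.norm(nodes - p, axis=1))
            # Move the winning node toward the sample...
            nodes[k] += lr * (p - nodes[k])
            # ...and its skeleton neighbours by a smaller amount, so the
            # model keeps its connected structure while following the body.
            for j in neighbors[k]:
                nodes[j] += nbr_lr * (p - nodes[j])
    return nodes
```

In the full system described by the abstract, intake events would then be inferred by fusing the tracked skeleton movements with the RGB-based recognition of objects on the table, such as glasses; the fusion and decision logic itself is not detailed in the abstract.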