Mirror mirror on the wall… An intelligent multisensory mirror for well-being self-assessment
Yasmina Andreu, P. Castellano, S. Colantonio, G. Coppini, R. Favilla, D. Germanese, G. Giannakakis, D. Giorgi, M. Larsson, P. Marraccini, M. Martinelli, B. Matuszewski, Matijia Milanic, M. A. Pascali, M. Pediaditis, Giovanni Raccichini, L. Randeberg, O. Salvetti, T. Strömberg
2015 IEEE International Conference on Multimedia and Expo (ICME), June 2015
DOI: 10.1109/ICME.2015.7177468
Citations: 18
Abstract
The face reveals the health status of an individual through a combination of physical signs and facial expressions. The SEMEOTICONS project is translating the semeiotic code of the human face into computational descriptors and measures that are automatically extracted from videos, images, and 3D scans of the face. SEMEOTICONS is developing a multisensory platform, in the form of a smart mirror, that looks for signs related to cardio-metabolic risk. The goal is to enable users to self-monitor their well-being status over time and to improve their lifestyle through tailored guidance. Building the multisensory mirror requires addressing significant scientific and technological challenges, from touchless data acquisition to real-time processing and integration of multimodal data.
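The paper does not detail its extraction pipeline here, but the idea of computing descriptors from touchless face acquisition can be illustrated with a minimal sketch. The example below is an assumption on my part, not the project's method: it uses OpenCV's stock Haar-cascade face detector and a toy per-channel colour statistic as the "descriptor"; the function name face_descriptor and the webcam source are hypothetical choices for illustration.

```python
# Minimal sketch: touchless acquisition of a simple face descriptor from a webcam frame.
# Assumes OpenCV (cv2) and NumPy are installed; the cascade file ships with OpenCV.
import cv2
import numpy as np

def face_descriptor(frame_bgr):
    """Detect the largest face in a frame and return a toy colour descriptor
    (per-channel mean and std inside the face region), or None if no face is found."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None
    x, y, w, h = max(faces, key=lambda f: f[2] * f[3])  # keep the largest detection
    roi = frame_bgr[y:y + h, x:x + w].astype(np.float32)
    # Per-channel mean and standard deviation stand in for a real computational descriptor.
    return np.concatenate([roi.mean(axis=(0, 1)), roi.std(axis=(0, 1))])

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)  # contactless acquisition from a webcam
    ok, frame = cap.read()
    cap.release()
    if ok:
        print(face_descriptor(frame))
```

In a real system such descriptors would be tracked over repeated self-assessment sessions and fused with other modalities (e.g. 3D scans), which is where the real-time multimodal-integration challenges mentioned in the abstract arise.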