{"title":"A Health Profiling Framework for Children Leveraging Multimodal Learning Based on Ambient Sensor Signals","authors":"Zhihan Jiang, Cong Xie, Edith C. H. Ngai","doi":"10.1109/ICASSPW59220.2023.10192968","DOIUrl":null,"url":null,"abstract":"Traditional methods for health profiling are usually expensive and require specialized expertise. The growing prevalence and development of wearable devices have made it feasible to collect ambient sensor signals, providing us with new opportunities to profile children’s health in a cost-effective and comprehensive manner. Inspired by recent works in multimodal learning, we propose a health profiling framework for children. First, we extract context and motion patterns from their personal and family characteristics and acceleration signals. Then, context and motion embeddings are generated by two encoders and input into a lightweight neural network to profile children’s health from the perspectives of physical activity intensity, physical functioning, health confidence, psychosocial functioning, resilience, and connectedness. We evaluate the proposed method on real-world datasets, and the results show its outstanding performance. Specifically, the context pattern is effective in profiling children’s health, while the motion pattern is significantly effective in assessing children’s physical activity intensity.","PeriodicalId":158726,"journal":{"name":"2023 IEEE International Conference on Acoustics, Speech, and Signal Processing Workshops (ICASSPW)","volume":"140 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-06-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2023 IEEE International Conference on Acoustics, Speech, and Signal Processing Workshops (ICASSPW)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICASSPW59220.2023.10192968","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
Traditional methods for health profiling are usually expensive and require specialized expertise. The growing prevalence and capability of wearable devices have made it feasible to collect ambient sensor signals, providing new opportunities to profile children’s health in a cost-effective and comprehensive manner. Inspired by recent work in multimodal learning, we propose a health profiling framework for children. First, we extract context and motion patterns from their personal and family characteristics and acceleration signals. Then, context and motion embeddings are generated by two encoders and fed into a lightweight neural network to profile children’s health from the perspectives of physical activity intensity, physical functioning, health confidence, psychosocial functioning, resilience, and connectedness. We evaluate the proposed method on real-world datasets, and the results show its strong performance. Specifically, the context pattern is effective for profiling children’s health overall, while the motion pattern is particularly effective for assessing children’s physical activity intensity.
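The fusion architecture described in the abstract (two modality-specific encoders whose embeddings are concatenated and passed to a lightweight prediction network) can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the use of PyTorch, the module names (ContextEncoder, MotionEncoder, HealthProfiler), the layer sizes, and the input shapes are all assumptions made for illustration.

```python
# Minimal sketch of a two-encoder multimodal fusion model.
# Framework choice (PyTorch), module names, layer sizes, and input
# shapes are illustrative assumptions, not the paper's released code.
import torch
import torch.nn as nn


class ContextEncoder(nn.Module):
    """Encodes tabular personal/family context features into an embedding."""

    def __init__(self, in_dim: int, embed_dim: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, 128), nn.ReLU(),
            nn.Linear(128, embed_dim),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)  # (batch, embed_dim)


class MotionEncoder(nn.Module):
    """Encodes tri-axial acceleration windows into an embedding via 1-D convolutions."""

    def __init__(self, embed_dim: int = 64):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv1d(3, 32, kernel_size=7, padding=3), nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=7, padding=3), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),  # pool over time
        )
        self.proj = nn.Linear(64, embed_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, 3 axes, time_steps)
        return self.proj(self.conv(x).squeeze(-1))  # (batch, embed_dim)


class HealthProfiler(nn.Module):
    """Concatenates both embeddings and predicts the six health dimensions."""

    def __init__(self, context_dim: int, embed_dim: int = 64, num_outputs: int = 6):
        super().__init__()
        self.context_enc = ContextEncoder(context_dim, embed_dim)
        self.motion_enc = MotionEncoder(embed_dim)
        self.head = nn.Sequential(  # lightweight prediction network
            nn.Linear(2 * embed_dim, 64), nn.ReLU(),
            nn.Linear(64, num_outputs),
        )

    def forward(self, context: torch.Tensor, motion: torch.Tensor) -> torch.Tensor:
        z = torch.cat([self.context_enc(context), self.motion_enc(motion)], dim=-1)
        return self.head(z)  # one score per health dimension


if __name__ == "__main__":
    model = HealthProfiler(context_dim=20)
    context = torch.randn(8, 20)         # 8 children, 20 assumed context features
    motion = torch.randn(8, 3, 500)      # 8 tri-axial acceleration windows
    print(model(context, motion).shape)  # torch.Size([8, 6])
```

The six outputs correspond to the health dimensions listed in the abstract (physical activity intensity, physical functioning, health confidence, psychosocial functioning, resilience, and connectedness); how each target is scored and which loss is used are not specified in the abstract and are left open here.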