{"title":"成分导向的RGB-D营养评估融合网络","authors":"Zhihui Feng;Hao Xiong;Weiqing Min;Sujuan Hou;Huichuan Duan;Zhonghua Liu;Shuqiang Jiang","doi":"10.1109/TAFE.2024.3493332","DOIUrl":null,"url":null,"abstract":"The nutritional value of agricultural products is an important indicator for evaluating their quality, which directly affects people's dietary choices and overall well-being. Nutritional assessment studies provide a scientific basis for the production, processing, and marketing of food by analyzing the nutrients they contain. Traditional methods often struggle with suboptimal accuracy and can be time consuming, as well as a shortage of professionals. The progress in artificial intelligence has revolutionized dietary health by offering more accessible methods for food nutritional assessment using vision-based approaches. However, existing vision-based methods using RGB images often face challenges due to varying lighting conditions, impacting the accuracy of nutritional assessment. An alternative is the RGB-D fusion method, which combines RGB images and depth maps. Yet, these methods typically rely on simple fusion techniques that do not ensure precise assessment. Additionally, current vision-based methods struggle to detect small components like oils and sugars on food surfaces, crucial for determining ingredient information and ensuring accurate nutritional assessment. In this pursuit, we propose a novel ingredient-guided RGB-D fusion network that integrates RGB images with depth maps and enables more reliable nutritional assessment guided by ingredient information. Specifically, the multifrequency bimodality fusion module is designed to leverage the correlation between the RGB image and the depth map within the frequency domain. 
Furthermore, the progressive-fusion module and ingredient-guided module leverage ingredient information to explore the potential correlation between ingredients and nutrients, thereby enhancing the guidance for nutritional assessment learning. We evaluate our approach on a variety of ablation settings on Nutrition5k, where it consistently outperforms state-of-the-art methods.","PeriodicalId":100637,"journal":{"name":"IEEE Transactions on AgriFood Electronics","volume":"3 1","pages":"156-166"},"PeriodicalIF":0.0000,"publicationDate":"2024-12-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Ingredient-Guided RGB-D Fusion Network for Nutritional Assessment\",\"authors\":\"Zhihui Feng;Hao Xiong;Weiqing Min;Sujuan Hou;Huichuan Duan;Zhonghua Liu;Shuqiang Jiang\",\"doi\":\"10.1109/TAFE.2024.3493332\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"The nutritional value of agricultural products is an important indicator for evaluating their quality, which directly affects people's dietary choices and overall well-being. Nutritional assessment studies provide a scientific basis for the production, processing, and marketing of food by analyzing the nutrients they contain. Traditional methods often struggle with suboptimal accuracy and can be time consuming, as well as a shortage of professionals. The progress in artificial intelligence has revolutionized dietary health by offering more accessible methods for food nutritional assessment using vision-based approaches. However, existing vision-based methods using RGB images often face challenges due to varying lighting conditions, impacting the accuracy of nutritional assessment. An alternative is the RGB-D fusion method, which combines RGB images and depth maps. Yet, these methods typically rely on simple fusion techniques that do not ensure precise assessment. 
Additionally, current vision-based methods struggle to detect small components like oils and sugars on food surfaces, crucial for determining ingredient information and ensuring accurate nutritional assessment. In this pursuit, we propose a novel ingredient-guided RGB-D fusion network that integrates RGB images with depth maps and enables more reliable nutritional assessment guided by ingredient information. Specifically, the multifrequency bimodality fusion module is designed to leverage the correlation between the RGB image and the depth map within the frequency domain. Furthermore, the progressive-fusion module and ingredient-guided module leverage ingredient information to explore the potential correlation between ingredients and nutrients, thereby enhancing the guidance for nutritional assessment learning. We evaluate our approach on a variety of ablation settings on Nutrition5k, where it consistently outperforms state-of-the-art methods.\",\"PeriodicalId\":100637,\"journal\":{\"name\":\"IEEE Transactions on AgriFood Electronics\",\"volume\":\"3 1\",\"pages\":\"156-166\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2024-12-03\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE Transactions on AgriFood Electronics\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://ieeexplore.ieee.org/document/10774071/\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Transactions on AgriFood 
Electronics","FirstCategoryId":"1085","ListUrlMain":"https://ieeexplore.ieee.org/document/10774071/","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Ingredient-Guided RGB-D Fusion Network for Nutritional Assessment
The nutritional value of agricultural products is an important indicator of their quality, and it directly affects people's dietary choices and overall well-being. Nutritional assessment studies provide a scientific basis for the production, processing, and marketing of food by analyzing the nutrients it contains. Traditional methods often suffer from suboptimal accuracy, are time-consuming, and depend on a limited pool of trained professionals. Progress in artificial intelligence has transformed dietary health by offering more accessible, vision-based methods for food nutritional assessment. However, existing vision-based methods that rely on RGB images alone are sensitive to varying lighting conditions, which degrades the accuracy of nutritional assessment. An alternative is the RGB-D fusion method, which combines RGB images with depth maps. Yet existing RGB-D methods typically rely on simple fusion techniques that do not ensure precise assessment. Additionally, current vision-based methods struggle to detect small components such as oils and sugars on food surfaces, which are crucial for determining ingredient information and ensuring accurate nutritional assessment. To this end, we propose a novel ingredient-guided RGB-D fusion network that integrates RGB images with depth maps and enables more reliable nutritional assessment guided by ingredient information. Specifically, the multifrequency bimodality fusion module is designed to leverage the correlation between the RGB image and the depth map in the frequency domain. Furthermore, the progressive-fusion module and the ingredient-guided module leverage ingredient information to explore the potential correlation between ingredients and nutrients, thereby strengthening the guidance for nutritional assessment learning. We evaluate our approach under a variety of ablation settings on Nutrition5k, where it consistently outperforms state-of-the-art methods.
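The abstract does not specify how the multifrequency bimodality fusion module is implemented. As a purely illustrative sketch of the underlying idea (not the paper's actual design), one can fuse RGB- and depth-derived feature maps in the frequency domain by taking low-frequency content from the depth features (coarse shape and volume) and high-frequency content from the RGB features (fine surface texture). All function names and the radial-cutoff scheme below are hypothetical:

```python
import numpy as np

def multifrequency_fusion(rgb_feat: np.ndarray, depth_feat: np.ndarray,
                          cutoff: float = 0.25) -> np.ndarray:
    """Illustrative frequency-domain fusion of two H x W feature maps.

    Low frequencies (inside a radial cutoff) are taken from the depth
    features, high frequencies from the RGB features, and the blended
    spectrum is inverted back to the spatial domain. This is only a
    toy stand-in for the paper's multifrequency bimodality fusion
    module, whose exact architecture is not given in the abstract.
    """
    assert rgb_feat.shape == depth_feat.shape
    h, w = rgb_feat.shape
    # Centered 2-D spectra of both modalities.
    rgb_spec = np.fft.fftshift(np.fft.fft2(rgb_feat))
    depth_spec = np.fft.fftshift(np.fft.fft2(depth_feat))
    # Radial low-pass mask: True inside the cutoff radius.
    yy, xx = np.mgrid[:h, :w]
    radius = np.hypot(yy - h / 2, xx - w / 2)
    low = radius <= cutoff * min(h, w)
    # Depth supplies low frequencies, RGB supplies high frequencies.
    fused_spec = np.where(low, depth_spec, rgb_spec)
    return np.fft.ifft2(np.fft.ifftshift(fused_spec)).real

# Toy usage on random 8x8 feature maps.
rng = np.random.default_rng(0)
rgb = rng.standard_normal((8, 8))
depth = rng.standard_normal((8, 8))
fused = multifrequency_fusion(rgb, depth)
print(fused.shape)  # (8, 8)
```

Because the FFT is linear and the mask partitions the spectrum, fusing a feature map with itself returns the map unchanged, which is a convenient sanity check for any implementation of this kind.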