Using Advanced Convolutional Neural Network Approaches to Reveal Patient Age, Gender, and Weight Based on Tongue Images.

IF 2.6 · Region 3 (Biology) · Q3 BIOTECHNOLOGY & APPLIED MICROBIOLOGY
BioMed Research International Pub Date : 2024-08-01 eCollection Date: 2024-01-01 DOI:10.1155/2024/5551209
Xiaoyan Li, Li Li, Jing Wei, Pengwei Zhang, Volodymyr Turchenko, Naresh Vempala, Evgueni Kabakov, Faisal Habib, Arvind Gupta, Huaxiong Huang, Kang Lee
Citations: 0

Abstract

The human tongue has long been believed to be a window in medicine, providing important insights into a patient's health. The present study introduced a novel approach to inferring patient age, gender, and weight from tongue images using pretrained deep convolutional neural networks (CNNs). Our results demonstrated that deep CNN models (e.g., ResNeXt) trained on dorsal tongue images produced excellent results for age prediction, with a Pearson correlation coefficient of 0.71 and a mean absolute error (MAE) of 8.5 years. We also obtained excellent gender classification, with a mean accuracy of 80% and an area under the receiver operating characteristic curve (AUC) of 88%. The ResNeXt model also achieved a moderate level of accuracy for weight prediction, with a Pearson correlation coefficient of 0.39 and an MAE of 9.06 kg. These findings support our hypothesis that the human tongue contains crucial information about a patient. This study demonstrated the feasibility of using pretrained deep CNNs along with a large tongue image dataset to develop computational models that predict patient medical conditions, enabling noninvasive, convenient, and inexpensive patient health monitoring and diagnosis.
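The three evaluation metrics reported in the abstract (Pearson correlation coefficient, MAE, and AUC) can be computed directly from model outputs. The sketch below is illustrative only: the function names and toy data are hypothetical and are not the authors' code or results.

```python
import math

def mae(y_true, y_pred):
    # Mean absolute error between targets and predictions
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

def pearson_r(y_true, y_pred):
    # Pearson correlation coefficient: covariance over product of std devs
    mt = sum(y_true) / len(y_true)
    mp = sum(y_pred) / len(y_pred)
    cov = sum((t - mt) * (p - mp) for t, p in zip(y_true, y_pred))
    var_t = sum((t - mt) ** 2 for t in y_true)
    var_p = sum((p - mp) ** 2 for p in y_pred)
    return cov / math.sqrt(var_t * var_p)

def auc(labels, scores):
    # AUC = probability that a random positive example scores higher
    # than a random negative example; ties count as half a win
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy data (hypothetical, for illustration only)
ages_true = [25.0, 40.0, 60.0, 33.0]
ages_pred = [30.0, 38.0, 55.0, 41.0]
print(mae(ages_true, ages_pred))                      # 5.0
print(round(pearson_r(ages_true, ages_pred), 2))      # 0.95
print(auc([1, 1, 0, 0], [0.9, 0.4, 0.6, 0.2]))        # 0.75
```

For a regression target such as age, the paper reports both Pearson r (how well predictions track the trend) and MAE (average error in the target's own units, here years); for the binary gender task, AUC summarizes ranking quality independent of any decision threshold.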

Source Journal
BioMed Research International
BIOTECHNOLOGY & APPLIED MICROBIOLOGY; MEDICINE, RESEARCH & EXPERIMENTAL
CiteScore: 6.70
Self-citation rate: 0.00%
Articles per year: 1942
Review time: 19 weeks
Journal description: BioMed Research International is a peer-reviewed, Open Access journal that publishes original research articles, review articles, and clinical studies covering a wide range of subjects in life sciences and medicine. The journal is divided into 55 subject areas.