Sex estimation from maxillofacial radiographs using a deep learning approach.

IF 1.9 | CAS Zone 4 (Medicine) | JCR Q2 (Dentistry, Oral Surgery & Medicine)
Hiroki Hase, Y. Mine, S. Okazaki, Yuki Yoshimi, S. Ito, Tzu-Yu Peng, Mizuho Sano, Yuma Koizumi, Naoya Kakimoto, Kotaro Tanimoto, Takeshi Murayama
{"title":"利用深度学习方法从颌面部 X 光片估测性别。","authors":"Hiroki Hase, Y. Mine, S. Okazaki, Yuki Yoshimi, S. Ito, Tzu-Yu Peng, Mizuho Sano, Yuma Koizumi, Naoya Kakimoto, Kotaro Tanimoto, Takeshi Murayama","doi":"10.4012/dmj.2023-253","DOIUrl":null,"url":null,"abstract":"The purpose of this study was to construct deep learning models for more efficient and reliable sex estimation. Two deep learning models, VGG16 and DenseNet-121, were used in this retrospective study. In total, 600 lateral cephalograms were analyzed. A saliency map was generated by gradient-weighted class activation mapping for each output. The two deep learning models achieved high values in each performance metric according to accuracy, sensitivity (recall), precision, F1 score, and areas under the receiver operating characteristic curve. Both models showed substantial differences in the positions indicated in saliency maps for male and female images. The positions in saliency maps also differed between VGG16 and DenseNet-121, regardless of sex. This analysis of our proposed system suggested that sex estimation from lateral cephalograms can be achieved with high accuracy using deep learning.","PeriodicalId":11065,"journal":{"name":"Dental materials journal","volume":null,"pages":null},"PeriodicalIF":1.9000,"publicationDate":"2024-04-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Sex estimation from maxillofacial radiographs using a deep learning approach.\",\"authors\":\"Hiroki Hase, Y. Mine, S. Okazaki, Yuki Yoshimi, S. Ito, Tzu-Yu Peng, Mizuho Sano, Yuma Koizumi, Naoya Kakimoto, Kotaro Tanimoto, Takeshi Murayama\",\"doi\":\"10.4012/dmj.2023-253\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"The purpose of this study was to construct deep learning models for more efficient and reliable sex estimation. Two deep learning models, VGG16 and DenseNet-121, were used in this retrospective study. In total, 600 lateral cephalograms were analyzed. A saliency map was generated by gradient-weighted class activation mapping for each output. The two deep learning models achieved high values in each performance metric according to accuracy, sensitivity (recall), precision, F1 score, and areas under the receiver operating characteristic curve. Both models showed substantial differences in the positions indicated in saliency maps for male and female images. The positions in saliency maps also differed between VGG16 and DenseNet-121, regardless of sex. 
This analysis of our proposed system suggested that sex estimation from lateral cephalograms can be achieved with high accuracy using deep learning.\",\"PeriodicalId\":11065,\"journal\":{\"name\":\"Dental materials journal\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":1.9000,\"publicationDate\":\"2024-04-11\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Dental materials journal\",\"FirstCategoryId\":\"5\",\"ListUrlMain\":\"https://doi.org/10.4012/dmj.2023-253\",\"RegionNum\":4,\"RegionCategory\":\"医学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"DENTISTRY, ORAL SURGERY & MEDICINE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Dental materials journal","FirstCategoryId":"5","ListUrlMain":"https://doi.org/10.4012/dmj.2023-253","RegionNum":4,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"DENTISTRY, ORAL SURGERY & MEDICINE","Score":null,"Total":0}
Citations: 0

Abstract

The purpose of this study was to construct deep learning models for more efficient and reliable sex estimation. Two deep learning models, VGG16 and DenseNet-121, were used in this retrospective study. In total, 600 lateral cephalograms were analyzed. A saliency map was generated by gradient-weighted class activation mapping for each output. The two deep learning models achieved high values in each performance metric according to accuracy, sensitivity (recall), precision, F1 score, and areas under the receiver operating characteristic curve. Both models showed substantial differences in the positions indicated in saliency maps for male and female images. The positions in saliency maps also differed between VGG16 and DenseNet-121, regardless of sex. This analysis of our proposed system suggested that sex estimation from lateral cephalograms can be achieved with high accuracy using deep learning.
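The abstract does not describe the implementation, but the two backbones it names, VGG16 and DenseNet-121, are standard ImageNet-pretrained architectures, so a typical transfer-learning setup would simply swap each network's 1000-class ImageNet head for a two-class (male/female) head. The sketch below does that with torchvision; the pretrained weights, the 224x224 input size, and the layer names are assumptions about a conventional setup, not details taken from the paper.

```python
# Minimal sketch (assumed setup, not the authors' code): adapt ImageNet-pretrained
# VGG16 and DenseNet-121 from torchvision for two-class sex estimation.
import torch
import torch.nn as nn
from torchvision import models

NUM_CLASSES = 2  # male / female

def build_vgg16(num_classes: int = NUM_CLASSES) -> nn.Module:
    # VGG16's final classifier layer maps 4096 features to 1000 ImageNet classes;
    # replace it with a two-class head.
    model = models.vgg16(weights=models.VGG16_Weights.IMAGENET1K_V1)
    model.classifier[6] = nn.Linear(model.classifier[6].in_features, num_classes)
    return model

def build_densenet121(num_classes: int = NUM_CLASSES) -> nn.Module:
    # DenseNet-121 exposes a single fully connected "classifier" layer.
    model = models.densenet121(weights=models.DenseNet121_Weights.IMAGENET1K_V1)
    model.classifier = nn.Linear(model.classifier.in_features, num_classes)
    return model

if __name__ == "__main__":
    for name, net in [("VGG16", build_vgg16()), ("DenseNet-121", build_densenet121())]:
        dummy = torch.randn(1, 3, 224, 224)  # a cephalogram resized to 224x224 (assumed)
        print(name, net(dummy).shape)        # -> torch.Size([1, 2])
```

The metrics listed in the abstract (accuracy, sensitivity/recall, precision, F1 score, ROC AUC) could then be computed from such a model's predictions with standard tooling, for example scikit-learn's accuracy_score, recall_score, precision_score, f1_score, and roc_auc_score.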
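The saliency maps mentioned in the abstract come from gradient-weighted class activation mapping (Grad-CAM). As a rough illustration of how such a map can be generated for one of the models above, the following generic Grad-CAM sketch (again an assumption, not the authors' code) hooks the last convolutional layer of a VGG16 variant, weights its activations by the spatially averaged gradients of the target class score, and upsamples the result to the input resolution.

```python
# Generic Grad-CAM sketch (assumed implementation, not the authors' code).
import torch
import torch.nn as nn
import torch.nn.functional as F

def grad_cam(model: nn.Module, image: torch.Tensor, target_class: int) -> torch.Tensor:
    """image: (1, 3, H, W) tensor; returns an (H, W) saliency map scaled to [0, 1]."""
    model.eval()
    activations, gradients = [], []

    # Hook the last convolutional layer of the VGG16 feature extractor.
    last_conv = [m for m in model.features if isinstance(m, nn.Conv2d)][-1]
    h_fwd = last_conv.register_forward_hook(lambda m, inp, out: activations.append(out))
    h_bwd = last_conv.register_full_backward_hook(lambda m, gin, gout: gradients.append(gout[0]))

    logits = model(image)               # forward pass records the feature map
    model.zero_grad()
    logits[0, target_class].backward()  # backward pass records its gradient
    h_fwd.remove()
    h_bwd.remove()

    act, grad = activations[0], gradients[0]                 # both (1, C, h, w)
    weights = grad.mean(dim=(2, 3), keepdim=True)            # channel importance weights
    cam = F.relu((weights * act).sum(dim=1, keepdim=True))   # weighted activation sum
    cam = F.interpolate(cam, size=image.shape[2:], mode="bilinear", align_corners=False)
    cam = cam - cam.min()
    return (cam / (cam.max() + 1e-8)).squeeze().detach()

if __name__ == "__main__":
    from torchvision import models
    model = models.vgg16(weights=None)          # untrained weights, just to exercise the code
    model.classifier[6] = nn.Linear(4096, 2)    # same two-class head as the previous sketch
    cephalogram = torch.randn(1, 3, 224, 224)   # placeholder for a preprocessed radiograph
    saliency = grad_cam(model, cephalogram, target_class=1)
    print(saliency.shape)                       # torch.Size([224, 224])
```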
Source journal
Dental materials journal (Medicine / Materials Science: Biomaterials)
CiteScore: 4.60
Self-citation rate: 4.00%
Articles published: 102
Review time: 3 months
Journal description: Dental Materials Journal is a peer-reviewed journal published by the Japanese Society for Dental Materials and Devices that aims to report progress in the basic and applied sciences of dental materials and biomaterials. Clinical science and instrumental technologies related to dental materials are also within the journal's scope. The materials covered include synthetic polymers, ceramics, metals, and tissue-derived biomaterials. Cutting-edge dental materials and biomaterials used in developing fields such as tissue engineering, bioengineering, and artificial intelligence are also considered for review. The journal's recent acceptance rate is around 30%.