Palmprint Phenotype Feature Extraction and Classification Based on Deep Learning.

Impact Factor 3.7 · JCR Q2 (Genetics & Heredity)
Yu Fan, Jinxi Li, Shaoying Song, Haiguo Zhang, Sijia Wang, Guangtao Zhai
{"title":"Palmprint Phenotype Feature Extraction and Classification Based on Deep Learning.","authors":"Yu Fan,&nbsp;Jinxi Li,&nbsp;Shaoying Song,&nbsp;Haiguo Zhang,&nbsp;Sijia Wang,&nbsp;Guangtao Zhai","doi":"10.1007/s43657-022-00063-0","DOIUrl":null,"url":null,"abstract":"<p><p>Palmprints are of long practical and cultural interest. Palmprint principal lines, also called primary palmar lines, are one of the most dominant palmprint features and do not change over the lifespan. The existing methods utilize filters and edge detection operators to get the principal lines from the palm region of interest (ROI), but can not distinguish the principal lines from fine wrinkles. This paper proposes a novel deep-learning architecture to extract palmprint principal lines, which could greatly reduce the influence of fine wrinkles, and classify palmprint phenotypes further from 2D palmprint images. This architecture includes three modules, ROI extraction module (REM) using pre-trained hand key point location model, principal line extraction module (PLEM) using deep edge detection model, and phenotype classifier (PC) based on ResNet34 network. Compared with the current ROI extraction method, our extraction is competitive with a success rate of 95.2%. For principal line extraction, the similarity score between our extracted lines and ground truth palmprint lines achieves 0.813. And the proposed architecture achieves a phenotype classification accuracy of 95.7% based on our self-built palmprint dataset CAS_Palm.</p>","PeriodicalId":74435,"journal":{"name":"Phenomics (Cham, Switzerland)","volume":"2 4","pages":"219-229"},"PeriodicalIF":3.7000,"publicationDate":"2022-08-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9590507/pdf/43657_2022_Article_63.pdf","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Phenomics (Cham, Switzerland)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1007/s43657-022-00063-0","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"GENETICS & HEREDITY","Score":null,"Total":0}
Citations: 1

Abstract

Palmprints have long been of practical and cultural interest. Palmprint principal lines, also called primary palmar lines, are among the most dominant palmprint features and do not change over the lifespan. Existing methods use filters and edge-detection operators to extract the principal lines from the palm region of interest (ROI), but cannot distinguish the principal lines from fine wrinkles. This paper proposes a novel deep-learning architecture that extracts palmprint principal lines, greatly reducing the influence of fine wrinkles, and then classifies palmprint phenotypes from 2D palmprint images. The architecture includes three modules: an ROI extraction module (REM) using a pre-trained hand keypoint localization model, a principal line extraction module (PLEM) using a deep edge-detection model, and a phenotype classifier (PC) based on the ResNet34 network. Compared with the current ROI extraction method, our extraction is competitive, with a success rate of 95.2%. For principal line extraction, the similarity score between our extracted lines and ground-truth palmprint lines reaches 0.813. The proposed architecture achieves a phenotype classification accuracy of 95.7% on our self-built palmprint dataset CAS_Palm.
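The abstract describes a three-module pipeline (REM → PLEM → PC) but gives no implementation details. The following is a minimal, hypothetical sketch of the latter two stages in PyTorch: the placeholder edge network standing in for PLEM, the torchvision ResNet-34 configuration for PC, the 224×224 input size, and the number of phenotype classes are all assumptions for illustration, not the paper's actual implementation (REM, the hand-keypoint-based ROI step, is not shown).

```python
# Illustrative sketch only; requires torch and torchvision >= 0.13.
import torch
import torch.nn as nn
from torchvision import models


class PrincipalLineExtractor(nn.Module):
    """Stand-in for PLEM: maps a palm ROI to a single-channel principal-line map.
    The paper uses a pre-trained deep edge-detection model; this tiny network is
    only a placeholder with the same input/output shape."""

    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 1, kernel_size=3, padding=1), nn.Sigmoid(),
        )

    def forward(self, roi):
        return self.net(roi)


class PhenotypeClassifier(nn.Module):
    """PC: ResNet-34 backbone with the final layer replaced for palmprint
    phenotype classes (num_classes=4 is an assumption, not from the paper)."""

    def __init__(self, num_classes=4):
        super().__init__()
        self.backbone = models.resnet34(weights=None)
        self.backbone.fc = nn.Linear(self.backbone.fc.in_features, num_classes)

    def forward(self, x):
        return self.backbone(x)


if __name__ == "__main__":
    roi = torch.randn(1, 3, 224, 224)          # palm ROI produced by REM (not shown)
    line_map = PrincipalLineExtractor()(roi)   # 1-channel principal-line map
    # Replicate the line map to 3 channels to feed the ResNet-34 classifier.
    logits = PhenotypeClassifier()(line_map.repeat(1, 3, 1, 1))
    print(logits.shape)                        # torch.Size([1, 4])
```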

