Ahmad Hassanat, V. B. Surya Prasath, Bassam M. Al-Mahadeen, Samaher Madallah Moslem Alhasanat
{"title":"蒙面分类与性别识别","authors":"Ahmad Hassanat, V. B. Surya Prasath, Bassam M. Al-Mahadeen, Samaher Madallah Moslem Alhasanat","doi":"10.1504/IJBM.2017.10009351","DOIUrl":null,"url":null,"abstract":"This study aims to investigate to what extent a computer system can identify veiled-human and recognise gender using eyes and the uncovered part of the face. For the purpose of this study, we have created a new veiled persons image (VPI) database shot using a mobile phone camera, imaging 100 different veiled-persons over two sessions. After preprocessing and segmentation we used a fused method for feature extraction. The fusion occurs between geometrical (edge ratio) and textural (probability density function of the colour moments) features. The experimental results using different classifiers were ranging from 88:63% to 97:22% for person identification accuracy before feature selection and up to 97:55% after feature selection. The proposed method achieved up to 99:41% success rate for gender classification.","PeriodicalId":262486,"journal":{"name":"Int. J. Biom.","volume":"129 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2017-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"20","resultStr":"{\"title\":\"Classification and gender recognition from veiled-faces\",\"authors\":\"Ahmad Hassanat, V. B. Surya Prasath, Bassam M. Al-Mahadeen, Samaher Madallah Moslem Alhasanat\",\"doi\":\"10.1504/IJBM.2017.10009351\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"This study aims to investigate to what extent a computer system can identify veiled-human and recognise gender using eyes and the uncovered part of the face. For the purpose of this study, we have created a new veiled persons image (VPI) database shot using a mobile phone camera, imaging 100 different veiled-persons over two sessions. After preprocessing and segmentation we used a fused method for feature extraction. The fusion occurs between geometrical (edge ratio) and textural (probability density function of the colour moments) features. The experimental results using different classifiers were ranging from 88:63% to 97:22% for person identification accuracy before feature selection and up to 97:55% after feature selection. The proposed method achieved up to 99:41% success rate for gender classification.\",\"PeriodicalId\":262486,\"journal\":{\"name\":\"Int. J. Biom.\",\"volume\":\"129 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2017-12-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"20\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Int. J. Biom.\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1504/IJBM.2017.10009351\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Int. J. Biom.","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1504/IJBM.2017.10009351","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Classification and gender recognition from veiled-faces
This study investigates to what extent a computer system can identify veiled persons and recognise gender using the eyes and the uncovered part of the face. For this purpose, we created a new veiled persons image (VPI) database captured with a mobile phone camera, imaging 100 different veiled persons over two sessions. After preprocessing and segmentation, we used a fused method for feature extraction. The fusion combines geometrical (edge ratio) and textural (probability density function of the colour moments) features. Experimental results using different classifiers ranged from 88.63% to 97.22% person identification accuracy before feature selection, rising to 97.55% after feature selection. The proposed method achieved up to 99.41% accuracy for gender classification.
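A minimal sketch, not the authors' implementation, of the fused-feature idea described in the abstract: a geometrical feature (edge ratio) is concatenated with textural features (colour-moment statistics approximating the colour distribution) computed on the uncovered eye region, and the fused vector is passed to a standard classifier. The gradient-based edge detector, the moment statistics, the k-NN classifier, and all function names are assumptions for illustration only.

import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def edge_ratio(gray, threshold=30.0):
    """Geometrical feature: fraction of pixels whose gradient magnitude
    exceeds a threshold (a crude stand-in for a proper edge detector)."""
    gy, gx = np.gradient(gray.astype(np.float64))
    magnitude = np.hypot(gx, gy)
    return np.count_nonzero(magnitude > threshold) / magnitude.size

def colour_moments(rgb):
    """Textural features: first three moments (mean, std, skewness) of each
    colour channel, a common proxy for the colour probability density."""
    feats = []
    for c in range(3):
        channel = rgb[..., c].astype(np.float64).ravel()
        mu, sigma = channel.mean(), channel.std()
        skew = 0.0 if sigma == 0 else np.mean(((channel - mu) / sigma) ** 3)
        feats.extend([mu, sigma, skew])
    return feats

def fused_features(rgb):
    """Fuse the geometrical and textural features into one vector."""
    gray = rgb.astype(np.float64).mean(axis=2)
    return np.array([edge_ratio(gray)] + colour_moments(rgb))

# Toy usage: random crops stand in for segmented eye-region images.
rng = np.random.default_rng(0)
images = rng.integers(0, 256, size=(20, 64, 128, 3))
labels = rng.integers(0, 2, size=20)   # e.g. hypothetical gender labels
X = np.stack([fused_features(im) for im in images])
clf = KNeighborsClassifier(n_neighbors=3).fit(X, labels)
print(clf.predict(X[:5]))

In practice the feature vector would be computed on properly preprocessed and segmented eye regions, and feature selection would be applied before classification, as the abstract describes.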