Pratick Ghosh, Devjyoti Saha, Diptangshu De, Sourish Sengupta, Tripti Majumdar
{"title":"基于2-CNN框架的人脸和手掌图像性别自动检测","authors":"Pratick Ghosh, Devjyoti Saha, Diptangshu De, Sourish Sengupta, Tripti Majumdar","doi":"10.1109/ICCE50343.2020.9290662","DOIUrl":null,"url":null,"abstract":"Gender is the most fundamental demographic feature of human beings. Human gender classification using computer vision has become a relevant aspect in a large variety of fields, extending from daily software applications to forensic science, specifically due to the certain surge in consumption of social media and social networking websites. In the past, many attempts have been made for gender classification using conventional models of Convolutional Neural Networks (CNNs) just by mere extraction of features (from the faces only) and classification (CNNs are capable of both). In this paper, we propose a method to classify genders using two different conventional 3-layered CNN models where one uses the facial features and the other uses the palm features of a human for gender classification. This method not only delivers better accuracy than the past single CNN model framework but also achieves the goal of a two-step verification process. We have trained the face model using one publicly available dataset that we have gathered from the online dataset repository Kaggle and we have trained the palm model using another dataset that was constructed by us in consideration of this model. On testing, the proposed framework has shown significant accuracy growth.","PeriodicalId":421963,"journal":{"name":"2020 IEEE 1st International Conference for Convergence in Engineering (ICCE)","volume":"9 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2020-09-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Automatic Detection of Gender from Face and Palm Images using a 2-CNN Framework\",\"authors\":\"Pratick Ghosh, Devjyoti Saha, Diptangshu De, Sourish Sengupta, Tripti Majumdar\",\"doi\":\"10.1109/ICCE50343.2020.9290662\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Gender is the most fundamental demographic feature of human beings. Human gender classification using computer vision has become a relevant aspect in a large variety of fields, extending from daily software applications to forensic science, specifically due to the certain surge in consumption of social media and social networking websites. In the past, many attempts have been made for gender classification using conventional models of Convolutional Neural Networks (CNNs) just by mere extraction of features (from the faces only) and classification (CNNs are capable of both). In this paper, we propose a method to classify genders using two different conventional 3-layered CNN models where one uses the facial features and the other uses the palm features of a human for gender classification. This method not only delivers better accuracy than the past single CNN model framework but also achieves the goal of a two-step verification process. We have trained the face model using one publicly available dataset that we have gathered from the online dataset repository Kaggle and we have trained the palm model using another dataset that was constructed by us in consideration of this model. 
On testing, the proposed framework has shown significant accuracy growth.\",\"PeriodicalId\":421963,\"journal\":{\"name\":\"2020 IEEE 1st International Conference for Convergence in Engineering (ICCE)\",\"volume\":\"9 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2020-09-05\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2020 IEEE 1st International Conference for Convergence in Engineering (ICCE)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ICCE50343.2020.9290662\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2020 IEEE 1st International Conference for Convergence in Engineering (ICCE)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICCE50343.2020.9290662","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Automatic Detection of Gender from Face and Palm Images using a 2-CNN Framework
Abstract: Gender is one of the most fundamental demographic features of human beings. Gender classification using computer vision has become relevant in a wide variety of fields, from everyday software applications to forensic science, particularly due to the surge in the use of social media and social networking websites. Past attempts at gender classification have typically relied on a single conventional Convolutional Neural Network (CNN), which performs both feature extraction and classification, using facial features alone. In this paper, we propose a method that classifies gender using two conventional three-layer CNN models: one operates on facial features and the other on palm features. This approach not only delivers better accuracy than previous single-CNN frameworks but also provides a two-step verification process. We trained the face model on a publicly available dataset obtained from the online repository Kaggle, and the palm model on a dataset we constructed for this purpose. In testing, the proposed framework shows a significant improvement in accuracy.
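The abstract describes the framework only at a high level: two independent three-layer CNNs, one per modality, whose combined outputs provide a two-step verification. The Python (TensorFlow/Keras) sketch below is a minimal illustration of that idea under assumed details; the layer widths, the 64x64 input size, the "1 = male" label convention, and the agreement-based fusion rule are illustrative assumptions, not the authors' published implementation.

# Hypothetical sketch of the 2-CNN gender-classification framework described
# in the abstract. The paper does not publish its exact layer configuration,
# input sizes, or fusion rule, so the choices below are assumptions.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

def build_three_layer_cnn(input_shape=(64, 64, 3)):
    """A conventional 3-layer CNN that both extracts features and classifies."""
    return models.Sequential([
        layers.Input(shape=input_shape),
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(128, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(64, activation="relu"),
        layers.Dense(1, activation="sigmoid"),  # binary output, assumed 1 = male
    ])

# One model per modality, each trained on its own dataset
# (a public Kaggle face dataset and a self-built palm dataset in the paper).
face_model = build_three_layer_cnn()
palm_model = build_three_layer_cnn()
for model in (face_model, palm_model):
    model.compile(optimizer="adam", loss="binary_crossentropy",
                  metrics=["accuracy"])
# face_model.fit(face_images, face_labels, epochs=10)
# palm_model.fit(palm_images, palm_labels, epochs=10)

def verify_gender(face_img, palm_img, threshold=0.5):
    """Two-step verification: accept a prediction only if both CNNs agree."""
    p_face = float(face_model.predict(face_img[np.newaxis], verbose=0)[0, 0])
    p_palm = float(palm_model.predict(palm_img[np.newaxis], verbose=0)[0, 0])
    face_label = int(p_face >= threshold)
    palm_label = int(p_palm >= threshold)
    if face_label == palm_label:
        return "male" if face_label else "female"
    return "undetermined"  # disagreement: flag for manual review or re-capture

Other fusion rules are equally plausible, for example averaging the two sigmoid scores or requiring both to exceed a confidence margin; the abstract does not specify which rule the authors use.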