Image-Based Gender Prediction Model Using Multilayer Feed-Forward Neural Networks

Mohamed Yousif Elmahi, E. I. M. Zayid
International Journal of Multimedia and Image Processing, 2019-03-30. DOI: 10.20533/ijmip.2042.4647.2019.0056
In this study, we develop a reliable, high-performance multi-layer feed-forward artificial neural network (MFANN) model for gender classification. The study uses features extracted from a set of 450 images randomly chosen from the FERET dataset, retaining only high-merit candidate parameters. A discrete cosine transform (DCT) is employed to facilitate image description and conversion. To reach the final gender estimation model, we examined three classifiers, each of which performs computationally intensive processing: in addition to the MFANN, these include support vector regression with a radial-basis-function kernel (SVR-RBF) and k-nearest neighbors (k-NN). A 10-fold cross-validation (CV) technique is used to verify the integrity of the dataset inputs and strengthen the model evaluation. Performance is assessed by accuracy rate and mean squared error (MSE). Results of the MFANN model are compared with those obtained by SVR-RBF and k-NN; the MFANN model performs best (lowest MSE = 0.0789 and highest accuracy rate = 96.9%). Linking the study findings with results from the literature review, we conclude that our method provides a recommended approach for gender prediction.
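The pipeline described above (DCT feature extraction, then a 10-fold cross-validated comparison of an MFANN against RBF-kernel SVM and k-NN baselines) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the synthetic images stand in for the FERET face crops, the gender labels are placeholders, the network architecture and the number of retained DCT coefficients are assumptions, and an RBF-kernel SVM classifier is used in place of the paper's SVR-RBF regressor for a classification-style comparison.

```python
# Hedged sketch of the abstract's pipeline: 2-D DCT features + 10-fold CV
# comparison of three classifiers. Data and hyperparameters are assumptions.
import numpy as np
from scipy.fft import dctn
from sklearn.neural_network import MLPClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.model_selection import StratifiedKFold, cross_val_score

rng = np.random.default_rng(0)

def dct_features(img, k=8):
    """2-D DCT of a grayscale image; keep the top-left k x k block of
    low-frequency coefficients as a compact, high-merit descriptor."""
    coeffs = dctn(img, norm="ortho")
    return coeffs[:k, :k].ravel()

# Synthetic stand-in for 450 FERET images (32x32 grayscale).
X = np.stack([dct_features(rng.random((32, 32))) for _ in range(450)])
y = rng.integers(0, 2, size=450)  # placeholder binary gender labels

classifiers = {
    "MFANN": MLPClassifier(hidden_layer_sizes=(32, 16),
                           max_iter=2000, random_state=0),
    "SVM-RBF": SVC(kernel="rbf"),       # classification analogue of SVR-RBF
    "k-NN": KNeighborsClassifier(n_neighbors=5),
}

# 10-fold cross-validation, as in the study.
cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
for name, clf in classifiers.items():
    scores = cross_val_score(clf, X, y, cv=cv, scoring="accuracy")
    print(f"{name}: mean CV accuracy = {scores.mean():.3f}")
```

On real face data the DCT block size `k` and the hidden-layer sizes would be tuned; here they are illustrative defaults.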