{"title":"Emotion Specific Human Face Authentication Based on Infrared Thermal Image","authors":"Mohammad Alamgir Hossain, Basem Assiri","doi":"10.1109/ICCIS49240.2020.9257683","DOIUrl":null,"url":null,"abstract":"Facial emotion authentication is an emergent topic during the last few decades. It depicts a human's mood and reflects his activity that he is doing, going to do and thinking to do. Activity-mapping is possible to establish by analyzing emotions' sequence from the classification of facial expressions. To identify and recognize emotion the whole face is divided into four classes and into eight regions namely forehead (left, right), eyes (left, right), lips (left, right), and chin (left, right). However, the importance is being given to the region of interest (ROI). Based on the ROI four regions are been chosen (Nose-tip, Left-eye, Right-eye and Lip). Once classification and recognition are completed a database termed as image-data-mask is maintained. The correlation between variances and standard deviations is established based on one identified image. In the process of classification and recognition, and Optimized Probability Density Function (OPDF) is proposed. The centralized database (image-data-mask) is being checked before registration of a new image into the system to avoid redundancy. Nose-tip is taken as the central-point and rest regions are being detected based on it. In this investigation, the emotions (normal, fear, and smiley) are considered and the infrared thermal images are also recorded concurrently. A calibration technique is implemented to establish a matching between vectors of face-ROI and its features. The investigational result illustrates the supremacy of the proposed method as compared to other investigators.","PeriodicalId":425637,"journal":{"name":"2020 2nd International Conference on Computer and Information Sciences (ICCIS)","volume":"205 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2020-10-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"5","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2020 2nd International Conference on Computer and Information Sciences (ICCIS)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICCIS49240.2020.9257683","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 5
Abstract
Facial emotion authentication has been an emerging topic over the last few decades. It depicts a person's mood and reflects the activity that he or she is doing, is about to do, or is thinking of doing. Activity mapping can be established by analyzing the sequence of emotions obtained from the classification of facial expressions. To identify and recognize emotion, the whole face is divided into four classes and eight regions, namely forehead (left, right), eyes (left, right), lips (left, right), and chin (left, right). However, the emphasis is placed on the region of interest (ROI); based on the ROI, four regions are chosen (nose-tip, left eye, right eye, and lip). Once classification and recognition are completed, a database termed the image-data-mask is maintained. The correlation between variances and standard deviations is established based on one identified image. For classification and recognition, an Optimized Probability Density Function (OPDF) is proposed. The centralized database (image-data-mask) is checked before a new image is registered into the system to avoid redundancy. The nose-tip is taken as the central point, and the remaining regions are detected relative to it. In this investigation, the emotions (normal, fear, and smiley) are considered, and infrared thermal images are recorded concurrently. A calibration technique is implemented to establish matching between face-ROI vectors and their features. The experimental results illustrate the superiority of the proposed method compared to other approaches.
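The sketch below is a minimal illustration (not the authors' implementation) of the ROI pipeline the abstract describes: the nose-tip is taken as the central point of a thermal frame, the other three ROIs (left eye, right eye, lip) are placed relative to it, per-ROI variance and standard-deviation features are extracted, and a new image is registered in the image-data-mask database only if no sufficiently similar record already exists. All window sizes, offsets, and the similarity threshold are illustrative assumptions.

```python
import numpy as np

# (row, col) offsets of each ROI centre from the nose-tip -- assumed values
ROI_OFFSETS = {
    "nose_tip":  (0, 0),
    "left_eye":  (-40, -30),
    "right_eye": (-40, 30),
    "lip":       (35, 0),
}
ROI_SIZE = 24  # square ROI side length in pixels (assumed)


def locate_nose_tip(thermal: np.ndarray) -> tuple[int, int]:
    """Assume the nose-tip is the warmest pixel near the image centre."""
    h, w = thermal.shape
    centre = thermal[h // 4: 3 * h // 4, w // 4: 3 * w // 4]
    r, c = np.unravel_index(np.argmax(centre), centre.shape)
    return r + h // 4, c + w // 4


def roi_features(thermal: np.ndarray) -> np.ndarray:
    """Variance and standard deviation of each ROI, stacked into one vector."""
    nr, nc = locate_nose_tip(thermal)
    half = ROI_SIZE // 2
    feats = []
    for dr, dc in ROI_OFFSETS.values():
        r, c = nr + dr, nc + dc
        patch = thermal[max(r - half, 0): r + half, max(c - half, 0): c + half]
        feats.extend([patch.var(), patch.std()])
    return np.asarray(feats)


def register(thermal: np.ndarray, image_data_mask: list[np.ndarray],
             threshold: float = 5.0) -> bool:
    """Store the feature vector unless a near-duplicate record already exists."""
    vec = roi_features(thermal)
    for stored in image_data_mask:
        if np.linalg.norm(stored - vec) < threshold:
            return False  # redundant entry, skip registration
    image_data_mask.append(vec)
    return True


if __name__ == "__main__":
    db: list[np.ndarray] = []
    # synthetic thermal frame in degrees Celsius, for demonstration only
    frame = np.random.default_rng(0).normal(30.0, 2.0, size=(240, 320))
    print(register(frame, db))  # True  (first registration)
    print(register(frame, db))  # False (redundant, already in the database)
```

The fixed offsets and the Euclidean-distance redundancy check are stand-ins for the paper's OPDF-based classification and calibration steps, which the abstract does not specify in enough detail to reproduce here.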