{"title":"How to assess user interaction effects in Biometric performance","authors":"Ramón Blanco-Gonzalo, R. Sánchez-Reillo, J. Liu-Jimenez, Carlos Sanchez-Redondo","doi":"10.1109/ISBA.2017.7947699","DOIUrl":"https://doi.org/10.1109/ISBA.2017.7947699","url":null,"abstract":"Since biometric recognition systems are part of people's daily lives, manufacturers are increasingly concerned about data subject interaction and the effects that low usability may have on end users. Why might people avoid swipe fingerprint sensors in smartphones? Now that biometric recognition is also used in mobile devices, this influence is greater, as the variability of biometric use is also higher. The interaction is no longer fixed, nor are the scenarios or positions. People unlock their smartphones with their face at night or make payments with a fingertip in a bar. Enrollments are not performed under controlled conditions, and no proper guidance is given to data subjects. All these facts point to the necessity of evaluating the effects of interaction on biometric performance as a major factor of influence. This work introduces an evaluation methodology proposed as a standard within ISO/IEC JTC 1/SC 37 - Biometrics and describes its development, justifying the design criteria based on previous experimentation and further work.","PeriodicalId":436086,"journal":{"name":"2017 IEEE International Conference on Identity, Security and Behavior Analysis (ISBA)","volume":"28 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-02-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129716911","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Pain expression as a biometric: Why patients' self-reported pain doesn't match with the objectively measured pain?","authors":"M. A. Haque, Kamal Nasrollahi, T. Moeslund","doi":"10.1109/ISBA.2017.7947690","DOIUrl":"https://doi.org/10.1109/ISBA.2017.7947690","url":null,"abstract":"Developing an efficient, automatic, vision-based pain intensity measurement system requires understanding the relationship between self-reported pain intensity and pain expression in facial videos. In this paper, we first demonstrate how pain expression in facial video frames may not match the self-reported score: pain and non-pain frames are not always visually distinctive, even though the self-report tells a different story about pain and non-pain status. On the other hand, previous studies have reported that general facial expressions can be used as biometrics. Thus, in this paper we investigate the relevance of pain expression from facial video as a biometric or soft-biometric trait. To do so, we employ a biometric person recognition scenario using features obtained from the pain expression pattern found along the temporal axis of subjects' videos. The results confirm that pain expression patterns are distinctive across the subjects of the UNBC-McMaster shoulder pain database. We conclude that because pain expression patterns carry subject-specific features usable as a biometric, this can also explain the difference between the self-reported pain level and the visually observed pain intensity level.","PeriodicalId":436086,"journal":{"name":"2017 IEEE International Conference on Identity, Security and Behavior Analysis (ISBA)","volume":"18 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-02-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129392341","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
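The person-recognition scenario described in this abstract — summarizing each subject's per-frame pain-intensity trace into temporal-pattern features and matching a new video against an enrolled gallery — can be illustrated with a minimal sketch. This is not the authors' feature set or matcher; the feature choices (level, variability, frame-to-frame dynamics) and the nearest-neighbour rule are illustrative assumptions, and the traces are synthetic.

```python
import numpy as np

def pattern_features(pain_scores):
    """Temporal pain-expression summary for one video: overall level,
    variability, and mean frame-to-frame change (illustrative features)."""
    d = np.diff(pain_scores)
    return np.array([pain_scores.mean(), pain_scores.std(), np.abs(d).mean()])

def identify(query, gallery):
    """Nearest-neighbour person recognition over the pattern features."""
    feats = {sid: pattern_features(trace) for sid, trace in gallery.items()}
    q = pattern_features(query)
    return min(feats, key=lambda sid: np.linalg.norm(feats[sid] - q))

rng = np.random.default_rng(2)
# Synthetic per-frame pain-intensity traces with distinct dynamics per subject.
gallery = {
    "s1": np.clip(rng.normal(1.0, 0.2, 200), 0, 5),  # low, steady expression
    "s2": np.clip(rng.normal(3.0, 1.0, 200), 0, 5),  # high, volatile expression
}
query = np.clip(rng.normal(3.0, 1.0, 200), 0, 5)      # a new video of s2
```

The sketch shows the core of the argument: if such features discriminate between subjects, pain expression carries identity information on top of pain intensity.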
{"title":"Transfer learning in long-text keystroke dynamics","authors":"Hayreddin Çeker, S. Upadhyaya","doi":"10.1109/ISBA.2017.7947710","DOIUrl":"https://doi.org/10.1109/ISBA.2017.7947710","url":null,"abstract":"Conventional machine learning algorithms based on keystroke dynamics build a classifier from labeled data in one or more sessions but assume that the dataset at verification time exhibits the same distribution. Ideally, the keystroke data collected in a session is expected to be an invariant representation of an individual's behavioral biometrics. In real applications, however, the data is sensitive to several factors such as emotion, time of day, and keyboard layout. A user's typing characteristics may gradually change over time and space. Therefore, a traditional classifier may perform poorly on another dataset acquired under different environmental conditions. In this paper, we apply two transfer learning techniques to long-text data to update a classifier according to changing environmental conditions with a minimal amount of re-training. We show that by using adaptive techniques, it is possible to identify an individual at a different time by acquiring only a few samples from another session, and at the same time obtain up to 19% higher accuracy relative to traditional classifiers. We make a comparative analysis of the proposed algorithms and report results with and without knowledge transfer. We conclude that adaptive classifiers start from a good approximation and perform better than classifiers trained from scratch.","PeriodicalId":436086,"journal":{"name":"2017 IEEE International Conference on Identity, Security and Behavior Analysis (ISBA)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-02-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127827422","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
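The idea this abstract describes — a classifier trained in one session drifts out of date, and a few labeled samples from the new session suffice to re-anchor it — can be sketched with a toy nearest-mean keystroke classifier. This is not the paper's transfer learning algorithm; the hold-time features, the drift model, and the convex template update (blend factor `alpha`) are illustrative assumptions on synthetic data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 5-dimensional hold-time vectors (ms) for two users, session A.
user_a = rng.normal(100, 10, size=(50, 5))
user_b = rng.normal(140, 10, size=(50, 5))

# Session B: user A's typing has drifted (new keyboard, different time of day).
user_a_session_b = rng.normal(120, 10, size=(20, 5))

def nearest_mean_predict(x, templates):
    """Assign x to the class with the closest template (Euclidean)."""
    return min(templates, key=lambda name: np.linalg.norm(x - templates[name]))

templates = {"A": user_a.mean(axis=0), "B": user_b.mean(axis=0)}

# Static classifier: session-A templates applied unchanged to session B.
static_acc = np.mean(
    [nearest_mean_predict(x, templates) == "A" for x in user_a_session_b])

# Adaptive classifier: re-anchor A's template with only 3 labeled
# session-B samples via a convex update (minimal re-training).
few = user_a_session_b[:3]
alpha = 0.8  # weight given to new-session evidence (assumed constant)
templates["A"] = (1 - alpha) * templates["A"] + alpha * few.mean(axis=0)

adapted_acc = np.mean(
    [nearest_mean_predict(x, templates) == "A" for x in user_a_session_b[3:]])
```

Even this crude update moves the template most of the way toward the drifted distribution, mirroring the abstract's claim that a few samples from another session recover accuracy.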
{"title":"DeepVein: Novel finger vein verification methods based on Deep Convolutional Neural Networks","authors":"Houjun Huang, Shilei Liu, He Zheng, Liao Ni, Yi Zhang, Wenxin Li","doi":"10.1109/ISBA.2017.7947683","DOIUrl":"https://doi.org/10.1109/ISBA.2017.7947683","url":null,"abstract":"Finger vein verification uses vein patterns to verify a person's identity and is widely used in various fields. In practice, the verification method is the most important part of a biometric system, as it determines the system's reliability. In this paper, we propose methods called DeepVein for finger vein verification based on deep convolutional neural networks and conduct experiments to evaluate them. The experimental results show that our proposed methods achieve state-of-the-art accuracy. In addition, we present how the amount of training data affects accuracy on the test datasets.","PeriodicalId":436086,"journal":{"name":"2017 IEEE International Conference on Identity, Security and Behavior Analysis (ISBA)","volume":"3 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-02-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133504476","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Please hold on: Unobtrusive user authentication using smartphone's built-in sensors","authors":"Attaullah Buriro, B. Crispo, Yury Zhauniarovich","doi":"10.1109/ISBA.2017.7947684","DOIUrl":"https://doi.org/10.1109/ISBA.2017.7947684","url":null,"abstract":"Smartphones provide anytime-anywhere communications and are increasingly used for a variety of purposes, e.g., sending email, performing online transactions, and connecting with friends and acquaintances over social networks. As a result, a considerable amount of sensitive personal information is often generated and stored on smartphones. Thus, smartphone users may face financial as well as sentimental consequences if such information falls into the wrong hands. To address this problem, all smartphones provide some form of user authentication, that is, the process of verifying the user's identity. Existing authentication mechanisms, such as 4-digit passcodes or graphical patterns, suffer from multiple limitations - they are neither highly secure nor easy to input. As a result, recent studies found that most smartphone users do not use any authentication mechanism at all. In this paper, we present a fully unobtrusive user authentication scheme based on micro-movements of the user's hand(s) after the user unlocks her smartphone. The proposed scheme collects data from multiple 3-dimensional smartphone sensors in the background for a specific period of time and profiles a user based on the collected hand-movement patterns. Subsequently, the system matches the query pattern against the pre-stored patterns to authenticate the smartphone owner. Our system achieved a True Acceptance Rate (TAR) of 96% at an Equal Error Rate (EER) of 4% on a dataset of 31 qualified volunteers (53 in total), using a Random Forest (RF) classifier. Our scheme can be used as a primary authentication mechanism, or as a secondary scheme in conjunction with any existing authentication scheme, e.g., passcodes, to improve its security.","PeriodicalId":436086,"journal":{"name":"2017 IEEE International Conference on Identity, Security and Behavior Analysis (ISBA)","volume":"8 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-02-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130262769","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
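The pipeline this abstract outlines — collect 3-axis sensor windows in the background, turn each window into summary features, build an owner profile, then score query windows against it — can be sketched minimally. The paper uses a Random Forest classifier; the per-axis statistics, the z-score profile, and the acceptance threshold below are simpler illustrative substitutes, and the sensor data is synthetic.

```python
import numpy as np

def features(window):
    """Per-axis summary statistics for one sensor window (n_samples x 3)."""
    return np.concatenate([window.mean(0), window.std(0),
                           window.min(0), window.max(0)])

def enroll(windows):
    """Owner profile: mean and spread of the enrollment feature vectors."""
    f = np.array([features(w) for w in windows])
    return f.mean(0), f.std(0) + 1e-6

def authenticate(window, profile, z_threshold=3.0):
    """Accept if the query's features lie, on average, within
    z_threshold standard deviations of the enrolled profile."""
    mu, sigma = profile
    z = np.abs((features(window) - mu) / sigma)
    return z.mean() < z_threshold

rng = np.random.default_rng(7)
# Ten enrollment windows of accelerometer data while the owner holds the phone
# (gravity ~9.8 on z, small hand tremor on x/y).
owner_windows = [rng.normal([0, 0, 9.8], [0.3, 0.3, 0.2], size=(100, 3))
                 for _ in range(10)]
profile = enroll(owner_windows)

query_owner = rng.normal([0, 0, 9.8], [0.3, 0.3, 0.2], size=(100, 3))
query_imposter = rng.normal([1.0, 0.5, 9.0], [0.8, 0.8, 0.5], size=(100, 3))
```

Because everything runs on background sensor streams, the check is unobtrusive: the user never performs an explicit authentication gesture.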
{"title":"Spoofing faces using makeup: An investigative study","authors":"Cunjian Chen, A. Dantcheva, Thomas Swearingen, A. Ross","doi":"10.1109/ISBA.2017.7947686","DOIUrl":"https://doi.org/10.1109/ISBA.2017.7947686","url":null,"abstract":"Makeup can be used to alter the facial appearance of a person. Previous studies have established the potential of using makeup to obfuscate the identity of an individual with respect to an automated face matcher. In this work, we analyze the potential of using makeup for spoofing an identity, where an individual attempts to impersonate another person's facial appearance. In this regard, we first assemble a set of face images downloaded from the internet where individuals use facial cosmetics to impersonate celebrities. We next determine the impact of this alteration on two different face matchers. Experiments suggest that automated face matchers are vulnerable to makeup-induced spoofing and that the success of spoofing is impacted by the appearance of the impersonator's face and the target face being spoofed. Further, an identification experiment is conducted to show that the spoofed faces are successfully matched at better ranks after the application of makeup. To the best of our knowledge, this is the first work that systematically studies the impact of makeup-induced face spoofing on automated face recognition.","PeriodicalId":436086,"journal":{"name":"2017 IEEE International Conference on Identity, Security and Behavior Analysis (ISBA)","volume":"37 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-02-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121390769","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Multi-source approach for crowd density estimation in still images","authors":"Sonu Lamba, N. Nain","doi":"10.1109/ISBA.2017.7947682","DOIUrl":"https://doi.org/10.1109/ISBA.2017.7947682","url":null,"abstract":"Estimating crowd density in extremely dense scenes is crucial yet challenging due to perspective distortion, few pixels per target, clutter, and complex backgrounds. Most existing work cannot handle crowds of hundreds or thousands; at this level of density, a single feature is not enough to estimate the total density of an image. We propose a hybrid model that relies on multiple sources of information - Fourier analysis, Local Binary Pattern (LBP), Gray-Level Dependence Matrix (GLDM) features, and Histogram of Oriented Gradients (HOG) for head detection - to estimate the total count. Each of these features contributes separately to the final count estimate, along with other statistical measures. Our approach is tested on a hundred images of dense crowds annotated with 87K individuals. Experimental results validate the performance of our proposed approach by comparing the computed total count against the ground truth.","PeriodicalId":436086,"journal":{"name":"2017 IEEE International Conference on Identity, Security and Behavior Analysis (ISBA)","volume":"7 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-02-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127591835","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
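The fusion step this abstract relies on — several weak per-source count estimates combined into one total count — can be sketched as a linear fusion whose weights are fit by least squares on the annotated images. The paper does not specify this fusion rule; the simulated per-source estimators (each a biased, noisy view of the true count) and the train/test split are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
true_counts = rng.integers(200, 2000, size=100).astype(float)

# Simulated per-source estimates, one column per information source
# (stand-ins for Fourier-, LBP-, GLDM-, and HOG-head-based counts).
sources = np.stack([
    0.8 * true_counts + rng.normal(0, 50, 100),
    1.3 * true_counts + rng.normal(0, 80, 100),
    0.5 * true_counts + rng.normal(0, 40, 100),
    1.1 * true_counts + rng.normal(0, 60, 100),
], axis=1)

# Fit linear fusion weights on the annotated training half.
train, test = slice(0, 50), slice(50, 100)
w, *_ = np.linalg.lstsq(sources[train], true_counts[train], rcond=None)

fused = sources[test] @ w
single = sources[test, 3] / 1.1   # one source alone, bias-corrected by hand
fused_mae = np.abs(fused - true_counts[test]).mean()
single_mae = np.abs(single - true_counts[test]).mean()
```

Because the sources' errors are roughly independent, the fused estimate averages them down, which is the abstract's rationale for using multiple features at extreme densities.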
{"title":"A practical evaluation of free-text keystroke dynamics","authors":"Jiaju Huang, Daqing Hou, S. Schuckers","doi":"10.1109/ISBA.2017.7947695","DOIUrl":"https://doi.org/10.1109/ISBA.2017.7947695","url":null,"abstract":"Free-text keystroke dynamics is a behavioral biometric with strong potential to offer unobtrusive and continuous user authentication. Unfortunately, due to limited data availability, free-text keystroke dynamics have not been tested adequately. Based on a novel large dataset of free-text keystrokes from our ongoing collection of behavior in natural settings, we present the first study to evaluate keystroke dynamics while respecting the temporal order of the data. Specifically, we evaluate the performance of different ways of forming a test sample using sessions, as well as a form of continuous authentication based on a sliding window over the keystroke time series. Instead of accumulating a new test sample of keystrokes, we update the previous sample with keystrokes that occur in the immediately past sliding window of n minutes. We evaluate sliding windows of 1 to 5, 10, and 30 minutes. Our best performer, using a sliding window of 1 minute, achieves an FAR of 1% and an FRR of 11.5%. Lastly, we evaluate the sensitivity of the keystroke dynamics algorithm to short, quick insider attacks lasting only several minutes by artificially injecting different proportions of impostor keystrokes into the genuine test samples. For example, the evaluated algorithm is able to detect insider attacks that last 2.5 minutes or longer with a probability of 98.4%.","PeriodicalId":436086,"journal":{"name":"2017 IEEE International Conference on Identity, Security and Behavior Analysis (ISBA)","volume":"115 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-02-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126701373","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
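The sliding-window mechanism in this abstract — keep only the keystrokes from the immediately past n minutes and re-score that evolving sample continuously — can be sketched with a deque. The single mean-hold-time score and its tolerance are illustrative stand-ins for the paper's actual matcher; only the window bookkeeping is the point here.

```python
from collections import deque

class SlidingWindowAuthenticator:
    """Continuous check: retain only keystrokes from the last `window_s`
    seconds and compare their mean hold time with the enrolled profile."""

    def __init__(self, enrolled_mean_ms, tolerance_ms=25.0, window_s=60.0):
        self.enrolled = enrolled_mean_ms
        self.tol = tolerance_ms
        self.window_s = window_s
        self.events = deque()            # (timestamp_s, hold_ms)

    def add_keystroke(self, t, hold_ms):
        self.events.append((t, hold_ms))
        while self.events and self.events[0][0] < t - self.window_s:
            self.events.popleft()        # evict keystrokes outside the window

    def is_genuine(self):
        if not self.events:
            return True                  # no evidence yet: do not lock out
        mean = sum(h for _, h in self.events) / len(self.events)
        return abs(mean - self.enrolled) <= self.tol

auth = SlidingWindowAuthenticator(enrolled_mean_ms=100.0)
for i in range(30):                      # genuine typing, ~100 ms holds
    auth.add_keystroke(t=i, hold_ms=100 + (i % 5) - 2)
genuine_ok = auth.is_genuine()

for i in range(30, 90):                  # insider takes over, ~160 ms holds
    auth.add_keystroke(t=i, hold_ms=160 + (i % 5) - 2)
imposter_flagged = not auth.is_genuine()
```

Because old keystrokes age out of the window, the score tracks the current typist rather than the session average, which is what makes short insider attacks detectable.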
{"title":"Effects of meteorological factors on finger vein recognition","authors":"He Zheng, Qiantong Xu, Yapeng Ye, Wenxin Li","doi":"10.1109/ISBA.2017.7947696","DOIUrl":"https://doi.org/10.1109/ISBA.2017.7947696","url":null,"abstract":"Finger vein recognition is a biometric method utilizing the vein patterns inside one's fingers for personal identification. Current recognition algorithms use purely image-based feature extraction and matching technology. However, we notice that the performance of the recognition system varies when users are exposed to different meteorological environments. Researchers have shown that many environmental factors may affect the vessels and blood flow, which could directly change our finger vein patterns. In this paper, we study the impact of different meteorological factors, such as temperature, humidity, atmospheric pressure, and wind, on state-of-the-art recognition algorithms. Based on the experimental results, we find that temperature is the most significant factor. To further study the influence of temperature, we first collect finger vein samples from a group of subjects under different environmental temperatures, while keeping all other factors fixed. We then conduct experiments to confirm the impact of temperature differences on both the recognition algorithm and our recognition system. Finally, we propose two methods, dynamic template selection and threshold adjustment, to reduce this impact. The experimental results demonstrate the effectiveness of our methods.","PeriodicalId":436086,"journal":{"name":"2017 IEEE International Conference on Identity, Security and Behavior Analysis (ISBA)","volume":"23 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-02-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125928460","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
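The two mitigations this abstract names, dynamic template selection and threshold adjustment, can each be stated in a few lines. The abstract gives no formulas, so everything concrete below is an assumption: the idea of tagging each enrolled template with its capture temperature, and the linear relaxation slope in `adjusted_threshold`, are illustrative only.

```python
def select_template(templates, current_temp_c):
    """Dynamic template selection: pick the enrolled template captured
    at the temperature closest to the current one."""
    return min(templates, key=lambda t: abs(t["temp_c"] - current_temp_c))

def adjusted_threshold(base, enroll_temp_c, current_temp_c, slope=0.004):
    """Threshold adjustment: relax the match threshold as the gap between
    enrollment and query temperature grows (slope is an assumed constant)."""
    return base - slope * abs(current_temp_c - enroll_temp_c)

templates = [
    {"temp_c": 5,  "features": "..."},   # winter enrollment
    {"temp_c": 20, "features": "..."},   # indoor enrollment
    {"temp_c": 33, "features": "..."},   # summer enrollment
]
best = select_template(templates, current_temp_c=30)
thr = adjusted_threshold(base=0.90, enroll_temp_c=33, current_temp_c=30)
```

Both mechanisms compensate the same drift from opposite ends: one picks the reference least affected by the temperature gap, the other tolerates the residual gap at decision time.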
{"title":"Analysis of adaptability of deep features for verifying blurred and cross-resolution images","authors":"Prithviraj Dhar, A. Alavi","doi":"10.1109/ISBA.2017.7947679","DOIUrl":"https://doi.org/10.1109/ISBA.2017.7947679","url":null,"abstract":"Employing Convolutional Neural Networks (CNNs) to extract deep features from facial images for recognition, identification, and verification is well established. However, features extracted using CNNs have not been thoroughly studied for cross-resolution and blurred face verification. In this paper, we investigate the effectiveness of CNN features, primarily trained for matching high-resolution images, in verifying a pair consisting of a high-resolution and a low-resolution face image. To perform this task, we degrade the quality of the probe by artificially blurring and downsampling it before it is passed to the CNN to be verified against a high-resolution gallery image. After thorough experimental analysis, we present a pipeline that improves upon the results obtained by raw CNN features, without any prior information about the quality of the degraded probe image. Using this pipeline, we show that the proposed system improves verification accuracy on the LFW and CMU-PIE datasets.","PeriodicalId":436086,"journal":{"name":"2017 IEEE International Conference on Identity, Security and Behavior Analysis (ISBA)","volume":"181 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-02-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115587074","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
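The probe-degradation protocol this abstract describes — artificially blur and downsample the probe, rescale it to gallery size, then score the cross-resolution pair — can be sketched in numpy. The box blur, the nearest-neighbour up/downsampling, and the pixel-correlation score (standing in for CNN feature similarity) are all illustrative assumptions on synthetic images.

```python
import numpy as np

def box_blur(img, k=3):
    """Mean filter with reflect padding: a simple stand-in for optical blur."""
    pad = k // 2
    p = np.pad(img, pad, mode="reflect")
    out = np.zeros_like(img, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = p[i:i + k, j:j + k].mean()
    return out

def degrade(img, factor=4):
    """Blur, downsample by striding, then upsample by repetition,
    mimicking a low-resolution probe rescaled to gallery size."""
    low = box_blur(img)[::factor, ::factor]
    return np.repeat(np.repeat(low, factor, axis=0), factor, axis=1)

def cosine(a, b):
    a, b = a.ravel() - a.mean(), b.ravel() - b.mean()
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9)

rng = np.random.default_rng(5)
# A smooth synthetic "face" (blurred noise) as the high-resolution gallery image.
gallery = box_blur(box_blur(rng.random((32, 32))))
probe_same = degrade(gallery)             # degraded view of the same identity
probe_other = degrade(rng.random((32, 32)))

same_score = cosine(gallery, probe_same)
other_score = cosine(gallery, probe_other)
```

The degradation destroys high-frequency detail but preserves the low-frequency structure that still separates matching from non-matching pairs, which is the gap the paper's pipeline then tries to widen.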