{"title":"Inheritable Fisher vector feature for kinship verification","authors":"Qingfeng Liu, Ajit Puthenputhussery, Chengjun Liu","doi":"10.1109/BTAS.2015.7358768","DOIUrl":"https://doi.org/10.1109/BTAS.2015.7358768","url":null,"abstract":"An innovative inheritable Fisher vector feature (IFVF) method is presented in this paper for kinship verification. Specifically, Fisher vector is first derived for each image by aggregating the densely sampled SIFT features in the opponent color space. Second, a new inheritable transformation, which maximizes the similarity between kinship images while minimizes that between non-kinship images for each image pair simultaneously, is learned based on the Fisher vectors. As a result, the IFVF is derived by applying the inheritable transformation on the Fisher vector for each image. Finally, a novel fractional power cosine similarity measure, which shows its theoretical roots in the Bayes decision rule for minimum error, is proposed for kinship verification. Experimental results on two representative kinship data sets, namely the KinFaceW-I and the KinFaceW-II data sets, show the feasibility of the proposed method.","PeriodicalId":404972,"journal":{"name":"2015 IEEE 7th International Conference on Biometrics Theory, Applications and Systems (BTAS)","volume":"26 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-12-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116703762","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Robust biometrics recognition using joint weighted dictionary learning and smoothed L0 norm","authors":"R. Khorsandi, A. Taalimi, M. Abdel-Mottaleb","doi":"10.1109/BTAS.2015.7358792","DOIUrl":"https://doi.org/10.1109/BTAS.2015.7358792","url":null,"abstract":"In this paper, we present an automated system for robust biometric recognition based upon sparse representation and dictionary learning. In sparse representation, extracted features from the training data are used to develop a dictionary. Classification is achieved by representing the extracted features of the test data as a linear combination of entries in the dictionary. Dictionary learning for sparse representation has shown to improve the results in classification and recognition tasks since class labels can be used in obtaining the atoms of learnt dictionary. We propose a joint weighted dictionary learning which simultaneously learns from a set of training samples an over complete dictionary along with weight vectors that correspond to the atoms in the learnt dictionary. The components of the weight vector associated with an atom represent the relationship between the atom and each of the classes. The weight vectors and atoms are jointly obtained during the dictionary learning. In the proposed method, a constraint is imposed on the correlation between the obtained atoms that represent different classes to decrease the similarity between these atoms. In addition, we use smoothed L0 norm which is a fast algorithm to find the sparsest solution. 
Experiments conducted on the West Virginia University (WVU) and the University of Notre Dame (UND) datasets for ear recognition show that the proposed method outperforms other state-of-the-art classifiers.","PeriodicalId":404972,"journal":{"name":"2015 IEEE 7th International Conference on Biometrics Theory, Applications and Systems (BTAS)","volume":"49 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-12-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126123680","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"ECG biometric authentication using a dynamical model","authors":"Abhijit Sarkar, A. L. Abbott, Zachary R. Doerzaph","doi":"10.1109/BTAS.2015.7358757","DOIUrl":"https://doi.org/10.1109/BTAS.2015.7358757","url":null,"abstract":"This paper concerns the authentication of individuals through analysis of electrocardiogram (ECG) signals. Because the human heart differs physiologically from one person to the next, ECG signals represent a rich source of information that offers strong potential for authentication or identification. We describe a novel approach to ECG-based biometrics in which a dynamical-systems model is employed, resulting in improved registration of pulses as compared to previous techniques. Parameters at the fiducial points are detected using a sum-of-Gaussians representation, resulting in an 18-component feature vector that can be used for classification. Using a publicly available dataset of ECG signals from 47 participants, a classifier was formulated using quadratic discriminant analysis (QDA). The observed mean authentication accuracies were 90% and 97% using 100 beats and 300 beats, respectively. Although tested with standard ECG signals only, we believe that the approach can be extended to other sensor types, such as fingertip-ECG devices.","PeriodicalId":404972,"journal":{"name":"2015 IEEE 7th International Conference on Biometrics Theory, Applications and Systems (BTAS)","volume":"3 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-12-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126613539","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Pose-robust face signature for multi-view face recognition","authors":"Pengfei Dou, Lingfeng Zhang, Yuhang Wu, S. Shah, I. Kakadiaris","doi":"10.1109/BTAS.2015.7358788","DOIUrl":"https://doi.org/10.1109/BTAS.2015.7358788","url":null,"abstract":"Despite the great progress achieved in unconstrained face recognition, pose variations still remain a challenging and unsolved practical issue. We propose a novel framework for multi-view face recognition based on extracting and matching pose-robust face signatures from 2D images. Specifically, we propose an efficient method for monocular 3D face reconstruction, which is used to lift the 2D facial appearance to a canonical texture space and estimate the self-occlusion. On the lifted facial texture we then extract various local features, which are further enhanced by the occlusion encodings computed on the self-occlusion mask, resulting in a pose-robust face signature, a novel feature representation of the original 2D facial image. Extensive experiments on two public datasets demonstrate that our method not only simplifies the matching of multi-view 2D facial images by circumventing the requirement for pose-adaptive classifiers, but also achieves superior performance.","PeriodicalId":404972,"journal":{"name":"2015 IEEE 7th International Conference on Biometrics Theory, Applications and Systems (BTAS)","volume":"400 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-12-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122721128","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Spoofing key-press latencies with a generative keystroke dynamics model","authors":"John V. Monaco, M. Ali, C. Tappert","doi":"10.1109/BTAS.2015.7358795","DOIUrl":"https://doi.org/10.1109/BTAS.2015.7358795","url":null,"abstract":"This work provides strong empirical evidence for a two-state generative model of typing behavior in which the user can be in either a passive or active state. Given key-press latencies with missing key names, the model is then used to spoof the key-press latencies of a user by exploiting the scaling behavior between inter-key distance and key-press latency. Key-press latencies with missing key names can be remotely obtained over a network by observing traffic from an interactive application, such as SSH in interactive mode. The proposed generative model uses this partial information to perform a key-press-only sample-level attack on a victim's keystroke dynamics template. Results show that some users are more susceptible to this type of attack than others. For about 10% of users, the spoofed samples obtain classifier output scores of at least 50% of those obtained by authentic samples. With at least 50 observed keystrokes, the chance of success over a zero-effort attack doubles on average.","PeriodicalId":404972,"journal":{"name":"2015 IEEE 7th International Conference on Biometrics Theory, Applications and Systems (BTAS)","volume":"13 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-12-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128605336","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"e-BioSign tool: Towards scientific assessment of dynamic signatures under forensic conditions","authors":"R. Vera-Rodríguez, Julian Fierrez, J. Ortega-Garcia, A. Acien, Rubén Tolosana","doi":"10.1109/BTAS.2015.7358756","DOIUrl":"https://doi.org/10.1109/BTAS.2015.7358756","url":null,"abstract":"This paper presents a new tool specifically designed to carry out dynamic signature forensic analysis and give scientific support to forensic handwriting examiners (FHEs). Traditionally FHEs have performed forensic analysis of paper-based signatures for court cases, but with the rapid evolution of the technology, nowadays they are being asked to carry out analysis based on signatures acquired by digitizing tablets more and more often. In some cases, an option followed has been to obtain a paper impression of these signatures and carry out a traditional analysis, but there are many deficiencies in this approach regarding the low spatial resolution of some devices compared to original off-line signatures and also the fact that the dynamic information, which has been proved to be very discriminative by the biometric community, is lost and not taken into account at all. The tool we present in this paper allows the FHEs to carry out a forensic analysis taking into account both the traditional off-line information normally used in paper-based signature analysis, and also the dynamic information of the signatures. 
Additionally, the tool incorporates two important functionalities, the first is the provision of statistical support to the analysis by including population statistics for genuine and forged signatures for some selected features, and the second is the incorporation of an automatic dynamic signature matcher, from which a likelihood ratio (LR) can be obtained from the matching comparison between the known and questioned signatures under analysis.","PeriodicalId":404972,"journal":{"name":"2015 IEEE 7th International Conference on Biometrics Theory, Applications and Systems (BTAS)","volume":"70 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-12-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133907948","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Acquiring high-resolution face images in outdoor environments: A master-slave calibration algorithm","authors":"J. Neves, J. Moreno, Silvio Barra, Hugo Proença","doi":"10.1109/BTAS.2015.7358744","DOIUrl":"https://doi.org/10.1109/BTAS.2015.7358744","url":null,"abstract":"Facial recognition at-a-distance in surveillance scenarios remains an open problem, particularly due to the small number of pixels representing the facial region. The use of pan-tilt-zoom (PTZ) cameras has been advocated to solve this problem, however, the existing approaches either rely on rough approximations or additional constraints to estimate the mapping between image coordinates and pan-tilt parameters. In this paper, we aim at extending PTZ-assisted facial recognition to surveillance scenarios by proposing a master-slave calibration algorithm capable of accurately estimating pan-tilt parameters without depending on additional constraints. Our approach exploits geometric cues to automatically estimate subjects height and thus determine their 3D position. Experimental results show that the presented algorithm is able to acquire high-resolution face images at a distance ranging from 5 to 40 meters with high success rate. 
Additionally, we certify the applicability of the aforementioned algorithm to biometric recognition through a face recognition test, comprising 20 probe subjects and 13,020 gallery subjects.","PeriodicalId":404972,"journal":{"name":"2015 IEEE 7th International Conference on Biometrics Theory, Applications and Systems (BTAS)","volume":"8 1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-12-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132636052","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Post-mortem iris biometric analysis in Sus scrofa domesticus","authors":"S. Saripalle, Adam McLaughlin, R. Krishna, A. Ross, R. Derakhshani","doi":"10.1109/BTAS.2015.7358789","DOIUrl":"https://doi.org/10.1109/BTAS.2015.7358789","url":null,"abstract":"Although biometric utility of ante-mortem human iris tissue has been long established, post-mortem study of human iris tissue for its biometric utility has only been speculated. Given obstacles in measuring and analyzing biometric capability of post-mortem human iris tissue, an investigation into the feasibility of using post-mortem Sus scrofa domesticus iris tissue as a biometric is undertaken. The contributions of our work are two-fold: first, our method discusses a feasible alternative to human iris for study of post-mortem iris biometric analysis. Second, we report the performance of iris biometrics over a period of time after death. Previous studies have only reported qualitative changes in iris after death while for the first time we measure the biometric capacity of post-mortem iris tissue.","PeriodicalId":404972,"journal":{"name":"2015 IEEE 7th International Conference on Biometrics Theory, Applications and Systems (BTAS)","volume":"14 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-12-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127865272","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"On humanoid robots imitating human touch gestures on the smart phone","authors":"Sujit Poudel, Abdul Serwadda, V. Phoha","doi":"10.1109/BTAS.2015.7358781","DOIUrl":"https://doi.org/10.1109/BTAS.2015.7358781","url":null,"abstract":"We showcase an attack in which an autonomous humanoid robot is trained to execute touch gestures that match those of a target user. Different from past work which addressed a similar problem using a Lego robot, we harness the significant processing power and unique motoric capabilities of the autonomous humanoid robot to implement an attack that: (1) executes touch gestures with high precision, (2) is easily adapted to execute gestures on different touch screen devices, and (3) requires minimal human involvement. Relative to the traditional zero-effort impostor attacks, we show, based on a dataset of 26 users, that our attack significantly degrades the performance of touch-based authentication systems. In addition to the paper highlighting the threat that sophisticated adversaries pose to touch-based authentication systems, our robotic attack design provides a blueprint for much needed impostor testing mechanisms that simulate algorithmic (or sophisticated) adversaries against touch-based authentication systems.","PeriodicalId":404972,"journal":{"name":"2015 IEEE 7th International Conference on Biometrics Theory, Applications and Systems (BTAS)","volume":"78 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-12-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125747258","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A Leap Password based verification system","authors":"A. Chahar, Shivangi Yadav, Ishan Nigam, Richa Singh, Mayank Vatsa","doi":"10.1109/BTAS.2015.7358745","DOIUrl":"https://doi.org/10.1109/BTAS.2015.7358745","url":null,"abstract":"Recent developments in three-dimensional sensing devices has led to the proposal of a number of biometric modalities for non-critical scenarios. Leap Motion device has received attention from Vision and Biometrics community due to its high precision tracking. In this research, we propose Leap Password; a novel approach for biometric authentication. The Leap Password consists of a string of successive gestures performed by the user during which physiological as well as behavioral information is captured. The Conditional Mutual Information Maximization algorithm selects the optimal feature set from the extracted information. Match-score fusion is performed to reconcile information from multiple classifiers. Experiments are performed on the Leap Password Dataset, which consists of over 1700 samples obtained from 150 subjects. An accuracy of over 81% is achieved, which shows the effectiveness of the proposed approach.","PeriodicalId":404972,"journal":{"name":"2015 IEEE 7th International Conference on Biometrics Theory, Applications and Systems (BTAS)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-12-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127465878","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}