{"title":"通过智能手机感知舌头和嘴唇运动的身份验证","authors":"Aslan B. Wong","doi":"10.1109/SECON52354.2021.9491596","DOIUrl":null,"url":null,"abstract":"Current voice-based user authentication explores the unique characteristics from either the voiceprint or mouth movements, which are at risk to replay attacks. During speaking, the vocal tract, tongue, and lip, including the static shape and dynamic movements, expose individual uniqueness, and adversaries hardly imitate them. Moreover, most voice-based user authentications are passphrase-dependent, which significantly reduces the user experience. Therefore, our work aims to employ the individual uniqueness of vocal tract, tongue, lip movement to realize user authentication on a smartphone. This paper presents a new authentication framework to identify smartphone users through articulation, namely tongue and lip motion reading. The main idea is to capture acoustic and ultrasonic signals from a mobile phone and analyze the fine-grained impact of articulation movement on the uttered words. We currently develop a passphrase-independent authentication model by analyzing the articulation in continuous speech, exploring different scenarios, and creating a passphrase-independent authentication model.","PeriodicalId":120945,"journal":{"name":"2021 18th Annual IEEE International Conference on Sensing, Communication, and Networking (SECON)","volume":"22 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-07-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"3","resultStr":"{\"title\":\"Authentication through Sensing of Tongue and Lip Motion via Smartphone\",\"authors\":\"Aslan B. Wong\",\"doi\":\"10.1109/SECON52354.2021.9491596\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Current voice-based user authentication explores the unique characteristics from either the voiceprint or mouth movements, which are at risk to replay attacks. 
During speaking, the vocal tract, tongue, and lip, including the static shape and dynamic movements, expose individual uniqueness, and adversaries hardly imitate them. Moreover, most voice-based user authentications are passphrase-dependent, which significantly reduces the user experience. Therefore, our work aims to employ the individual uniqueness of vocal tract, tongue, lip movement to realize user authentication on a smartphone. This paper presents a new authentication framework to identify smartphone users through articulation, namely tongue and lip motion reading. The main idea is to capture acoustic and ultrasonic signals from a mobile phone and analyze the fine-grained impact of articulation movement on the uttered words. We currently develop a passphrase-independent authentication model by analyzing the articulation in continuous speech, exploring different scenarios, and creating a passphrase-independent authentication model.\",\"PeriodicalId\":120945,\"journal\":{\"name\":\"2021 18th Annual IEEE International Conference on Sensing, Communication, and Networking (SECON)\",\"volume\":\"22 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2021-07-06\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"3\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2021 18th Annual IEEE International Conference on Sensing, Communication, and Networking (SECON)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/SECON52354.2021.9491596\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 18th Annual IEEE International Conference on Sensing, Communication, and Networking 
(SECON)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/SECON52354.2021.9491596","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Authentication through Sensing of Tongue and Lip Motion via Smartphone
Current voice-based user authentication relies on unique characteristics of either the voiceprint or mouth movements, both of which are vulnerable to replay attacks. During speech, the vocal tract, tongue, and lips, in both their static shapes and dynamic movements, expose individual uniqueness that adversaries can hardly imitate. Moreover, most voice-based authentication schemes are passphrase-dependent, which significantly degrades the user experience. Our work therefore exploits the individual uniqueness of vocal-tract, tongue, and lip movement to realize user authentication on a smartphone. This paper presents a new authentication framework that identifies smartphone users through articulation, namely tongue and lip motion reading. The main idea is to capture acoustic and ultrasonic signals with a mobile phone and analyze the fine-grained impact of articulatory movement on the uttered words. We are currently developing a passphrase-independent authentication model by analyzing articulation in continuous speech across different scenarios.
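The core sensing idea, emitting a near-ultrasonic tone from the phone speaker and reading articulator motion from Doppler shifts in its echo at the microphone, can be sketched as below. This is a minimal illustrative simulation, not the paper's actual pipeline: the 20 kHz carrier, frame length, search band, and the `doppler_profile` helper are all assumed parameters chosen for the sketch.

```python
import numpy as np

FS = 48_000         # typical smartphone audio sampling rate
F_CARRIER = 20_000  # near-ultrasonic pilot tone (hypothetical choice)

def doppler_profile(recording, fs=FS, f0=F_CARRIER, frame=2048):
    """Estimate the per-frame frequency shift of the reflected pilot tone.

    Lip and tongue motion Doppler-shifts the echo of the carrier; the
    sequence of shifts forms a motion signature that a classifier could
    match against an enrolled user profile (illustrative sketch only).
    """
    freqs = np.fft.rfftfreq(frame, 1.0 / fs)
    band = (freqs > f0 - 500) & (freqs < f0 + 500)  # search near the carrier
    shifts = []
    for start in range(0, len(recording) - frame + 1, frame):
        seg = recording[start:start + frame] * np.hanning(frame)
        spec = np.abs(np.fft.rfft(seg))
        shifts.append(freqs[band][np.argmax(spec[band])] - f0)
    return np.array(shifts)

# Simulated echo: the carrier frequency wobbles +/-30 Hz at ~3 Hz,
# mimicking slow articulator motion in front of the microphone.
t = np.arange(FS) / FS                               # one second of audio
inst_f = F_CARRIER + 30 * np.sin(2 * np.pi * 3 * t)  # instantaneous frequency
echo = np.sin(2 * np.pi * np.cumsum(inst_f) / FS)
shifts = doppler_profile(echo)                       # Hz of shift per frame
```

A real system would run this continuously while the user speaks and feed the shift sequence, together with the audible-band speech features, into the authentication model.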