Leakage of Sensitive Information to Third-Party Voice Applications

M. Bispham, Clara Zard, S. Sattar, Xavier Ferrer Aran, Guillermo Suarez-Tangil, J. Such

Proceedings of the 4th Conference on Conversational User Interfaces, 26 July 2022. DOI: 10.1145/3543829.3544520

Abstract: In this paper we investigate the issue of sensitive information leakage to third-party voice applications in voice assistant ecosystems. We focus specifically on leakage of sensitive information via the conversational interface. We use a bespoke testing infrastructure to investigate leakage of sensitive information via the conversational interface of Google Actions and Alexa Skills. Our work augments prior work in this area to consider not only specific categories of personal data, but also other types of potentially sensitive information that may be disclosed in voice-based interactions with third-party voice applications. Our findings indicate that current privacy and security measures for third-party voice applications are not sufficient to prevent leakage of all types of sensitive information via the conversational interface. We make key recommendations for the redesign of voice assistant architectures to better prevent leakage of sensitive information via the conversational interface of third-party voice applications in the future.
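The paper's bespoke testing infrastructure is not reproduced in this record. As a purely hypothetical illustration of the kind of check such infrastructure might apply to transcripts of voice-app conversations, the sketch below flags utterances that appear to disclose sensitive information. The category names and patterns are illustrative assumptions, not the authors' actual taxonomy or detection method.

```python
import re

# Hypothetical sensitive-information categories and trigger patterns.
# These are illustrative only; the paper's actual categories and
# detection logic are not published in this record.
SENSITIVE_PATTERNS = {
    "payment_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "health": re.compile(r"\b(diagnos\w+|prescription|medication)\b", re.I),
}

def flag_sensitive(utterance: str) -> list[str]:
    """Return the categories of sensitive information detected in a
    transcribed user utterance sent to a third-party voice app."""
    return [name for name, pattern in SENSITIVE_PATTERNS.items()
            if pattern.search(utterance)]
```

A harness built around a check like this could replay scripted dialogues against Google Actions or Alexa Skills and log which turns would transmit flagged content to the third party.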