Automating accountability? Privacy policies, data transparency, and the third party problem

D. Lie, Lisa M. Austin, Peter Yi Ping Sun, Wen Qiu

University of Toronto Law Journal, vol. 72, issue 1, pp. 155–188. Published 9 December 2021. DOI: 10.3138/utlj-2020-0136
Abstract: We have a data transparency problem. Currently, one of the main mechanisms we have for understanding data flows is the self-reporting that organizations provide through privacy policies. These suffer from many well-known problems, problems that are becoming more acute with the increasing complexity of the data ecosystem and the role of third parties – the affiliates, partners, processors, ad agencies, analytic services, and data brokers involved in the contemporary data practices of organizations. In this article, we argue that automating privacy policy analysis can improve the usability of privacy policies as a transparency mechanism. Our argument has five parts. First, we claim that we need to shift from thinking about privacy policies as a transparency mechanism that enhances consumer choice to seeing them as a transparency mechanism that enhances meaningful accountability. Second, we discuss a research tool that we prototyped, called AppTrans (for Application Transparency), which can detect inconsistencies between the declarations in a privacy policy and the actions the mobile application can potentially take if it is used. We used AppTrans to test seven hundred applications and found that 59.5 per cent were collecting data in ways that were not declared in their policies. The vast majority of the discrepancies were due to third party data collection, such as advertising and analytics. Third, we outline the follow-on research we did to extend AppTrans to analyse the information sharing of mobile applications with third parties, with mixed results. Fourth, we situate our findings in relation to the third party issues that came to light in the recent Cambridge Analytica scandal and the calls from regulators for enhanced technical safeguards in managing these third party relationships. Fifth, we discuss some of the limitations of privacy policy automation as a strategy for enhanced data transparency and the policy implications of these limitations.
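The abstract describes AppTrans only at the level of what it checks: whether the data collection an app can potentially perform is declared in its privacy policy. The article text, not reproduced here, contains the actual methodology, so the following Python sketch is purely illustrative. Every name in it is a hypothetical assumption, the permission-to-data-type map and keyword matching are stand-ins for the tool's real analysis, and AppTrans itself would need static analysis of the application (not just a manifest permission list) to determine what it "can potentially" do.

```python
# Illustrative sketch only: a simplified policy-versus-app consistency
# check in the spirit of AppTrans. The permission map, the keyword scan,
# and all identifiers are assumptions for illustration, not the authors'
# method.

# Hypothetical mapping from Android permissions to the data type an app
# holding that permission could potentially collect.
PERMISSION_TO_DATA_TYPE = {
    "android.permission.ACCESS_FINE_LOCATION": "location",
    "android.permission.READ_CONTACTS": "contacts",
    "android.permission.CAMERA": "camera",
    "android.permission.READ_PHONE_STATE": "device identifiers",
}


def declared_data_types(policy_text: str) -> set[str]:
    """Naive keyword scan standing in for real NLP analysis of a policy."""
    text = policy_text.lower()
    return {dt for dt in PERMISSION_TO_DATA_TYPE.values() if dt in text}


def potential_data_types(manifest_permissions: list[str]) -> set[str]:
    """Data types the app could collect, inferred from its permissions."""
    return {PERMISSION_TO_DATA_TYPE[p]
            for p in manifest_permissions
            if p in PERMISSION_TO_DATA_TYPE}


def undeclared_collection(policy_text: str,
                          manifest_permissions: list[str]) -> set[str]:
    """Data types the app can collect that the policy never declares."""
    return (potential_data_types(manifest_permissions)
            - declared_data_types(policy_text))


if __name__ == "__main__":
    policy = "We collect your location to provide nearby results."
    perms = ["android.permission.ACCESS_FINE_LOCATION",
             "android.permission.READ_CONTACTS"]
    print(undeclared_collection(policy, perms))  # {'contacts'}
```

At this crude level, an application counts toward the paper's 59.5 per cent finding whenever the set difference is non-empty. The abstract's observation that most discrepancies stem from third party data collection would surface here as permissions and behaviour introduced by embedded advertising or analytics libraries rather than by the host application's own code, which is precisely why the policy authored by the app developer fails to declare them.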