Constitutional implications of judicial review on the use of facial recognition technology by the police in UK

Song-Ok Kim
{"title":"Constitutional implications of judicial review on the use of facial recognition technology by the police in UK","authors":"Song-Ok Kim","doi":"10.24324/kiacl.2022.28.3.65","DOIUrl":null,"url":null,"abstract":"Between May 2017 and April 2019, the South Wales Police in the UK operated the system called “AFR Location” equipped with automated facial recognition technology (henceforth, AFR) for the purpose of finding criminals and missing persons in need of protection. The system is deployed CCTV cameras on police vehicles and analyzes the faces of members of the public are taken from the CCTV feeds and compares them with the faces of the Watchlist to check whether they are the same person. These systems were temporarily used at large public events, such as the Defence Exhibition, and had safeguards to control the risk of data processing, such as software management to delete data such as facial images and biometric data immediately or within 24 hours. However, a civil liberties campaigner filed a lawsuit in October 2018 to challenge the lawfulness of the South Wales Police’s use of AFR, stating the use of AFR violates his right of privacy. This was reportedly the first case to deal with the lawfulness of facial recognition technology by police in the worlds. \nThe Divisional Court dismissed the Claimant’s claim for judicial review on all grounds and ruled in favor of the South Wales Police, but the Court of Appeal affirmed the unlawfulness of the South Wales Police’s use of AFR. The main reason for the unlawfulness is the breach of the requirements of Article 8 of the European Convention on Human Rights. Article 8(1) stipulates “the right to respect private life”, and Article 8(2) provides that such rights shall be no interference by a public authority except such as is “in accordance with the law” and is necessary in a democratic society. According to this, Court of Appeal concluded that there is not a sufficient legal frameworks to properly control the use of AFR by the South Wales Police. There are not enough legal safeguards to properly control the use of AFR system, because there are not any criteria for determining where AFR can be deployed and who can be placed on the watchlist. \nThis judgment of the Court of Appeal has great implications for us. In light of the fact that Article 8(2) of the European Convention on Human Rights is similar to Article 37(2) of the Constitution of Republic of Korea, it provides implications for how the principle of rule of law should be interpreted in relation to the processing of personal and sensitive information processed by biometrics. In other words, whether or not the principle of rule of law has been uphold should be evaluated as whether there are sufficient safeguards to prevent the actual risk of abuse and arbitrary use, not just whether there is a legal basis of processing of data in the Missing Children Act, Act on the Performance of Duties by Police Officer, and the Personal Information Protection Act. 
In this dimension, this study attempted to suggest proposals of these Acts.","PeriodicalId":322578,"journal":{"name":"Korean Association of International Association of Constitutional Law","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-12-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Korean Association of International Association of Constitutional Law","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.24324/kiacl.2022.28.3.65","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

Between May 2017 and April 2019, the South Wales Police in the UK operated a system called "AFR Location", equipped with automated facial recognition technology (henceforth, AFR), for the purpose of finding criminals and missing persons in need of protection. The system deployed CCTV cameras mounted on police vehicles, extracted the faces of members of the public from the CCTV feeds, and compared them with faces on a watchlist to check whether they belonged to the same person. The system was used temporarily at large public events, such as a defence exhibition, and incorporated safeguards against the risks of data processing, such as software controls that deleted facial images and biometric data either immediately or within 24 hours. In October 2018, however, a civil liberties campaigner brought a claim challenging the lawfulness of the South Wales Police's use of AFR, arguing that it violated his right to privacy. This was reportedly the first case in the world to address the lawfulness of police use of facial recognition technology.

The Divisional Court dismissed the claimant's claim for judicial review on all grounds and ruled in favor of the South Wales Police, but the Court of Appeal held that the South Wales Police's use of AFR was unlawful. The principal ground of unlawfulness was a breach of the requirements of Article 8 of the European Convention on Human Rights. Article 8(1) guarantees the right to respect for private life, and Article 8(2) provides that there shall be no interference by a public authority with the exercise of this right except such as is "in accordance with the law" and is necessary in a democratic society. On this basis, the Court of Appeal concluded that there was no sufficient legal framework to properly control the South Wales Police's use of AFR: the legal safeguards were inadequate because there were no criteria governing where AFR could be deployed or who could be placed on the watchlist.

This judgment of the Court of Appeal carries significant implications for Korea. Because Article 8(2) of the European Convention on Human Rights is similar to Article 37(2) of the Constitution of the Republic of Korea, the judgment suggests how the principle of the rule of law should be interpreted in relation to the processing of personal and sensitive information derived from biometrics. In other words, whether the principle of the rule of law has been upheld should be assessed by asking whether there are sufficient safeguards against the real risk of abuse and arbitrary use, not merely whether a legal basis for processing the data exists in the Missing Children Act, the Act on the Performance of Duties by Police Officers, and the Personal Information Protection Act. On this basis, this study proposes amendments to these Acts.
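The abstract describes the AFR workflow only at a high level: faces captured from CCTV feeds are compared against a watchlist, and biometric data is deleted immediately or within 24 hours. The sketch below is a minimal, purely illustrative model of that kind of pipeline; the embedding representation, similarity threshold, retention window, and all class and function names are hypothetical assumptions for illustration and do not describe the actual South Wales Police system.

```python
# Illustrative sketch of a watchlist-matching pipeline with short-term data retention.
# All names, thresholds, and retention rules here are hypothetical assumptions.
from dataclasses import dataclass, field
from datetime import datetime, timedelta
from typing import List

import numpy as np


@dataclass
class Detection:
    """A face embedding captured from a CCTV feed, held only until matching is resolved."""
    embedding: np.ndarray
    captured_at: datetime


@dataclass
class WatchlistEntry:
    person_id: str
    embedding: np.ndarray


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


@dataclass
class AFRPipeline:
    watchlist: List[WatchlistEntry]
    threshold: float = 0.8                      # hypothetical match threshold
    retention: timedelta = timedelta(hours=24)  # retained data kept at most 24 hours
    pending: List[Detection] = field(default_factory=list)

    def process(self, detection: Detection) -> List[str]:
        """Compare one captured face against the watchlist.

        Matches are returned as alerts and retained briefly for follow-up;
        non-matching biometric data is not stored at all (deleted immediately).
        """
        alerts = [
            entry.person_id
            for entry in self.watchlist
            if cosine_similarity(detection.embedding, entry.embedding) >= self.threshold
        ]
        if alerts:
            self.pending.append(detection)
        return alerts

    def purge_expired(self, now: datetime) -> None:
        """Delete any retained data older than the retention window."""
        self.pending = [d for d in self.pending if now - d.captured_at < self.retention]
```

The point of the sketch is the legal one made in the abstract: the code fixes *how* data is matched and deleted, but nothing in it constrains *where* such a system may be deployed or *who* may be placed on the watchlist, which is precisely the gap in safeguards the Court of Appeal identified.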