Computer Law & Security Review: Latest Articles

Future themes in regulating artificial intelligence in investment management
IF 3.3 · CAS Zone 3 (Sociology)
Computer Law & Security Review Pub Date : 2025-03-06 DOI: 10.1016/j.clsr.2025.106111
Wojtek Buczynski , Felix Steffek , Mateja Jamnik , Fabio Cuzzolin , Barbara Sahakian
{"title":"Future themes in regulating artificial intelligence in investment management","authors":"Wojtek Buczynski ,&nbsp;Felix Steffek ,&nbsp;Mateja Jamnik ,&nbsp;Fabio Cuzzolin ,&nbsp;Barbara Sahakian","doi":"10.1016/j.clsr.2025.106111","DOIUrl":"10.1016/j.clsr.2025.106111","url":null,"abstract":"<div><div>We are witnessing the emergence of the “first generation” of AI and AI-adjacent soft and hard laws such as the EU AI Act or South Korea's Basic Act on AI. In parallel, existing industry regulations, such as GDPR, MIFID II or SM&amp;CR, are being “retrofitted” and reinterpreted from the perspective of AI. In this paper we identify and analyze ten novel, “second generation” themes which are likely to become regulatory considerations in the near future: non-personal data, managerial accountability, robo-advisory, generative AI, privacy enhancing techniques (PETs), profiling, emergent behaviours, smart contracts, ESG and algorithm management. The themes have been identified on the basis of ongoing developments in AI, existing regulations and industry discussions. Prior to making any new regulatory recommendations we explore whether novel issues can be solved by existing regulations. 
The contribution of this paper is a comprehensive picture of emerging regulatory considerations for AI in investment management, as well as broader financial services, and the ways they might be addressed by regulations – future or existing ones.</div></div>","PeriodicalId":51516,"journal":{"name":"Computer Law & Security Review","volume":"56 ","pages":"Article 106111"},"PeriodicalIF":3.3,"publicationDate":"2025-03-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143548373","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"社会学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 0
The rise of technology courts, or: How technology companies re-invent adjudication for a digital world
Computer Law & Security Review Pub Date : 2025-03-05 DOI: 10.1016/j.clsr.2025.106118
Natali Helberger
{"title":"The rise of technology courts, or: How technology companies re-invent adjudication for a digital world","authors":"Natali Helberger","doi":"10.1016/j.clsr.2025.106118","DOIUrl":"10.1016/j.clsr.2025.106118","url":null,"abstract":"<div><div>The article “The Rise of Technology Courts” explores the evolving role of courts in the digital world, where technological advancements and artificial intelligence (AI) are transforming traditional adjudication processes. It argues that traditional courts are undergoing a significant transition due to digitization and the increasing influence of technology companies. The paper frames this transformation through the concept of the “sphere of the digital,” which explains how digital technology and AI redefine societal expectations of what courts should be and how they function.</div><div>The article highlights that technology is not only changing the materiality of courts—moving from physical buildings to digital portals—but also affecting their symbolic function as public institutions. It discusses the emergence of AI-powered judicial services, online dispute resolution (ODR), and technology-driven alternative adjudication bodies like the Meta Oversight Board. These developments challenge the traditional notions of judicial authority, jurisdiction, and legal expertise.</div><div>The paper concludes that while these technology-driven solutions offer increased efficiency and accessibility, they also raise fundamental questions about the legitimacy, transparency, and independence of adjudicatory bodies. 
As technology companies continue to shape digital justice, the article also argues that there are lessons to learn for the role and structure of traditional courts to ensure that human rights and public values are upheld.</div></div>","PeriodicalId":51516,"journal":{"name":"Computer Law & Security Review","volume":"56 ","pages":"Article 106118"},"PeriodicalIF":3.3,"publicationDate":"2025-03-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143548374","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"社会学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 0
Computer Law & Security Review Pub Date : 2025-03-03 DOI: 10.1016/j.clsr.2025.106109
Kevin Macnish
{"title":"","authors":"Kevin Macnish","doi":"10.1016/j.clsr.2025.106109","DOIUrl":"10.1016/j.clsr.2025.106109","url":null,"abstract":"","PeriodicalId":51516,"journal":{"name":"Computer Law & Security Review","volume":"56 ","pages":"Article 106109"},"PeriodicalIF":3.3,"publicationDate":"2025-03-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143529187","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"社会学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 0
Assessing the (severity of) impacts on fundamental rights
Computer Law & Security Review Pub Date : 2025-02-28 DOI: 10.1016/j.clsr.2025.106113
Gianclaudio Malgieri , Cristiana Santos
{"title":"Assessing the (severity of) impacts on fundamental rights","authors":"Gianclaudio Malgieri ,&nbsp;Cristiana Santos","doi":"10.1016/j.clsr.2025.106113","DOIUrl":"10.1016/j.clsr.2025.106113","url":null,"abstract":"<div><div>\"Risk to fundamental rights,\", \"impact on fundamental rights\", \"harm to fundamental rights\" and \"non-material damages\" are all terms referring to similar problems, though inherently ambiguous and very problematic, especially in the age of AI-based technologies and digital platforms. Traditionally, legal and social sciences have two different approaches to analysing the impacts on fundamental rights: the rights-based approach and the risk of harm-based approach to fundamental rights. The rights-based approach is binary, focusing on whether rights and obligations are respected or violated. In contrast, a harm-based approach focuses on the anticipation of undesired events and measuring their likelihood and severity. However, focusing solely on \"harms'' or \"damages'' is reductionist, while existing impact assessment models often use vague terms like \"gravity\", \"intensity,\" and \"magnitude\", which do not effectively help measure interferences with fundamental rights. Without operational criteria to measure these risks, most EU digital strategies demanding impact and risk assessments fail. Examples include the Data Protection Impact Assessment (DPIA) in the GDPR, Fundamental Rights Impact Assessments (FRIA) in the AI Act, and systemic risk assessments in the Digital Services Act (DSA). We posit that interferences with fundamental rights are seen as a spectrum that ranges from social contacts to violations, and these interferences can and should be measured. Thus, this article proposes a rights-based approach, combining it with elements from the harm approach and proposes an actionable parameter-based framework (also based on social meaning theories and social perception methodologies) to assess impacts on fundamental rights. 
The proposed multi-metric approach ensures a comprehensive assessment of the <em>severity</em> of impacts on fundamental rights within EU law, particularly in GDPR, DSA, and AI Act. This approach aims to inform policymaking, prioritise high-risk scenarios and propose mitigation measures in digital markets. This is especially important for detecting and addressing human vulnerabilities in interactions with digital technologies.</div></div>","PeriodicalId":51516,"journal":{"name":"Computer Law & Security Review","volume":"56 ","pages":"Article 106113"},"PeriodicalIF":3.3,"publicationDate":"2025-02-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143520485","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"社会学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 0
Asia-Pacific Developments
Computer Law & Security Review Pub Date : 2025-02-28 DOI: 10.1016/j.clsr.2025.106116
Gabriela Kennedy , Joanna Wong , Justin Lai , James North , Philip Catania , Michael do Rozario , Jack Matthews , Arun Babu , Gayathri Poti , Ishita Vats , Kiyoko Nakaoka , Lam Chung Nian , Emma Choe
{"title":"Asia-Pacific Developments","authors":"Gabriela Kennedy ,&nbsp;Joanna Wong ,&nbsp;Justin Lai ,&nbsp;James North ,&nbsp;Philip Catania ,&nbsp;Michael do Rozario ,&nbsp;Jack Matthews ,&nbsp;Arun Babu ,&nbsp;Gayathri Poti ,&nbsp;Ishita Vats ,&nbsp;Kiyoko Nakaoka ,&nbsp;Lam Chung Nian ,&nbsp;Emma Choe","doi":"10.1016/j.clsr.2025.106116","DOIUrl":"10.1016/j.clsr.2025.106116","url":null,"abstract":"<div><div>This column provides a country by country analysis of the latest legal developments, cases and issues relevant to the IT, media and telecommunications' industries in key jurisdictions across the Asia Pacific region. The articles appearing in this column are intended to serve as ‘alerts’ and are not submitted as detailed analyses of cases or legal developments.</div></div>","PeriodicalId":51516,"journal":{"name":"Computer Law & Security Review","volume":"56 ","pages":"Article 106116"},"PeriodicalIF":3.3,"publicationDate":"2025-02-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143601100","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"社会学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 0
European national news
Computer Law & Security Review Pub Date : 2025-02-25 DOI: 10.1016/j.clsr.2025.106114
Nick Pantlin
{"title":"European national news","authors":"Nick Pantlin","doi":"10.1016/j.clsr.2025.106114","DOIUrl":"10.1016/j.clsr.2025.106114","url":null,"abstract":"<div><div>This article tracks developments at the national level in key European countries in the area of IT and communications and provides a concise alerting service of important national developments. It is co-ordinated by Herbert Smith Freehills LLP and contributed to by firms across Europe. This column provides a concise alerting service of important national developments in key European countries. Part of its purpose is to complement the Journal's feature articles and briefing notes by keeping readers abreast of what is currently happening “on the ground” at a national level in implementing EU level legislation and international conventions and treaties. Where an item of European National News is of particular significance, CLSR may also cover it in more detail in the current or a subsequent edition.</div><div>© 2025 Herbert Smith Freehills LLP. Published by Elsevier Ltd. All rights reserved.</div></div>","PeriodicalId":51516,"journal":{"name":"Computer Law & Security Review","volume":"56 ","pages":"Article 106114"},"PeriodicalIF":3.3,"publicationDate":"2025-02-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143600490","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"社会学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 0
My AI, my code, my secret – Trade secrecy, informational transparency and meaningful litigant participation under the European Union's AI Liability Directive Proposal
Computer Law & Security Review Pub Date : 2025-02-21 DOI: 10.1016/j.clsr.2025.106117
Ljupcho Grozdanovski
{"title":"My AI, my code, my secret – Trade secrecy, informational transparency and meaningful litigant participation under the European Union's AI Liability Directive Proposal","authors":"Ljupcho Grozdanovski","doi":"10.1016/j.clsr.2025.106117","DOIUrl":"10.1016/j.clsr.2025.106117","url":null,"abstract":"<div><div>In European Union (EU) law, the AI Liability Directive (AILD) proposal included a right for victims of harm caused by high-risk AI systems to request the disclosure of relevant evidence. That right is, however, limited by the protection of trade secrets. During legal proceedings, business confidentiality can indeed restrict the victims’ access to evidence, potentially precluding them from fully understanding the disputed facts and effectively making their views known before a court. This article examines whether the AILD provided sufficient procedural mechanisms to ensure that litigants can effectively participate in judicial proceedings, even when critical evidence is withheld from them, due to legitimate trade secret protections. 
Our analysis draws on the evidentiary challenges highlighted in emerging global AI liability cases and selected CJEU case law, which provide guidance on how a balance can be struck between legitimate confidentiality and a workable level of informational transparency, necessary for an informed and fair resolution of future AI liability disputes.</div></div>","PeriodicalId":51516,"journal":{"name":"Computer Law & Security Review","volume":"56 ","pages":"Article 106117"},"PeriodicalIF":3.3,"publicationDate":"2025-02-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143453793","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"社会学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 0
Using sensitive data to de-bias AI systems: Article 10(5) of the EU AI act
Computer Law & Security Review Pub Date : 2025-02-16 DOI: 10.1016/j.clsr.2025.106115
Marvin van Bekkum
{"title":"Using sensitive data to de-bias AI systems: Article 10(5) of the EU AI act","authors":"Marvin van Bekkum","doi":"10.1016/j.clsr.2025.106115","DOIUrl":"10.1016/j.clsr.2025.106115","url":null,"abstract":"<div><div>In June 2024, the EU AI Act came into force. The AI Act includes obligations for the provider of an AI system. Article 10 of the AI Act includes a new obligation for providers to evaluate whether their training, validation and testing datasets meet certain quality criteria, including an appropriate examination of biases in the datasets and correction measures. With the obligation comes a new provision in Article 10(5) AI Act, allowing providers to collect sensitive data to fulfil the obligation. Article 10(5) AI Act aims to prevent discrimination. In this paper, I investigate the scope and implications of Article 10(5) AI Act. The paper primarily concerns European Union law, but may be relevant in other parts of the world, as policymakers aim to regulate biases in AI systems.</div></div>","PeriodicalId":51516,"journal":{"name":"Computer Law & Security Review","volume":"56 ","pages":"Article 106115"},"PeriodicalIF":3.3,"publicationDate":"2025-02-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143419207","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"社会学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 0
Virtual justice, or justice virtually: Navigating the challenges in China’s adoption of virtual criminal justice
Computer Law & Security Review Pub Date : 2025-02-08 DOI: 10.1016/j.clsr.2025.106112
Han Qin , Li Chen
{"title":"Virtual justice, or justice virtually: Navigating the challenges in China’s adoption of virtual criminal justice","authors":"Han Qin ,&nbsp;Li Chen","doi":"10.1016/j.clsr.2025.106112","DOIUrl":"10.1016/j.clsr.2025.106112","url":null,"abstract":"<div><div>Positioned within China’s <em>Trial Informatization</em> framework, the availability of virtual litigation has played a crucial role in enhancing access to justice. In the criminal justice system, the implementation of virtual litigation has transformed various areas, including pre-trial interviews, simplified criminal procedures, witness testimony, commutation hearings, and the reception of petitions. However, these technological advancements pose challenges to the authority, legitimacy, engagement and public deterrence aspects of criminal trials. To address these challenges, virtual litigation should be reframed as a tool to effect incremental change and be limited in application to cases where in-person hearings and other court processes are unfeasible. Further, more stringent rules need to be imposed on the finding of an implicit acceptance by accused persons to a remote trial process so as to preserve their autonomy. Courts should bear responsibility for third-party interfaces utlised as part of the criminal justice process, such as video conferencing platforms or digital document repositories. 
Finally, on the other side of the bench, defense counsel should have an equal say as the prosecution in determining whether a trial is conducted remotely.</div></div>","PeriodicalId":51516,"journal":{"name":"Computer Law & Security Review","volume":"56 ","pages":"Article 106112"},"PeriodicalIF":3.3,"publicationDate":"2025-02-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143350252","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"社会学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 0
Adverse human rights impacts of dissemination of nonconsensual sexual deepfakes in the framework of European Convention on Human Rights: A victim-centered perspective
Computer Law & Security Review Pub Date : 2025-02-01 DOI: 10.1016/j.clsr.2025.106108
Can Yavuz
{"title":"Adverse human rights impacts of dissemination of nonconsensual sexual deepfakes in the framework of European Convention on Human Rights: A victim-centered perspective","authors":"Can Yavuz","doi":"10.1016/j.clsr.2025.106108","DOIUrl":"10.1016/j.clsr.2025.106108","url":null,"abstract":"<div><div>Generative artificial intelligence systems have advanced significantly over the past decade and can now generate synthetic but highly realistic audio, photo, and video, commonly referred to as deepfake. Image-based sexual abuse was the first widespread (mis)use of deepfake technology and continues to be the most common form of its misuse. However, further (empirical) research is needed to examine this phenomenon's adverse human rights implications. This paper analyses the potential adverse human rights impacts of the dissemination of nonconsensual sexual deepfakes in the framework of the European Convention on Human Rights and argues that the dissemination of such deepfakes can hinder the rights protected by the Convention. These include the right to respect for private and family life, as nonconsensual sexual deepfakes can undermine data protection, harm one's image and reputation, and compromise psychological integrity and personal autonomy. Additionally, such deepfakes can threaten freedom of expression by creating a silencing effect on public watchdogs, politicians, and private individuals. Finally, nonconsensual sexual deepfakes can impair the economic and moral rights of pornography performers by abusing their work and bodies to abuse others without authorization and compensation. 
These findings highlight that the Council of Europe member states must fulfil their obligations to provide effective protection against this technology-facilitated, gender-based, and sexual violence.</div></div>","PeriodicalId":51516,"journal":{"name":"Computer Law & Security Review","volume":"56 ","pages":"Article 106108"},"PeriodicalIF":3.3,"publicationDate":"2025-02-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143138755","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"社会学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 0