Balancing Security and Privacy: Web Bot Detection, Privacy Challenges, and Regulatory Compliance under the GDPR and AI Act
Javier Martínez Llamas, Koen Vranckaert, Davy Preuveneers, Wouter Joosen
Open Research Europe, vol. 5, p. 76, published 2025-03-24. DOI: https://doi.org/10.12688/openreseurope.19347.1
Abstract
This paper presents a comprehensive analysis of web bot activity, exploring both offensive and defensive perspectives within the context of modern web infrastructure. As bots play a dual role, enabling malicious activities like credential stuffing and scraping while also facilitating benign automation, distinguishing between humans, good bots, and bad bots has become increasingly critical. We examine the technical challenges of detecting web bots amidst large volumes of benign traffic, highlighting the privacy risks involved in monitoring users at scale. Additionally, the study dives into the use of Privacy Enhancing Technologies (PETs) to strike a balance between bot detection and user privacy. These technologies provide innovative approaches to minimising data exposure while maintaining the effectiveness of bot-detection mechanisms. Furthermore, we explore the legal and ethical considerations associated with bot detection, mapping the technical solutions to the regulatory frameworks set forth by the EU General Data Protection Regulation (GDPR) and the Artificial Intelligence Act (AI Act). By analysing these regulatory constraints, we provide insights into how organisations can ensure compliance while maintaining robust bot defence strategies, fostering a responsible approach to cybersecurity in a privacy-conscious world.