The EU Approach to Safeguard Children’s Rights on Video-Sharing Platforms: Jigsaw or Maze?
Valerie Verdoodt, Eva Lievens, Argyro Chatzinikolaou
Media and Communication, published 2023-10-31
DOI: 10.17645/mac.v11i4.7059
Citations: 1
Abstract
Children are keen consumers of audiovisual media content. Video-sharing platforms (VSPs), such as YouTube and TikTok, offer a wealth of child-friendly or child-appropriate content but also content which—depending on the age of the child—might be considered inappropriate or potentially harmful. Moreover, such VSPs often deploy algorithmic recommender systems to personalise the content that children are exposed to (e.g., through auto-play features), leading to concerns about diversity of content or spirals of content related to, for instance, eating disorders or self-harm. This article explores the responsibilities of VSPs with respect to children that are imposed by existing, recently adopted, and proposed EU legislation. Instruments that we investigate include the Audiovisual Media Services Directive, the General Data Protection Regulation, the Digital Services Act, and the proposal for an Artificial Intelligence Act. Based on a legal study of policy documents, legislation, and scholarship, this contribution investigates to what extent this legislative framework sets obligations for VSPs to safeguard children’s rights and discusses how these obligations align across different legislative instruments.
About the journal:
Media and Communication (ISSN: 2183-2439) is an international open access journal dedicated to a wide variety of basic and applied research in communication and its related fields.