Title: Educators' data literacy
Author: J. Raffaghelli
In: Learning to Live with Datafication
DOI: 10.4324/9781003136842-5 (https://doi.org/10.4324/9781003136842-5)
Publication date: 2022-01-14
Citations: 1
Abstract
Data extraction and algorithmic manipulation have become increasingly frequent across the globe – hence the expression “datafied society”. Initially perceived as an opportunity for innovation across many areas of human knowledge (Stephens-Davidowitz, 2017), data practices have had unanticipated impacts. Race, gender and other vulnerable characteristics have been made invisible, overrepresented or over-tracked by certain forms of data visualisation (Ricaurte, 2019). Automations based on machine learning have introduced perilous biases in the way they represent or neglect relevant cultural perspectives (Malik, 2020). Moreover, they have led to bias, loss of user agency and even harm (Eubanks, 2018; O’Neil, 2016). In education, research has also highlighted forms of surveillance of children, teenagers and young adults aimed at shaping their behaviours, emotions and cognition (Chi, Jeng, Acker, & Bowler, 2018; Lupton & Williamson, 2017; Prinsloo, 2020). The “platformisation” of learning and the monetisation of students’ data are also pressing issues on the educational agenda. These problems were magnified by the intense and unprecedented use of digital technologies during the pandemic (Perrotta, Gulson, Williamson, & Witzenberger, 2020; Williamson, Eynon, & Potter, 2020).