{"title":"What is next in mobile-assisted reading? Insights from a decade of eye tracking research into cognitive processes","authors":"","doi":"10.1016/j.edurev.2024.100643","DOIUrl":null,"url":null,"abstract":"<div><div>Mobile-assisted reading research has seen a growing trend in the use of eye tracking to explore readers’ performance traditionally examined by offline accuracy measures. Through its ability to provide detailed records of online processing behaviours at a high temporal resolution, eye tracking offers new insights into real-time cognitive processes associated with mobile digital reading, which offline measures are unable to do. Despite its unique advantages, previous systematic reviews have mainly focused on offline performance to compare mobile-assisted versus traditional reading, with a lack of attention to online performance using eye tracking in the literature. As interest in and the availability of eye tracking continues to expand, a systematic review is timely to identify the issues that have already been addressed, if research gaps remain, and any warranted future work in this regard. In doing so, the current review aims to provide a comprehensive and systematic overview of mobile-assisted reading research using eye tracking from 2010 to 2022, including article information, research focus, technology, and method. Additionally, this review critically discusses the limitations of previous research and proposes the avenues for future endeavours.</div></div>","PeriodicalId":48125,"journal":{"name":"Educational Research Review","volume":null,"pages":null},"PeriodicalIF":9.6000,"publicationDate":"2024-10-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Educational Research Review","FirstCategoryId":"95","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S1747938X24000526","RegionNum":1,"RegionCategory":"教育学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"EDUCATION & EDUCATIONAL RESEARCH","Score":null,"Total":0}
Citations: 0
Abstract
Mobile-assisted reading research has seen a growing trend in the use of eye tracking to explore aspects of readers’ performance traditionally examined with offline accuracy measures. By providing detailed records of online processing behaviours at high temporal resolution, eye tracking offers insights into the real-time cognitive processes associated with mobile digital reading that offline measures cannot capture. Despite these unique advantages, previous systematic reviews have mainly compared mobile-assisted and traditional reading in terms of offline performance, paying little attention to online performance measured with eye tracking. As interest in and the availability of eye tracking continue to expand, a systematic review is timely to identify which issues have already been addressed, whether research gaps remain, and what future work is warranted. The current review therefore provides a comprehensive and systematic overview of mobile-assisted reading research using eye tracking from 2010 to 2022, covering article information, research focus, technology, and method. Additionally, it critically discusses the limitations of previous research and proposes avenues for future endeavours.
About the journal:
Educational Research Review is an international journal for researchers and other agencies interested in reviews of studies and theoretical papers on education at any level. The journal welcomes high-quality articles that address educational research problems through a review approach, including thematic or methodological reviews and meta-analyses. Its scope is inclusive: it is not limited to any specific age range and invites articles from the various settings in which learning and education take place, such as schools, corporate training, and other formal and informal educational environments.