How to Open Science: A Principle and Reproducibility Review of the Learning Analytics and Knowledge Conference

Aaron Haim, S. Shaw, N. Heffernan
{"title":"How to Open Science: A Principle and Reproducibility Review of the Learning Analytics and Knowledge Conference","authors":"Aaron Haim, S. Shaw, N. Heffernan","doi":"10.1145/3576050.3576071","DOIUrl":null,"url":null,"abstract":"Within the field of education technology, learning analytics has increased in popularity over the past decade. Researchers conduct experiments and develop software, building on each other’s work to create more intricate systems. In parallel, open science — which describes a set of practices to make research more open, transparent, and reproducible — has exploded in recent years, resulting in more open data, code, and materials for researchers to use. However, without prior knowledge of open science, many researchers do not make their datasets, code, and materials openly available, and those that are available are often difficult, if not impossible, to reproduce. The purpose of the current study was to take a close look at our field by examining previous papers within the proceedings of the International Conference on Learning Analytics and Knowledge, and document the rate of open science adoption (e.g., preregistration, open data), as well as how well available data and code could be reproduced. Specifically, we examined 133 research papers, allowing ourselves 15 minutes for each paper to identify open science practices and attempt to reproduce the results according to their provided specifications. Our results showed that less than half of the research adopted standard open science principles, with approximately 5% fully meeting some of the defined principles. Further, we were unable to reproduce any of the papers successfully in the given time period. We conclude by providing recommendations on how to improve the reproducibility of our research as a field moving forward. All openly accessible work can be found in an Open Science Foundation project1.","PeriodicalId":394433,"journal":{"name":"LAK23: 13th International Learning Analytics and Knowledge Conference","volume":"27 1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-03-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"LAK23: 13th International Learning Analytics and Knowledge Conference","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3576050.3576071","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 2

Abstract

Within the field of education technology, learning analytics has increased in popularity over the past decade. Researchers conduct experiments and develop software, building on each other’s work to create more intricate systems. In parallel, open science — which describes a set of practices to make research more open, transparent, and reproducible — has exploded in recent years, resulting in more open data, code, and materials for researchers to use. However, without prior knowledge of open science, many researchers do not make their datasets, code, and materials openly available, and those that are available are often difficult, if not impossible, to reproduce. The purpose of the current study was to take a close look at our field by examining previous papers within the proceedings of the International Conference on Learning Analytics and Knowledge, and document the rate of open science adoption (e.g., preregistration, open data), as well as how well available data and code could be reproduced. Specifically, we examined 133 research papers, allowing ourselves 15 minutes for each paper to identify open science practices and attempt to reproduce the results according to their provided specifications. Our results showed that less than half of the research adopted standard open science principles, with approximately 5% fully meeting some of the defined principles. Further, we were unable to reproduce any of the papers successfully in the given time period. We conclude by providing recommendations on how to improve the reproducibility of our research as a field moving forward. All openly accessible work can be found in an Open Science Foundation project.
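
To make the kind of tabulation described in the abstract concrete, the sketch below shows one way adoption rates per open science practice could be computed from a coding sheet. It is a minimal, hypothetical illustration: the practice names, the "full/partial/none" coding scheme, and the example records are assumptions, not the authors' actual instrument or data.

```python
# Hypothetical tabulation of open science practice adoption across reviewed papers.
# The column names, coding levels, and sample records are illustrative only.
from collections import Counter

PRACTICES = ["preregistration", "open_data", "open_materials", "open_code"]

# One record per reviewed paper, coded as "full", "partial", or "none" per practice.
papers = [
    {"preregistration": "none", "open_data": "partial", "open_materials": "none", "open_code": "full"},
    {"preregistration": "none", "open_data": "none", "open_materials": "partial", "open_code": "none"},
    # ... remaining reviewed papers
]

def adoption_rates(records, practices):
    """Return, per practice, the share of papers with any adoption and with full adoption."""
    rates = {}
    total = len(records)
    for practice in practices:
        counts = Counter(r[practice] for r in records)
        rates[practice] = {
            "any": (counts["full"] + counts["partial"]) / total,
            "full": counts["full"] / total,
        }
    return rates

for practice, rate in adoption_rates(papers, PRACTICES).items():
    print(f"{practice}: any={rate['any']:.1%}, full={rate['full']:.1%}")
```

Rates summarized this way map directly onto the paper's headline findings (the share of papers adopting any practice versus the roughly 5% fully meeting some defined principle), though the actual coding procedure is documented in the paper and its OSF project.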