J. Learn. Anal. | Pub Date: 2021-09-03 | DOI: 10.18608/jla.2021.7375
Andrew E. Krumm, Jared Boyce, H. Everson
A Collaborative Approach to Sharing Learner Event Data
Abstract: This paper describes a collaboration organized around exchanging data between two technological systems to support teachers’ instructional decision-making. The goals of the collaboration among researchers, technology developers, and practitioners were not only to support teachers’ instructional decision-making but also to document the challenges and opportunities associated with bringing together data from instruction- and assessment-focused technologies. The approach described in this paper illustrates the potential importance of anchoring data products that combine data from the two systems in the needs of teachers, as well as of aligning the content that students learn and are assessed on across systems. The increasing presence of data standards has made sharing complex data increasingly feasible. The example collaboration described in this paper demonstrates the role that non-technical activities can play in supporting the exchange and use of learner event data.
J. Learn. Anal. | Pub Date: 2021-08-31 | DOI: 10.18608/jla.2022.7250
Elaine Farrow, Johanna D. Moore, D. Gašević
Markers of Cognitive Quality in Student Contributions to Online Course Discussion Forums
Abstract: By participating in asynchronous course discussion forums, students can work together to refine their ideas and construct knowledge collaboratively. Typically, some messages simply repeat or paraphrase course content, while others bring in new material, demonstrate reasoning, integrate concepts, and develop solutions. Through the messages they send, students thus display different levels of intellectual engagement with the topic and the course. We refer to this as cognitive quality. The work presented here used two widely studied frameworks for assessing critical discourse and cognitive engagement: the ICAP and Community of Inquiry frameworks. The constructs of the frameworks were used as proxy measures for cognitive quality. Predictive classifiers were trained for both frameworks on the same data in order to discover which attributes of the dialogue were most informative and how those attributes were correlated with framework constructs. We found that longer and more complex messages were associated with indicators of greater quality in both frameworks, and that the threaded reply structure mattered more than chronological order. By including the framework labels as additional model features, we also assessed the links between frameworks. The empirical results provide evidence that the two frameworks measure different aspects of student behaviour relating to cognitive quality.
J. Learn. Anal. | Pub Date: 2021-07-07 | DOI: 10.18608/JLA.2021.7113
Fabio Campos, June Ahn, Daniela K. Digiacomo, Ha Nguyen, Maria Hays
Making Sense of Sensemaking: Understanding How K-12 Teachers and Coaches React to Visual Analytics
Abstract: With the spread of learning analytics (LA) dashboards in K-12 schools, educators are increasingly expected to make sense of data to inform instruction. However, numerous features of school settings, such as specialized vantage points of educators, may lead to different ways of looking at data. This observation motivates the need to carefully observe and account for the ways data sensemaking occurs, and how it may differ across K-12 professional roles. Our mixed-methods study reports on interviews and think-aloud sessions with middle-school mathematics teachers and instructional coaches from four districts in the United States. By exposing educators to an LA dashboard, we map their varied reactions to visual data and reveal prevalent sensemaking patterns. We find that emotional, analytical, and intentional responses inform educators’ sensemaking and that different roles at the school afford unique vantage points toward data. Based on these findings, we offer a typology for representing sensemaking in a K-12 school context and reflect on how to expand visual LA process models.
J. Learn. Anal. | Pub Date: 2021-07-05 | DOI: 10.18608/jla.2021.6590
Florian Sense, M. V. D. Velde, H. Rijn
Predicting University Students' Exam Performance Using a Model-Based Adaptive Fact-Learning System
Abstract: Modern educational technology has the potential to support students in using their study time more effectively. Learning analytics can indicate relevant individual differences between learners, which adaptive learning systems can use to tailor the learning experience to individual learners. For fact learning, cognitive models of human memory are well suited to tracing learners’ acquisition and forgetting of knowledge over time. Such models have shown great promise in controlled laboratory studies. To work in realistic educational settings, however, they need to be easy to deploy, and their adaptive components should be based on individual differences relevant to the educational context and outcomes. Here, we focus on predicting university students’ exam performance using a model-based adaptive fact-learning system. The data presented here indicate that the system provides tangible benefits to students in naturalistic settings. The model’s estimate of a learner’s rate of forgetting predicts overall grades and performance on individual exam questions. This encouraging case study highlights the value of model-based adaptive fact-learning systems in classrooms.
J. Learn. Anal. | Pub Date: 2021-04-12 | DOI: 10.18608/JLA.2021.7153
Rachel Dickler, J. Gobert, M. S. Pedro
Using Innovative Methods to Explore the Potential of an Alerting Dashboard for Science Inquiry
Abstract: Educational technologies, such as teacher dashboards, are being developed to support teachers’ instruction and students’ learning. Specifically, dashboards support teachers in providing the just-in-time instruction needed by students in complex contexts such as science inquiry. In this study, we used the Inq-Blotter teacher-alerting dashboard to investigate whether teacher support elicited by the technology influenced students’ inquiry performance in a science intelligent tutoring system, Inq-ITS. Results indicated that students’ inquiry improved after receiving teachers’ help, elicited by the Inq-Blotter alerts. This inquiry improvement was significantly greater than for matched students who did not receive help from the teacher in response to alerts. Epistemic network analyses were then used to investigate the patterns in the discursive supports provided to students by teachers. These analyses revealed significant differences in the types of support that fostered (versus did not foster) student improvement; differences across teachers were also found. Overall, this study used innovative tools and analyses to understand how teachers use this technological genre of alerting dashboards to dynamically support students in science inquiry.
J. Learn. Anal. | Pub Date: 2021-04-09 | DOI: 10.18608/JLA.2021.7227
Roberto Martínez-Maldonado, D. Gašević, Vanessa Echeverría, Gloria Fernández-Nieto, Z. Swiecki, S. B. Shum
What Do You Mean by Collaboration Analytics? A Conceptual Model
Abstract: Using data to generate a deeper understanding of collaborative learning is not new, but automatically analyzing log data has enabled new means of identifying key indicators of effective collaboration and teamwork that can be used to predict outcomes and personalize feedback. Collaboration analytics is emerging as a new term to refer to computational methods for identifying salient aspects of collaboration from multiple group data sources for learners, educators, or other stakeholders to gain and act upon insights. Yet, it remains unclear how collaboration analytics go beyond previous work focused on modelling group interactions for the purpose of adapting instruction. This paper provides a conceptual model of collaboration analytics to help researchers and designers identify the opportunities enabled by such innovations to advance knowledge in, and provide enhanced support for, collaborative learning and teamwork. We argue that mapping from low-level data to higher-order constructs that are educationally meaningful, and that can be understood by educators and learners, is essential to assessing the validity of collaboration analytics. Through four cases, the paper illustrates the critical role of theory, task design, and human factors in the design of interfaces that inform actionable insights for improving collaboration and group learning.
J. Learn. Anal. | Pub Date: 2021-04-09 | DOI: 10.18608/JLA.2021.7242
M. Worsley, Khalil J. Anderson, N. Melo, JooYoung Jang
Designing Analytics for Collaboration Literacy and Student Empowerment
Abstract: Collaboration has garnered global attention as an important skill for the 21st century. While researchers have been doing work on collaboration for nearly a century, many of the questions that the field is investigating overlook the need for students to learn how to read and respond to different collaborative settings. Existing research focuses on chronicling the various factors that predict the effectiveness of a collaborative experience, or on changing user behaviour in the moment. These are worthwhile research endeavours for developing our theoretical understanding of collaboration. However, there is also a need to centre student perceptions and experiences with collaboration as an important area of inquiry. Based on a survey of 131 university students, we find that student collaboration-related concerns can be represented across seven different categories or dimensions: Climate, Compatibility, Communication, Conflict, Context, Contribution, and Constructive. These categories extend prior research on collaboration and can help the field ensure that future collaboration analytics tools are designed to support the ways that students think about and utilize collaboration. Finally, we describe our instantiation of many of these dimensions in our collaborative analytics tool, BLINC, and suggest that these seven dimensions can be instructive for re-orienting the Multimodal Learning Analytics (MMLA) and collaboration analytics communities.
J. Learn. Anal. | Pub Date: 2021-04-09 | DOI: 10.18608/JLA.2021.7240
Z. Swiecki
Measuring the Impact of Interdependence on Individuals During Collaborative Problem-Solving
Abstract: Collaboration analytics often focuses on assessing and monitoring individuals during collaborative problem-solving (CPS). A defining feature of CPS is the interdependence that exists between individuals when they work together — that is, how they respond to and influence one another over time. While models that account for the impact of interdependence at the individual level of analysis (interdependent models) exist, they are often highly complex. This complexity makes them potentially difficult to use in assessments and systems that need to be explainable for educators, learners, and other researchers. Measures of the impact of interdependence at the individual level of analysis could inform decisions as to whether interdependent models should be used, or whether simpler models will suffice. Such measures could also be used to investigate specific questions about interdependence in collaborative settings. In this paper, I present a novel method of measuring the impact of interdependence on individuals using epistemic network analysis. To provide evidence of the validity of the measure, I compare it to qualitative findings that describe the impact of interdependence on individuals participating in team training scenarios. To demonstrate the value of the measure, I use it to assess the impact of interdependence in these data overall and to test hypotheses regarding the collaborative task design. My results suggest that the measure can distinguish between individuals who have been impacted by interdependence differently, that interdependence is impactful in these data overall, and that aspects of the task design may have affected how some individuals were impacted by interdependence.
J. Learn. Anal. | Pub Date: 2021-04-09 | DOI: 10.18608/JLA.2021.7230
Mona Emara, Nicole M. Hutchins, Shuchi Grover, Caitlin Snyder, G. Biswas
Examining Student Regulation of Collaborative, Computational, Problem-Solving Processes in Open-Ended Learning Environments
Abstract: The integration of computational modelling in science classrooms provides a unique opportunity to promote key 21st century skills including computational thinking (CT) and collaboration. The open-ended, problem-solving nature of the task requires groups to grapple with the combination of two domains (science and computing) as they collaboratively construct computational models. While this approach has produced significant learning gains for students in both science and CT in K–12 settings, the collaborative learning processes students use, including learner regulation, are not well understood. In this paper, we present a systematic analysis framework that combines natural language processing (NLP) of collaborative dialogue, log file analyses of students’ model-building actions, and final model scores. This analysis is used to better understand students’ regulation of collaborative problem solving (CPS) processes over a series of computational modelling tasks of varying complexity. The results suggest that the computational modelling challenges afford opportunities for students to a) move from resource-intensive processes, such as trial and error, to more systematic processes, such as debugging model errors by leveraging data tools, and b) learn from each other using socially shared regulation (SSR) and productive collaboration. The use of such SSR processes correlated positively with their model-building scores. Our paper aims to advance our understanding of collaborative, computational modelling in K–12 science to better inform classroom applications.
J. Learn. Anal. | Pub Date: 2021-04-09 | DOI: 10.18608/JLA.2021.7241
Peter F. Halpin
Measuring Dynamical Interdependence in Small-Group Collaborations
Abstract: This paper addresses dynamical interdependence among the actions of group members. I assume that the actions of each member can be represented as nodes of a dynamical network and then collect the nodes into disjoint subsets (components) representing the individual group members. Interdependence among group members’ actions can then be defined with reference to a K-partite network, in which the partitions correspond to the group member components. Independence among group members’ actions can be defined with reference to a network in which the group member components are disconnected from one another. The degree to which the interactions of actual groups correspond to either of these theoretical network structures can be characterized using modified versions of existing network statistics. Taking this approach, I propose a number of network-based measures of dynamical interdependence, discuss the interpretation of the proposed measures, and consider how to assess their reliability and validity. These ideas are illustrated using an example in which dyads collaborated via online chat to complete a grade 12 level mathematics assessment.