How can Interaction Data be Contextualized with Mobile Sensing to Enhance Learning Engagement Assessment in Distance Learning?

George-Petru Ciordas-Hertel, Daniel Biedermann, M. Winter, Julia Mordel, H. Drachsler

Companion Publication of the 2022 International Conference on Multimodal Interaction, published 2022-11-07. DOI: 10.1145/3536220.3558037 (https://doi.org/10.1145/3536220.3558037)

Citations: 3
Abstract
Multimodal learning analytics can enrich interaction data with contextual information obtained through mobile sensing. Such information may describe, for example, the physical environment, movement, physiological signals, or smart wearable usage. Through smart wearables, contextual information can thus be captured and made available to students in further processing steps so that they can reflect on and annotate it. This paper describes a software infrastructure and a study design that successfully captured contextual information via mobile sensing on students’ smart wearables in distance learning. In the study, data were collected from the smartphones of 76 students as they participated self-directedly in an online learning unit in a learning management system (LMS) over a two-week period. During the students’ active phases in the LMS, the LMS recorded interaction data as well as state and trait measurements. Simultaneously, hardware sensor data, app usage data, interactions with notifications, and ecological momentary assessments (EMA) were collected automatically and transparently from the students’ smartphones. Finally, this paper describes some preliminary insights from the study process and their implications for further data processing.
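A central step the abstract implies is aligning smartphone-collected context (sensor readings, EMA responses) with the students’ active phases in the LMS. The sketch below illustrates one minimal way such an alignment could look; all type and field names are hypothetical assumptions for illustration, not the paper’s actual infrastructure.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Hypothetical record types -- field names are illustrative, not from the paper.

@dataclass
class SensorSample:
    """One reading from a smartphone hardware sensor."""
    sensor: str          # e.g. "accelerometer", "light"
    value: float
    timestamp: datetime

def within_session(samples, start, end):
    """Keep only samples collected while the student was active in the LMS,
    so that sensor context can be joined with LMS interaction data."""
    return [s for s in samples if start <= s.timestamp <= end]

# Usage: align sensor context with one LMS learning session.
session_start = datetime(2022, 5, 2, 10, 0, tzinfo=timezone.utc)
session_end = datetime(2022, 5, 2, 10, 45, tzinfo=timezone.utc)
samples = [
    SensorSample("light", 120.0,
                 datetime(2022, 5, 2, 10, 5, tzinfo=timezone.utc)),
    SensorSample("light", 80.0,
                 datetime(2022, 5, 2, 11, 30, tzinfo=timezone.utc)),
]
in_session = within_session(samples, session_start, session_end)
print(len(in_session))  # only the first sample falls inside the session
```

The same time-window filter would apply to app usage, notification interactions, and EMA responses, each carrying its own timestamp.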