Title: How to Complement Learning Analytics with Smartwatches?: Fusing Physical Activities, Environmental Context, and Learning Activities
Author: George-Petru Ciordas-Hertel
Published in: Proceedings of the 2020 International Conference on Multimodal Interaction
Publication date: 2020-10-21
DOI: 10.1145/3382507.3421151
Citations: 2
Abstract
To obtain a holistic perspective on learning, a multimodal technical infrastructure for Learning Analytics (LA) can be beneficial. Recent studies have investigated various aspects of technical LA infrastructure. However, it has not yet been explored how LA indicators can be complemented with smartwatch sensor data to detect physical activity and environmental context. Sensor data, such as accelerometer readings, are often used in related work to infer specific behaviors and environmental context and thus trigger just-in-time interventions. In this dissertation project, we plan to use smartwatch sensor data to explore further indicators for learning from blended learning sessions conducted in the wild, e.g., at home. Such indicators could be used within learning sessions to suggest breaks, or afterward to support learners in reflection processes. We plan to investigate the following three research questions: (RQ1) How can a multimodal learning analytics infrastructure be designed to effectively support real-time data acquisition and processing? (RQ2) How can smartwatch sensor data be used to infer environmental context and physical activities to complement learning analytics indicators for blended learning sessions? (RQ3) How can the extracted multimodal indicators be aligned with pedagogical interventions? RQ1 was investigated through a structured literature review and eleven semi-structured interviews with LA infrastructure developers. Regarding RQ2, we are currently designing and implementing a multimodal learning analytics infrastructure to collect and process sensor and experience data from smartwatches. Finally, regarding RQ3, an exploratory field study will be conducted to extract multimodal learning indicators and examine them with learners and pedagogical experts in order to develop effective interventions.
Researchers, educators, and learners can use and adapt our contributions to gain new insights into learners' time management, learning tactics, and physical learning spaces from learning sessions taking place in the wild.
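The abstract notes that accelerometer data are commonly used to infer physical activity, e.g., to suggest breaks during a learning session. As a minimal, hypothetical sketch of such an indicator (the function name, sample format, and threshold below are our own illustrations, not taken from the paper), one could classify a window of 3-axis accelerometer samples by the variability of the acceleration magnitude: a resting wrist stays near 1 g (gravity only), while movement makes the magnitude fluctuate.

```python
import math

def activity_from_accelerometer(samples, threshold=0.5):
    """Classify a window of 3-axis accelerometer samples (in units of g)
    as 'active' or 'still' using the standard deviation of the
    acceleration magnitude. The threshold is illustrative only; a real
    system would calibrate it per device and sampling rate."""
    mags = [math.sqrt(x * x + y * y + z * z) for x, y, z in samples]
    mean = sum(mags) / len(mags)
    std = math.sqrt(sum((m - mean) ** 2 for m in mags) / len(mags))
    return "active" if std > threshold else "still"

# A resting wrist: magnitude stays near 1 g, so the window reads 'still'.
resting = [(0.0, 0.0, 1.0)] * 50
# A moving wrist: the magnitude fluctuates strongly, so it reads 'active'.
moving = [(0.0, 0.0, 1.0), (2.0, 1.0, 1.0)] * 25
```

In practice, published activity-recognition pipelines go further (windowed features plus a trained classifier), but even this thresholding idea conveys how raw smartwatch sensor streams can be turned into the kind of learning-session indicators the project proposes.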