{"title":"Enriching Multimodal Data","authors":"Yiqiu Zhou, Jina Kang","doi":"10.18608/jla.2023.7989","DOIUrl":null,"url":null,"abstract":"Collaboration is a complex, multidimensional process; however, details of how multimodal features intersect and mediate group interactions have not been fully unpacked. Characterizing and analyzing the temporal patterns based on multimodal features is a challenging yet important work to advance our understanding of computer-supported collaborative learning (CSCL). This paper highlights the affordances, as well as the limitations, of different temporal approaches in terms of analyzing multimodal data. To tackle the remaining challenges, we present an empirical example of multimodal temporal analysis that leverages multi-level vector autoregression (mlVAR) to identify temporal patterns of the collaborative problem-solving (CPS) process in an immersive astronomy simulation. We extend previous research on joint attention with a particular focus on the added value from a multimodal, temporal account of the CPS process. We incorporate verbal discussion to contextualize joint attention, examine the sequential and contemporaneous associations between them, and identify significant differences in temporal patterns between low- and high-achieving groups. Our paper does the following: 1) creates interpretable multimodal group interaction patterns, 2) advances understanding of CPS through examination of verbal and non-verbal interactions, and 3) demonstrates the added value of a complete account of temporality including both duration and sequential order.","PeriodicalId":36754,"journal":{"name":"Journal of Learning Analytics","volume":"7 5","pages":"0"},"PeriodicalIF":2.9000,"publicationDate":"2023-11-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Learning Analytics","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.18608/jla.2023.7989","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"EDUCATION & EDUCATIONAL RESEARCH","Score":null,"Total":0}
Abstract
Collaboration is a complex, multidimensional process; however, details of how multimodal features intersect and mediate group interactions have not been fully unpacked. Characterizing and analyzing temporal patterns based on multimodal features is a challenging yet important task for advancing our understanding of computer-supported collaborative learning (CSCL). This paper highlights the affordances, as well as the limitations, of different temporal approaches for analyzing multimodal data. To tackle the remaining challenges, we present an empirical example of multimodal temporal analysis that leverages multi-level vector autoregression (mlVAR) to identify temporal patterns of the collaborative problem-solving (CPS) process in an immersive astronomy simulation. We extend previous research on joint attention with a particular focus on the added value of a multimodal, temporal account of the CPS process. We incorporate verbal discussion to contextualize joint attention, examine the sequential and contemporaneous associations between them, and identify significant differences in temporal patterns between low- and high-achieving groups. Our paper 1) creates interpretable multimodal group-interaction patterns, 2) advances understanding of CPS through examination of verbal and non-verbal interactions, and 3) demonstrates the added value of a complete account of temporality, including both duration and sequential order.
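To make the temporal modelling concrete, the minimal Python sketch below approximates the per-group step of an mlVAR-style analysis: a lag-1 vector autoregression estimates how each feature at one time interval predicts the features at the next (sequential associations), while the residual correlations capture within-interval (contemporaneous) associations. This is an illustrative assumption, not the paper's pipeline: mlVAR itself is typically estimated with the R package of that name and pools groups via random effects, and the feature names, group labels, and placeholder data here are hypothetical.

```python
import numpy as np

def fit_lag1_var(X):
    """Least-squares lag-1 VAR for one group's feature time series.

    X: (T, k) array, e.g. per-interval joint-attention and verbal-discussion measures.
    Returns (A, R): A[i, j] = effect of variable j at t-1 on variable i at t (sequential),
    R = correlation matrix of the residuals (contemporaneous associations).
    """
    Y, Z = X[1:], X[:-1]
    Z1 = np.column_stack([np.ones(len(Z)), Z])      # intercept + lagged features
    B, *_ = np.linalg.lstsq(Z1, Y, rcond=None)      # (k+1, k) coefficient matrix
    resid = Y - Z1 @ B
    return B[1:].T, np.corrcoef(resid, rowvar=False)

# Pool per-group estimates (a crude stand-in for the fixed-effect part of mlVAR)
# before contrasting hypothetical low- vs high-achieving groups.
rng = np.random.default_rng(0)
groups = {f"group_{i}": rng.normal(size=(60, 2)) for i in range(6)}   # placeholder data
estimates = {g: fit_lag1_var(X) for g, X in groups.items()}
mean_lagged = np.mean([A for A, _ in estimates.values()], axis=0)
mean_contemp = np.mean([R for _, R in estimates.values()], axis=0)
print("pooled lag-1 (sequential) effects:\n", mean_lagged)
print("pooled contemporaneous correlations:\n", mean_contemp)
```

In the paper's analysis the group-level deviations are modelled jointly with the population-level effects, which is what allows the comparison of temporal patterns between low- and high-achieving groups; the averaging above merely illustrates the distinction between lagged and contemporaneous structure.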