Exploring Emergent Features of Student Interaction within an Embodied Science Learning Simulation

Impact Factor: 2.4 · JCR Quartile: Q3 (Computer Science, Artificial Intelligence)
Jina Kang, Robb Lindgren, James Planey
{"title":"Exploring Emergent Features of Student Interaction within an Embodied Science Learning Simulation","authors":"Jina Kang, Robb Lindgren, James Planey","doi":"10.3390/MTI2030039","DOIUrl":null,"url":null,"abstract":"Theories of embodied cognition argue that human processes of thinking and reasoning are deeply connected with the actions and perceptions of the body. Recent research suggests that these theories can be successfully applied to the design of learning environments, and new technologies enable multimodal platforms that respond to students’ natural physical activity such as their gestures. This study examines how students engaged with an embodied mixed-reality science learning simulation using advanced gesture recognition techniques to support full-body interaction. The simulation environment acts as a communication platform for students to articulate their understanding of non-linear growth within different science contexts. In particular, this study investigates the different multimodal interaction metrics that were generated as students attempted to make sense of cross-cutting science concepts through using a personalized gesture scheme. Starting with video recordings of students’ full-body gestures, we examined the relationship between these embodied expressions and their subsequent success reasoning about non-linear growth. We report the patterns that we identified, and explicate our findings by detailing a few insightful cases of student interactions. Implications for the design of multimodal interaction technologies and the metrics that were used to investigate different types of students’ interactions while learning are discussed.","PeriodicalId":52297,"journal":{"name":"Multimodal Technologies and Interaction","volume":null,"pages":null},"PeriodicalIF":2.4000,"publicationDate":"2018-07-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.3390/MTI2030039","citationCount":"10","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Multimodal Technologies and Interaction","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.3390/MTI2030039","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
引用次数: 10

Abstract

Theories of embodied cognition argue that human processes of thinking and reasoning are deeply connected with the actions and perceptions of the body. Recent research suggests that these theories can be successfully applied to the design of learning environments, and new technologies enable multimodal platforms that respond to students’ natural physical activity, such as their gestures. This study examines how students engaged with an embodied mixed-reality science learning simulation that uses advanced gesture recognition techniques to support full-body interaction. The simulation environment acts as a communication platform for students to articulate their understanding of non-linear growth within different science contexts. In particular, this study investigates the different multimodal interaction metrics that were generated as students attempted to make sense of cross-cutting science concepts through a personalized gesture scheme. Starting with video recordings of students’ full-body gestures, we examined the relationship between these embodied expressions and students’ subsequent success in reasoning about non-linear growth. We report the patterns that we identified and explicate our findings by detailing a few insightful cases of student interaction. Finally, we discuss implications for the design of multimodal interaction technologies and for the metrics used to investigate different types of student interaction during learning.
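To make the abstract's notion of "multimodal interaction metrics" more concrete, the sketch below shows one plausible way such metrics could be aggregated from timestamped gesture events coded from video. The event schema, gesture labels, and metric names (gesture count, total gesture time, mean pause between gestures) are assumptions for illustration only; the paper does not publish its data format or exact metric definitions.

```python
# Minimal sketch: aggregating per-student interaction metrics from a log of
# gesture events coded from video. The schema and metric names below are
# hypothetical, not the study's actual definitions.

from dataclasses import dataclass
from statistics import mean
from typing import Dict, List


@dataclass
class GestureEvent:
    """One recognized full-body gesture (assumed schema)."""
    student_id: str
    label: str        # e.g., "linear-sweep" or "exponential-curve" (assumed labels)
    start_s: float    # onset time in seconds
    end_s: float      # offset time in seconds


def interaction_metrics(events: List[GestureEvent]) -> Dict[str, Dict[str, float]]:
    """Per-student gesture count, total gesture time, and mean pause between gestures."""
    by_student: Dict[str, List[GestureEvent]] = {}
    for ev in events:
        by_student.setdefault(ev.student_id, []).append(ev)

    metrics: Dict[str, Dict[str, float]] = {}
    for sid, evs in by_student.items():
        evs.sort(key=lambda e: e.start_s)
        durations = [e.end_s - e.start_s for e in evs]
        pauses = [b.start_s - a.end_s for a, b in zip(evs, evs[1:])]
        metrics[sid] = {
            "gesture_count": float(len(evs)),
            "total_gesture_time_s": sum(durations),
            "mean_pause_s": mean(pauses) if pauses else 0.0,
        }
    return metrics


if __name__ == "__main__":
    sample = [
        GestureEvent("s01", "linear-sweep", 12.0, 14.5),
        GestureEvent("s01", "exponential-curve", 20.0, 23.0),
        GestureEvent("s02", "exponential-curve", 5.0, 9.0),
    ]
    print(interaction_metrics(sample))
```

Metrics of this kind could then be related to reasoning outcomes (e.g., comparing students who did or did not successfully distinguish linear from non-linear growth), which is the general analytic move the abstract describes.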
Source journal: Multimodal Technologies and Interaction (Computer Science: Computer Science Applications)
CiteScore: 4.90
Self-citation rate: 8.00%
Articles per year: 94
Review time: 4 weeks