Systems Thinking Assessment: A Method Through Computer Simulation

Ross Arnold, J. Wade, A. E. Bayrak
{"title":"Systems Thinking Assessment: A Method Through Computer Simulation","authors":"Ross Arnold, J. Wade, A. E. Bayrak","doi":"10.1115/detc2021-68180","DOIUrl":null,"url":null,"abstract":"\n This paper presents a novel assessment method for Systems Thinking and its supporting competencies. Systems Thinking is a key component in engineering education, providing students with the means to explore, understand, and design engineered systems both holistically and in terms of the relationships between their components, which can be both technical and non-technical. The assessment method described in this paper is implemented as a software simulation of a domain-agnostic system to support integration of Systems Thinking into engineering education. The simulation was tested with a group of beta-testers and then fully deployed online as a freely available tool for a 6-month experimental period. The pool of volunteer participants included students and mixed professionals from a diverse set of geographical, educational, and career backgrounds. Results of the assessment show success at both evaluating Systems Thinking Maturity as a whole, and at assessing complex facets of Systems Thinking that have eluded assessment in prior methods. The tool shows promise at evaluating competencies within all four Systems Thinking domains — Mindset, Content, Structure, and Behavior. These domains contain key systemic skills such as the ability to recognize interconnections and feedback loops, see non-linear causal relationships, and understand dynamic behavior. When examined holistically through multiple regression analysis, participants’ scores in the 11 assessed competencies show a moderate to high ability to predict their levels of overall Systems Thinking performance in the simulation. The results also reveal previously unknown dependencies and strengths of relationships between Systems Thinking competencies.","PeriodicalId":23602,"journal":{"name":"Volume 2: 41st Computers and Information in Engineering Conference (CIE)","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2021-08-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Volume 2: 41st Computers and Information in Engineering Conference (CIE)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1115/detc2021-68180","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

This paper presents a novel assessment method for Systems Thinking and its supporting competencies. Systems Thinking is a key component in engineering education, providing students with the means to explore, understand, and design engineered systems both holistically and in terms of the relationships between their components, which may be technical or non-technical. The assessment method described in this paper is implemented as a software simulation of a domain-agnostic system to support integration of Systems Thinking into engineering education. The simulation was tested with a group of beta-testers and then fully deployed online as a freely available tool for a 6-month experimental period. The pool of volunteer participants included students and professionals from a diverse set of geographical, educational, and career backgrounds. Results of the assessment show success both at evaluating Systems Thinking Maturity as a whole and at assessing complex facets of Systems Thinking that have eluded assessment in prior methods. The tool shows promise at evaluating competencies within all four Systems Thinking domains — Mindset, Content, Structure, and Behavior. These domains contain key systemic skills such as the ability to recognize interconnections and feedback loops, see non-linear causal relationships, and understand dynamic behavior. When examined holistically through multiple regression analysis, participants’ scores in the 11 assessed competencies show a moderate to high ability to predict their levels of overall Systems Thinking performance in the simulation. The results also reveal previously unknown dependencies and strengths of relationships between Systems Thinking competencies.
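To illustrate the kind of analysis the abstract describes, the sketch below shows a multiple regression in which scores on 11 competencies jointly predict overall performance. This is a minimal, hypothetical example only: the paper's data, competency names, sample size, and coefficients are not reproduced here, and all values in the code are synthetic stand-ins.

```python
# Hypothetical sketch of the multiple regression described in the abstract:
# 11 competency scores used as predictors of overall Systems Thinking performance.
# All data below are synthetic; nothing here reproduces the paper's results.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)

n_participants = 120   # hypothetical sample size
n_competencies = 11    # matches the 11 assessed competencies

# Synthetic competency scores (one column per competency) and a synthetic
# overall-performance score generated as a weighted sum plus noise.
X = rng.normal(size=(n_participants, n_competencies))
true_weights = rng.uniform(0.1, 0.6, size=n_competencies)
y = X @ true_weights + rng.normal(scale=0.5, size=n_participants)

# Ordinary least squares with an intercept, as in a standard multiple regression.
model = sm.OLS(y, sm.add_constant(X)).fit()

# R-squared summarizes how well the competency scores jointly predict performance;
# a "moderate to high" value would correspond to the relationship the paper reports.
print(model.summary())
print("R-squared:", round(model.rsquared, 3))
```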