{"title":"Exploring the multi-dimensional human mind: Model-based and text-based approaches","authors":"","doi":"10.1016/j.asw.2024.100878","DOIUrl":null,"url":null,"abstract":"<div><p>In this study, we conceptualize two approaches, model-based and text-based, grounded on mental models and discourse comprehension theories, to computerized summary analysis. We juxtapose the model-based approach with the text-based approach to explore shared knowledge dimensions and associated measures from both approaches and use them to examine changes in students' summaries over time. We used 108 cases in which we computed model-based and text-based measures for two versions of students' summaries (i.e., initial and final revisions), resulting in a total of 216 observations. We used correlations, Principal Components Analysis (PCA), and Linear Mixed-Effects models. This exploratory investigation suggested a shortlist of text-based measures, and the findings of the PCA demonstrated that both model-based and text-based measures explained the three-dimensional model (i.e., surface, structure, and semantic). Overall, model-based measures were better for tracking changes in the surface dimension, while text-based measures were descriptive of the structure dimension. Both approaches worked well for the semantic dimension. The tested text-based measures can serve as a cross-reference to evaluate students' summaries along with the model-based measures. The current study shows the potential of using multidimensional measures to provide formative feedback on students' knowledge structure and writing styles along the three dimensions.</p></div>","PeriodicalId":46865,"journal":{"name":"Assessing Writing","volume":null,"pages":null},"PeriodicalIF":4.2000,"publicationDate":"2024-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Assessing Writing","FirstCategoryId":"98","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S1075293524000710","RegionNum":1,"RegionCategory":"文学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"EDUCATION & EDUCATIONAL RESEARCH","Score":null,"Total":0}
引用次数: 0
Abstract
In this study, we conceptualize two approaches to computerized summary analysis, model-based and text-based, grounded in mental models and discourse comprehension theories. We juxtapose the model-based approach with the text-based approach to explore the knowledge dimensions they share and the measures associated with each, and we use these measures to examine changes in students' summaries over time. We analyzed 108 cases, computing model-based and text-based measures for two versions of each student's summary (i.e., initial and final revisions), for a total of 216 observations, using correlations, Principal Components Analysis (PCA), and linear mixed-effects models. This exploratory investigation yielded a shortlist of text-based measures, and the PCA findings demonstrated that both model-based and text-based measures mapped onto the three-dimensional model (i.e., surface, structure, and semantic). Overall, model-based measures were better at tracking changes in the surface dimension, while text-based measures better described the structure dimension; both approaches worked well for the semantic dimension. The tested text-based measures can serve as a cross-reference alongside the model-based measures when evaluating students' summaries. The current study shows the potential of using multidimensional measures to provide formative feedback on students' knowledge structure and writing styles along the three dimensions.
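To make the reported analysis pipeline concrete, below is a minimal sketch (not the authors' code) of how PCA and a linear mixed-effects model could be run on repeated summary measures such as those described in the abstract. The column names, the synthetic data, and the use of scikit-learn and statsmodels are all illustrative assumptions; the study's actual measures and software are not specified here.

```python
# Minimal sketch, assuming long-format data with 108 students x 2 revisions
# = 216 rows and a few hypothetical measure columns. PCA explores how the
# measures load onto components; the mixed-effects model tests change from
# initial to final revision with a random intercept per student.
import numpy as np
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_students = 108

# Hypothetical data standing in for the study's measures.
df = pd.DataFrame({
    "student": np.repeat(np.arange(n_students), 2),
    "revision": np.tile(["initial", "final"], n_students),
    "surface_measure": rng.normal(0, 1, 2 * n_students),
    "structure_measure": rng.normal(0, 1, 2 * n_students),
    "semantic_measure": rng.normal(0, 1, 2 * n_students),
})

# PCA on standardized measures (the study relates components to the
# surface / structure / semantic dimensions).
measures = ["surface_measure", "structure_measure", "semantic_measure"]
X = StandardScaler().fit_transform(df[measures])
pca = PCA(n_components=3).fit(X)
print("Explained variance ratios:", pca.explained_variance_ratio_)

# Linear mixed-effects model: fixed effect of revision, random intercept
# per student to account for the two observations per case.
model = smf.mixedlm("semantic_measure ~ revision", df, groups=df["student"])
print(model.fit().summary())
```

A random intercept per student is the natural grouping choice here because each case contributes two observations (initial and final), so the observations are not independent.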
Journal information:
Assessing Writing is a refereed international journal providing a forum for ideas, research, and practice on the assessment of written language. Assessing Writing publishes articles, book reviews, conference reports, and academic exchanges concerning writing assessments of all kinds, including traditional (direct and standardised forms of) testing of writing, alternative performance assessments (such as portfolios), workplace sampling, and classroom assessment. The journal focuses on all stages of the writing assessment process, including needs evaluation, assessment creation, implementation, validation, and test development.