{"title":"便携式干预措施对学校心理学家图表评分不一致的影响。","authors":"Alexander D Latham, David A. Klingbeil","doi":"10.1037/spq0000629","DOIUrl":null,"url":null,"abstract":"The visual analysis of data presented in time-series graphs are common in single-case design (SCD) research and applied practice in school psychology. A growing body of research suggests that visual analysts' ratings are often influenced by construct-irrelevant features including Y-axis truncation and compression of the number of data points per X- to Y-axis ratio. We developed and tested two brief interventions, based on the research in cognitive and visual science, to reduce visual analysts' inconsistency when viewing unstandardized graphs. Two hundred practicing school psychologists visually analyzed data presented on standardized graphs and the same data again on unstandardized graphs. Across all conditions, participants were more willing to identify meaningful effects on unstandardized graphs and rated the data as showing significantly larger effects than on the corresponding standardized graphs. However, participants who answered additional (task-relevant) questions about the level or trend of graphed data showed greater rating consistency across the types of graphs in comparison to participants who answered task-irrelevant but challenging questions or control participants. Our results replicated prior research demonstrating the impact of SCD graph construction on practicing school psychologists' interpretations and provide initial support for an intervention to minimize the impact of construct-irrelevant factors. Limitations and future directions for research are discussed. (PsycInfo Database Record (c) 2024 APA, all rights reserved).","PeriodicalId":3,"journal":{"name":"ACS Applied Electronic Materials","volume":null,"pages":null},"PeriodicalIF":4.3000,"publicationDate":"2024-04-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Effects of portable interventions on school psychologists' graph-rating inconsistency.\",\"authors\":\"Alexander D Latham, David A. Klingbeil\",\"doi\":\"10.1037/spq0000629\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"The visual analysis of data presented in time-series graphs are common in single-case design (SCD) research and applied practice in school psychology. A growing body of research suggests that visual analysts' ratings are often influenced by construct-irrelevant features including Y-axis truncation and compression of the number of data points per X- to Y-axis ratio. We developed and tested two brief interventions, based on the research in cognitive and visual science, to reduce visual analysts' inconsistency when viewing unstandardized graphs. Two hundred practicing school psychologists visually analyzed data presented on standardized graphs and the same data again on unstandardized graphs. Across all conditions, participants were more willing to identify meaningful effects on unstandardized graphs and rated the data as showing significantly larger effects than on the corresponding standardized graphs. However, participants who answered additional (task-relevant) questions about the level or trend of graphed data showed greater rating consistency across the types of graphs in comparison to participants who answered task-irrelevant but challenging questions or control participants. 
Our results replicated prior research demonstrating the impact of SCD graph construction on practicing school psychologists' interpretations and provide initial support for an intervention to minimize the impact of construct-irrelevant factors. Limitations and future directions for research are discussed. (PsycInfo Database Record (c) 2024 APA, all rights reserved).\",\"PeriodicalId\":3,\"journal\":{\"name\":\"ACS Applied Electronic Materials\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":4.3000,\"publicationDate\":\"2024-04-04\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"ACS Applied Electronic Materials\",\"FirstCategoryId\":\"102\",\"ListUrlMain\":\"https://doi.org/10.1037/spq0000629\",\"RegionNum\":3,\"RegionCategory\":\"材料科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"ENGINEERING, ELECTRICAL & ELECTRONIC\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"ACS Applied Electronic Materials","FirstCategoryId":"102","ListUrlMain":"https://doi.org/10.1037/spq0000629","RegionNum":3,"RegionCategory":"材料科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ENGINEERING, ELECTRICAL & ELECTRONIC","Score":null,"Total":0}
Abstract
Visual analysis of data presented in time-series graphs is common in single-case design (SCD) research and in applied school psychology practice. A growing body of research suggests that visual analysts' ratings are often influenced by construct-irrelevant graph features, including Y-axis truncation and compression of the data points per X- to Y-axis ratio. Drawing on research in cognitive and vision science, we developed and tested two brief interventions to reduce visual analysts' inconsistency when viewing unstandardized graphs. Two hundred practicing school psychologists visually analyzed data presented on standardized graphs and the same data presented again on unstandardized graphs. Across all conditions, participants were more willing to identify meaningful effects on unstandardized graphs and rated those data as showing significantly larger effects than the same data on the corresponding standardized graphs. However, participants who answered additional task-relevant questions about the level or trend of the graphed data showed greater rating consistency across graph types than participants who answered task-irrelevant but challenging questions and than control participants. Our results replicate prior research demonstrating the impact of SCD graph construction on practicing school psychologists' interpretations and provide initial support for an intervention that minimizes the influence of construct-irrelevant factors. Limitations and future directions for research are discussed. (PsycInfo Database Record (c) 2024 APA, all rights reserved).
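To make the construct-irrelevant graph features named in the abstract concrete, the sketch below plots the same hypothetical single-case data on a "standardized" graph (Y-axis spanning the full possible score range) and on an "unstandardized" graph with a truncated Y-axis, and computes a data-points-per-X-to-Y-axis ratio for a given panel size. This is an illustrative sketch only, not the study's materials: the data values, panel dimensions, and the dppxyr() helper are assumptions, and the ratio formula follows one common operationalization (data points divided by the ratio of X-axis length to Y-axis height).

# Illustrative sketch (hypothetical data and panel sizes, not study materials).
import matplotlib.pyplot as plt
import numpy as np

sessions = np.arange(1, 11)                      # 10 observation sessions
baseline = np.array([42, 45, 44, 43, 46])        # hypothetical baseline phase
intervention = np.array([48, 50, 49, 52, 51])    # hypothetical intervention phase
scores = np.concatenate([baseline, intervention])

def dppxyr(n_points: int, x_len_in: float, y_len_in: float) -> float:
    """Data points per X- to Y-axis ratio (hypothetical helper, one common operationalization)."""
    return n_points / (x_len_in / y_len_in)

fig, (ax_std, ax_trunc) = plt.subplots(1, 2, figsize=(9, 3))

# "Standardized" construction: Y-axis spans the full possible score range (0-100).
ax_std.plot(sessions, scores, marker="o", color="black")
ax_std.axvline(5.5, linestyle="--", color="gray")     # phase-change line
ax_std.set_ylim(0, 100)
ax_std.set_title("Standardized (full Y-axis)")
ax_std.set_xlabel("Session")
ax_std.set_ylabel("Score")

# "Unstandardized" construction: truncated Y-axis visually inflates the apparent effect.
ax_trunc.plot(sessions, scores, marker="o", color="black")
ax_trunc.axvline(5.5, linestyle="--", color="gray")
ax_trunc.set_ylim(40, 55)
ax_trunc.set_title("Unstandardized (truncated Y-axis)")
ax_trunc.set_xlabel("Session")

print(f"DPPXYR for a 4.5 x 3 inch panel: {dppxyr(len(scores), 4.5, 3.0):.1f}")
plt.tight_layout()
plt.show()

Running the sketch shows identical data that look flat on the standardized panel but appear to show a large level change on the truncated panel, which is the kind of construct-irrelevant influence the interventions in this study were designed to counteract.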