{"title":"研究设计可比效应大小对特殊教育中单一案例设计分析的影响。","authors":"Seth A King, Brendon Nylen, Olivia Enders, Lanqi Wang, Oluwatosin Opeoluwa","doi":"10.1037/spq0000628","DOIUrl":null,"url":null,"abstract":"<p><p>Initially excluded from many evaluations of education research, single-case designs have recently received wider acceptance within and beyond special education. The growing approval of single-case design has coincided with an increasing departure from convention, such as the visual analysis of results, and the emphasis on effect sizes comparable with those associated with group designs. The use of design-comparable effect sizes by the What Works Clearinghouse has potential implications for the experimental literature in special education, which is largely composed of single-case designs that may not meet the assumptions required for statistical analysis. This study examined the compatibility of single-case design studies appearing in 33 special education journals with the design-comparable effect sizes and related assumptions described by the What Works Clearinghouse. Of the 1,425 randomly selected single-case design articles published from 1999 to 2021, 59.88% did not satisfy assumptions related to design, number of participants, or treatment replications. The rejection rate varied based on journal emphasis, with publications dedicated to students with developmental disabilities losing the largest proportion of articles. A description of the results follows a discussion of the implications for the interpretation of the evidence base. (PsycInfo Database Record (c) 2024 APA, all rights reserved).</p>","PeriodicalId":74763,"journal":{"name":"School psychology (Washington, D.C.)","volume":" ","pages":"601-612"},"PeriodicalIF":0.0000,"publicationDate":"2024-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Examining the impact of design-comparable effect size on the analysis of single-case design in special education.\",\"authors\":\"Seth A King, Brendon Nylen, Olivia Enders, Lanqi Wang, Oluwatosin Opeoluwa\",\"doi\":\"10.1037/spq0000628\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><p>Initially excluded from many evaluations of education research, single-case designs have recently received wider acceptance within and beyond special education. The growing approval of single-case design has coincided with an increasing departure from convention, such as the visual analysis of results, and the emphasis on effect sizes comparable with those associated with group designs. The use of design-comparable effect sizes by the What Works Clearinghouse has potential implications for the experimental literature in special education, which is largely composed of single-case designs that may not meet the assumptions required for statistical analysis. This study examined the compatibility of single-case design studies appearing in 33 special education journals with the design-comparable effect sizes and related assumptions described by the What Works Clearinghouse. Of the 1,425 randomly selected single-case design articles published from 1999 to 2021, 59.88% did not satisfy assumptions related to design, number of participants, or treatment replications. The rejection rate varied based on journal emphasis, with publications dedicated to students with developmental disabilities losing the largest proportion of articles. 
A description of the results follows a discussion of the implications for the interpretation of the evidence base. (PsycInfo Database Record (c) 2024 APA, all rights reserved).</p>\",\"PeriodicalId\":74763,\"journal\":{\"name\":\"School psychology (Washington, D.C.)\",\"volume\":\" \",\"pages\":\"601-612\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2024-11-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"School psychology (Washington, D.C.)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1037/spq0000628\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"2024/5/16 0:00:00\",\"PubModel\":\"Epub\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"School psychology (Washington, D.C.)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1037/spq0000628","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"2024/5/16 0:00:00","PubModel":"Epub","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0
Abstract
Initially excluded from many evaluations of education research, single-case designs have recently received wider acceptance within and beyond special education. The growing approval of single-case design has coincided with an increasing departure from convention, such as the visual analysis of results, and the emphasis on effect sizes comparable with those associated with group designs. The use of design-comparable effect sizes by the What Works Clearinghouse has potential implications for the experimental literature in special education, which is largely composed of single-case designs that may not meet the assumptions required for statistical analysis. This study examined the compatibility of single-case design studies appearing in 33 special education journals with the design-comparable effect sizes and related assumptions described by the What Works Clearinghouse. Of the 1,425 randomly selected single-case design articles published from 1999 to 2021, 59.88% did not satisfy assumptions related to design, number of participants, or treatment replications. The rejection rate varied based on journal emphasis, with publications dedicated to students with developmental disabilities losing the largest proportion of articles. A description of the results follows a discussion of the implications for the interpretation of the evidence base. (PsycInfo Database Record (c) 2024 APA, all rights reserved).
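To make the screening logic described above concrete, the sketch below shows how single-case studies might be checked against assumptions about design type, number of participants, and treatment replications. The eligible design list, the thresholds, and the names StudyRecord and meets_dces_assumptions are illustrative assumptions for this sketch only; they are not the coding scheme used in the study nor the What Works Clearinghouse's actual standards.

```python
from dataclasses import dataclass

# Illustrative screen for design-comparable effect size (D-CES) eligibility.
# The design types and thresholds below are assumptions for this sketch,
# not the WWC's published criteria.
ELIGIBLE_DESIGNS = {"multiple baseline", "multiple probe", "treatment reversal"}
MIN_CASES = 3            # assumed minimum number of participants (cases)
MIN_REPLICATIONS = 3     # assumed minimum number of treatment replications

@dataclass
class StudyRecord:
    design: str           # e.g., "multiple baseline", "alternating treatments"
    n_cases: int          # number of participants (cases)
    n_replications: int   # demonstrations of the treatment effect

def meets_dces_assumptions(study: StudyRecord) -> bool:
    """Return True if the study satisfies the assumed screening criteria."""
    return (
        study.design.lower() in ELIGIBLE_DESIGNS
        and study.n_cases >= MIN_CASES
        and study.n_replications >= MIN_REPLICATIONS
    )

if __name__ == "__main__":
    sample = [
        StudyRecord("multiple baseline", n_cases=4, n_replications=3),
        StudyRecord("alternating treatments", n_cases=2, n_replications=5),
    ]
    eligible = [meets_dces_assumptions(s) for s in sample]
    rejection_rate = 100 * eligible.count(False) / len(sample)
    print(f"Rejection rate: {rejection_rate:.2f}%")
```

Applying a screen of this kind to each coded article and reporting the share that fails is analogous to the rejection-rate figure (59.88%) reported in the abstract, though the actual criteria used by the authors and the WWC are more detailed.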