Comparing instructor-led, video-model, and no-instruction control tutorials for creating single-subject graphs in Microsoft Excel: A systematic replication and extension
Kimberley L. M. Zonneveld, Alison D. Cox, Madeline M. Asaro, Kieva S. Hranchuk, Arezu Alami, Laura D. Kelly, Jan C. Frijters
{"title":"Comparing instructor-led, video-model, and no-instruction control tutorials for creating single-subject graphs in Microsoft Excel: A systematic replication and extension","authors":"Kimberley L. M. Zonneveld, Alison D. Cox, Madeline M. Asaro, Kieva S. Hranchuk, Arezu Alami, Laura D. Kelly, Jan C. Frijters","doi":"10.1002/jaba.1053","DOIUrl":null,"url":null,"abstract":"<p>Visual inspection of single-subject data is the primary method for behavior analysts to interpret the effect of an independent variable on a dependent variable; however, there is no consensus on the most suitable method for teaching graph construction for single-subject designs. We systematically replicated and extended Tyner and Fienup (2015) using a repeated-measures between-subjects design to compare the effects of instructor-led, video-model, and no-instruction control tutorials on the graphing performance of 81 master's students with some reported Microsoft Excel experience. Our mixed-design analysis revealed a statistically significant main effect of pretest, tutorial, and posttest submissions for each tutorial group and a nonsignificant main effect of tutorial group. Tutorial group significantly interacted with submissions, suggesting that both instructor-led and video-model tutorials may be superior to providing graduate students with a written list of graphing conventions (i.e., control condition). Finally, training influenced performance on an untrained graph type (multielement) for all tutorial groups.</p>","PeriodicalId":14983,"journal":{"name":"Journal of applied behavior analysis","volume":null,"pages":null},"PeriodicalIF":2.9000,"publicationDate":"2024-02-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of applied behavior analysis","FirstCategoryId":"102","ListUrlMain":"https://onlinelibrary.wiley.com/doi/10.1002/jaba.1053","RegionNum":2,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"PSYCHOLOGY, CLINICAL","Score":null,"Total":0}
Abstract
Visual inspection of single-subject data is the primary method for behavior analysts to interpret the effect of an independent variable on a dependent variable; however, there is no consensus on the most suitable method for teaching graph construction for single-subject designs. We systematically replicated and extended Tyner and Fienup (2015) using a repeated-measures between-subjects design to compare the effects of instructor-led, video-model, and no-instruction control tutorials on the graphing performance of 81 master's students with some reported Microsoft Excel experience. Our mixed-design analysis revealed a statistically significant main effect of pretest, tutorial, and posttest submissions for each tutorial group and a nonsignificant main effect of tutorial group. Tutorial group significantly interacted with submissions, suggesting that both instructor-led and video-model tutorials may be superior to providing graduate students with a written list of graphing conventions (i.e., control condition). Finally, training influenced performance on an untrained graph type (multielement) for all tutorial groups.
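Below is a minimal sketch of the kind of mixed-design analysis the abstract describes: a between-subjects factor (tutorial group) crossed with a repeated-measures factor (submission: pretest, tutorial, posttest). The simulated data, column names, and the use of pingouin's `mixed_anova` are illustrative assumptions, not the authors' actual analysis pipeline or dataset.

```python
# Hedged sketch of a mixed-design (split-plot) ANOVA like the one described
# in the abstract. All data here are simulated for illustration only.
import numpy as np
import pandas as pd
import pingouin as pg

rng = np.random.default_rng(0)
groups = ["instructor-led", "video-model", "control"]
phases = ["pretest", "tutorial", "posttest"]

rows = []
for pid in range(81):                      # 81 students, 27 per tutorial group (assumed split)
    group = groups[pid % 3]
    for j, phase in enumerate(phases):
        # Simulated graphing-accuracy scores: gains across submissions are
        # larger for the instructor-led and video-model groups than for control,
        # which is the pattern an interaction would reflect.
        gain = j * (12 if group != "control" else 4)
        score = 40 + gain + rng.normal(0, 8)
        rows.append({"participant": pid, "group": group,
                     "submission": phase, "accuracy": score})

df = pd.DataFrame(rows)

# Mixed ANOVA: main effect of submission (within-subjects), main effect of
# tutorial group (between-subjects), and the group-by-submission interaction.
aov = pg.mixed_anova(data=df, dv="accuracy", within="submission",
                     subject="participant", between="group")
print(aov.round(3))
```

In this layout, the interaction term is the quantity of interest: a significant group-by-submission interaction indicates that pretest-to-posttest change differs across tutorial groups, which is the pattern the abstract reports for the instructor-led and video-model conditions relative to the written-conventions control.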