I. Tsaousis, G. Sideridis, & Fahad Al-Saawi
International Journal of Testing, 18(1), 1–26. Published 2018-01-02. DOI: 10.1080/15305058.2017.1345914
Differential Distractor Functioning as a Method for Explaining DIF: The Case of a National Admissions Test in Saudi Arabia
The aim of the present study was to examine Differential Distractor Functioning (DDF) as a means of improving the quality of a measure through understanding biased responses across groups. A DDF analysis can shed light on potential sources of construct-irrelevant variance by examining whether incorrect choices (distractors) attract different groups in different ways. To examine possible DDF effects, a method introduced by Penfield (2008, 2010a), based on odds ratio estimators, was utilized. Items from the Chemistry subscale of the Standard Achievement Admission Test (SAAT) in Saudi Arabia were used as an example. Statistical evidence for differential item functioning (DIF) existed for five items, at either moderate or strong levels. Specifically, three items (items 45, 54, and 61) reached category B levels (moderate DIF), and two items (items 51 and 60) reached category C levels (strong DIF), based on Educational Testing Service guidelines. These items were then examined more closely for DDF in an attempt to understand the causes of DIF and group-biased responses. The manuscript concludes with a series of remedial actions, based on distractor-relevant information, aimed at improving the psychometric properties of the instrument under study.
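The core idea behind odds-ratio-based DDF can be illustrated with a minimal sketch. This is not Penfield's full estimator from the paper, only a hypothetical simplification: among examinees who answered an item incorrectly, it computes a Mantel-Haenszel-style common odds ratio of selecting one particular distractor (versus any other incorrect option) for the focal relative to the reference group, stratified by a total-score stratum as an ability proxy. The function name and data layout are illustrative assumptions, not from the original study.

```python
from collections import defaultdict
from math import log

def ddf_log_odds_ratio(responses, groups, distractor, strata):
    """Sketch of a stratified log-odds ratio for one distractor.

    responses  -- incorrect option chosen by each examinee (correct
                  responders are assumed to be excluded beforehand)
    groups     -- "focal" or "reference" for each examinee
    distractor -- the incorrect option being tested for DDF
    strata     -- total-score stratum (ability proxy) per examinee
    Returns log(OR): > 0 means the distractor attracts the focal
    group more strongly than the reference group.
    """
    # Per-stratum 2x2 counts: [a, b, c, d]
    #   a = focal chose the distractor,     b = focal chose another option
    #   c = reference chose the distractor, d = reference chose another option
    cells = defaultdict(lambda: [0, 0, 0, 0])
    for resp, grp, s in zip(responses, groups, strata):
        hit = (resp == distractor)
        if grp == "focal":
            cells[s][0 if hit else 1] += 1
        else:
            cells[s][2 if hit else 3] += 1

    # Mantel-Haenszel pooling: sum(a*d/n) / sum(b*c/n) over strata
    num = den = 0.0
    for a, b, c, d in cells.values():
        n = a + b + c + d
        if n:
            num += a * d / n
            den += b * c / n
    return log(num / den)
```

For example, with a single stratum in which 6 of 10 focal examinees but only 3 of 10 reference examinees pick distractor "A", the odds ratio is (6 × 7) / (4 × 3) = 3.5, so the log-odds ratio is positive, flagging "A" as differentially attractive to the focal group. A real analysis would also need a standard error for this estimate to classify effects against guidelines such as the ETS A/B/C categories.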