{"title":"在线概念评估中的学生行为与考试安全","authors":"Bethany R. Wilcox, S. Pollock","doi":"10.1119/perc.2019.pr.Wilcox","DOIUrl":null,"url":null,"abstract":"Historically, the implementation of research-based assessments (RBAs) has been a driver of education change within physics and helped motivate adoption of interactive engagement pedagogies. Until recently, RBAs were given to students exclusively on paper and in-class; however, this approach has important drawbacks including decentralized data collection and the need to sacrifice class time. Recently, some RBAs have been moved to online platforms to address these limitations. Yet, online RBAs present new concerns such as student participation rates, test security, and students' use of outside resources. Here, we report on a pilot study addressing these concerns. We gave two upper-division RBAs to courses at five institutions; the RBAs were hosted online and featured embedded JavaScript code which collected information on students' behaviors (e.g., copying text, printing). With these data, we examine the prevalence of these behaviors, and their correlation with students' scores, to determine if online and paper-based RBAs are comparable. We find that browser loss of focus is the most common online behavior while copying and printing events were rarer.We found no statistically significant correlation between any of these online behaviors and students scores. We also found that participation rates for our upper-division population went up when the RBA was given online. 
These results indicates that, for our upper-division population, scores on online administrations of these RBAs were comparable to in-class versions.","PeriodicalId":208063,"journal":{"name":"2019 Physics Education Research Conference Proceedings","volume":"54 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2019-12-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"Student behavior and test security in online conceptual assessment\",\"authors\":\"Bethany R. Wilcox, S. Pollock\",\"doi\":\"10.1119/perc.2019.pr.Wilcox\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Historically, the implementation of research-based assessments (RBAs) has been a driver of education change within physics and helped motivate adoption of interactive engagement pedagogies. Until recently, RBAs were given to students exclusively on paper and in-class; however, this approach has important drawbacks including decentralized data collection and the need to sacrifice class time. Recently, some RBAs have been moved to online platforms to address these limitations. Yet, online RBAs present new concerns such as student participation rates, test security, and students' use of outside resources. Here, we report on a pilot study addressing these concerns. We gave two upper-division RBAs to courses at five institutions; the RBAs were hosted online and featured embedded JavaScript code which collected information on students' behaviors (e.g., copying text, printing). With these data, we examine the prevalence of these behaviors, and their correlation with students' scores, to determine if online and paper-based RBAs are comparable. We find that browser loss of focus is the most common online behavior while copying and printing events were rarer.We found no statistically significant correlation between any of these online behaviors and students scores. 
We also found that participation rates for our upper-division population went up when the RBA was given online. These results indicates that, for our upper-division population, scores on online administrations of these RBAs were comparable to in-class versions.\",\"PeriodicalId\":208063,\"journal\":{\"name\":\"2019 Physics Education Research Conference Proceedings\",\"volume\":\"54 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2019-12-11\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2019 Physics Education Research Conference Proceedings\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1119/perc.2019.pr.Wilcox\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2019 Physics Education Research Conference Proceedings","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1119/perc.2019.pr.Wilcox","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Student behavior and test security in online conceptual assessment
Historically, the implementation of research-based assessments (RBAs) has been a driver of educational change within physics and has helped motivate the adoption of interactive-engagement pedagogies. Until recently, RBAs were given to students exclusively on paper and in class; however, this approach has important drawbacks, including decentralized data collection and the need to sacrifice class time. Recently, some RBAs have been moved to online platforms to address these limitations. Yet online RBAs present new concerns, such as student participation rates, test security, and students' use of outside resources. Here, we report on a pilot study addressing these concerns. We gave two upper-division RBAs to courses at five institutions; the RBAs were hosted online and featured embedded JavaScript code that collected information on students' behaviors (e.g., copying text, printing). With these data, we examine the prevalence of these behaviors, and their correlation with students' scores, to determine whether online and paper-based RBAs are comparable. We find that browser loss of focus was the most common online behavior, while copying and printing events were rarer. We found no statistically significant correlation between any of these online behaviors and students' scores. We also found that participation rates for our upper-division population went up when the RBA was given online. These results indicate that, for our upper-division population, scores on online administrations of these RBAs were comparable to in-class versions.
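The abstract describes embedded JavaScript that logged behaviors such as copying text, printing, and browser loss of focus. The authors' actual instrumentation is not published here; the sketch below is a hypothetical illustration of how such client-side logging could work, using standard browser events (`visibilitychange`, `copy`, `beforeprint`). All function names and the event payload shape are assumptions for illustration only.

```javascript
// Hypothetical sketch of client-side behavior logging, in the spirit of the
// instrumentation the abstract describes. Not the authors' actual code.

// Collects timestamped behavior events in memory; a real deployment would
// periodically send these records to the assessment server.
function makeBehaviorLog() {
  const events = [];
  return {
    record(type) {
      events.push({ type, time: Date.now() });
    },
    summary() {
      // Count occurrences of each event type, e.g. { blur: 3, copy: 1 }
      return events.reduce((counts, e) => {
        counts[e.type] = (counts[e.type] || 0) + 1;
        return counts;
      }, {});
    },
  };
}

// In a browser, wire the log to the behaviors the study tracked:
// loss of focus, copying text, and printing. The document check lets the
// logging logic itself be exercised outside a browser.
function attachListeners(log) {
  const doc = typeof document !== "undefined" ? document : null;
  if (!doc) return; // no DOM available (e.g., headless test): do nothing
  doc.addEventListener("visibilitychange", () => {
    if (doc.visibilityState === "hidden") log.record("blur");
  });
  doc.addEventListener("copy", () => log.record("copy"));
  window.addEventListener("beforeprint", () => log.record("print"));
}
```

Keeping the log separate from the event wiring means the counting logic can be tested without a browser, while the listeners attach only when a DOM is present.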