{"title":"Effective Reporting for Formative Assessment","authors":"Gavin T. L. Brown, T. O'Leary, J. Hattie","doi":"10.4324/9781351136501-9","DOIUrl":"https://doi.org/10.4324/9781351136501-9","url":null,"abstract":"Assessment should have a purpose. As Zumbo (2009) stated, in the context of discussing validity, ‘it is rare that that anyone measures for the sheer delight’ (p. 66) going on to concede that measurement is ‘something you do so that you can use the outcomes’ (p. 66). Within educational contexts, there are many ways testing might be expected to used and improve schooling (Haertel, 2013), as well as many ways users might anticipate using test results (Hopster-den Otter, Wools, Eggen, & Veldkamp, 2016). One key use, perhaps the primary use, of educational assessment is the support of student learning (Popham, 2000). Given such improvement purposes for tests, validity requires that reports on student performance be well aligned to the test (and the test well aligned to the intended curricular goals) and well designed to ensure understanding (Tannenbaum, this volume). In any system that expects teachers to monitor and respond to student learning, teachers are important users of test information. In such systems, the teacher’s role is primarily to mediate test score information into appropriate instructional decisions (e.g., pace of progress, student grouping, task and activity design, selection of curricular resources, etc.). The focus of this chapter is on the communication of test results to teachers in ways that foster interpretations and actions that align with those intended. Shepard (2001, 2006) makes it clear that most educational assessment is carried out in classrooms by teachers and that significant improvements are needed in how testing might continue to play a part in that process. Teachers are expected to make a series of qualitative interpretations about observed student performances, as well as interpretations of test scores (Kane, 2006). These interpretations occur as teachers interact with students in the classroom and are not simply recorded for later interpretation. While modern directions in assessment design focus on ensuring that a robust theory of learning or cognition is present (Pellegrino, Chudowsky, Glaser, & National Research Council, 2001), it seems more appropriate in evaluating test reports for teachers to focus on theories of effective communication and instructional action. Within educational settings, the first goal of a diagnostic test score report should be to ensure that the test reports inform teachers’ decision-making about ‘who needs to be taught what next’ (Brown & Hattie, 2012). Extensive research on feedback (Hattie & Timperley, 2007) shows that 8 Effective Reporting for Formative Assessment The asTTle Case Example","PeriodicalId":308864,"journal":{"name":"Score Reporting Research and Applications","volume":"49 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-08-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121722507","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Applying Learning Analytics to Support Instruction","authors":"Mingyu Feng, Andrew E. Krumm, Shuchi Grover","doi":"10.4324/9781351136501-10","DOIUrl":"https://doi.org/10.4324/9781351136501-10","url":null,"abstract":"This chapter highlights the ways in which learning analytics can be used to better understand and improve learning environments, instruction, and assessment (Siemens & Long, 2011). As a set of approaches for engaging in educational research, learning analytics and educational data mining represent relatively new modes of inquiry. The growth of these approaches maps closely to the availability of new forms of data being collected and stored in digital learning environments, administrative data systems, as well as sensors and recording devices. Moreover, the growth of these fields maps closely onto what the National Science Foundation refers to as “data-intensive research,” which encompasses more than learning analytics and educational data mining to include a broad range of social and physical sciences. As new forms of data have emerged (i.e., transaction level data from digital learning environments as well as digital forms of audio, video, and text) and been collected at ever increasing scales, there has been an explosion of efforts to make use of these data for the purposes of research. By and large, most early work beginning in the mid-2000s was directed at exploring research questions that were tractable within highly structured, well-designed digital learning environments like intelligent tutoring systems (ITS; e.g., Koedinger, Anderson, Hadley, & Mark, 1997; VanLehn et al., 2005). The tight alignment between the learning tasks students were expected to engage in and the data that were collected in these environments made them ideal for exploring not just the outcomes of learning but the various ways in which students engaged in learning activities. A basic insight from these early researchers continues to fuel research and efforts to improve instruction—data on students’ learning processes is as useful and sometimes more so than data on students’ learning outcomes. In this chapter, we expand upon this insight and highlight the ways in which data from digital learning environments, administrative data systems, and sensors as well as recording devices can be used to support instruction in real classrooms by reporting on students’ learning activities through various data products (e.g., dashboards). We do so across four cases that represent varying degrees of proximity to instruction. By highlighting these varying degrees of proximity, we intend to demonstrate the multiple ways in which learning analytics can be used to support instruction. Cases 1 and 2 describe efforts to use learning analytics to support instruction 9 Applying Learning Analytics to Support Instruction","PeriodicalId":308864,"journal":{"name":"Score Reporting Research and Applications","volume":"99 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-08-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127877712","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Validity Aspects of Score Reporting","authors":"Richard J. Tannenbaum","doi":"10.4324/9781351136501-2","DOIUrl":"https://doi.org/10.4324/9781351136501-2","url":null,"abstract":"","PeriodicalId":308864,"journal":{"name":"Score Reporting Research and Applications","volume":"222 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-08-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121272412","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Score Reporting Issues for Licensure, Certification, and Admissions Programs","authors":"Francis O'Donnell, S. Sireci","doi":"10.4324/9781351136501-7","DOIUrl":"https://doi.org/10.4324/9781351136501-7","url":null,"abstract":"","PeriodicalId":308864,"journal":{"name":"Score Reporting Research and Applications","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-08-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129997894","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Score Reports for Large-scale Testing Programs","authors":"S. Slater, Samuel A. Livingston, M. Silver","doi":"10.4324/9781351136501-8","DOIUrl":"https://doi.org/10.4324/9781351136501-8","url":null,"abstract":"","PeriodicalId":308864,"journal":{"name":"Score Reporting Research and Applications","volume":"15 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-08-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115043137","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Reporting Student Growth","authors":"April L. Zenisky, L. Keller, Yooyoung Park","doi":"10.4324/9781351136501-5","DOIUrl":"https://doi.org/10.4324/9781351136501-5","url":null,"abstract":"","PeriodicalId":308864,"journal":{"name":"Score Reporting Research and Applications","volume":"18 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-08-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122295579","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Advances in Cognitive Science and Information Visualization","authors":"M. Hegarty","doi":"10.4324/9781351136501-3","DOIUrl":"https://doi.org/10.4324/9781351136501-3","url":null,"abstract":"","PeriodicalId":308864,"journal":{"name":"Score Reporting Research and Applications","volume":"120 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-08-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127280079","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Subscores","authors":"S. Sinharay, Gautam Puhan, S. Haberman, R. Hambleton","doi":"10.4324/9781351136501-4","DOIUrl":"https://doi.org/10.4324/9781351136501-4","url":null,"abstract":"","PeriodicalId":308864,"journal":{"name":"Score Reporting Research and Applications","volume":"37 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-08-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121844130","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Evaluating Students’ Interpretation of Feedback in Interactive Dashboards","authors":"L. Corrin","doi":"10.4324/9781351136501-11","DOIUrl":"https://doi.org/10.4324/9781351136501-11","url":null,"abstract":"Dashboards have long been used in business and engineering fields to provide users with a consolidated view of data to inform decision making. These decision makers are most often experts in their profession (for example, sales managers in business or pilots in engineering), who bring their expertise into the process of interpreting the data provided through the dashboard view. Dashboards are designed to use data to communicate information about areas that may need attention and action (Few, 2013). The rise of ‘big data’ across many industries has prompted new and innovative approaches to bringing together and displaying this data in ways that are meaningful and informative. With increasing amounts of data being collected about students’ behaviour in learning environments, it is therefore not surprising that the idea of building dashboards to provide an overview of student progress and performance has also become popular in education, sparking a range of dashboard development for students across all stages of education. In the educational context, learning dashboards have been defined as: ‘a single display that aggregates different indicators about learner(s), learning process(es) and/or learning context(s) into one or multiple visualisations’ (Schwendimann et al., 2017). While the majority of dashboards developed in education initially focused on providing information to teachers and administrators, an increasing number of student-facing dashboards are starting to emerge. For students, dashboards provide an opportunity to gain feedback on their learning activities and assessments, providing evidence to inform decisions around how they approach their study. Many universities, schools, learning management system vendors, and other educational technology companies are currently exploring innovative ways to deliver interactive dashboards to students which incorporate useful information displayed in ways that are easily interpretable by students. However, there is an emerging concern about students’ ability to interpret the data provided in dashboards in a way that is beneficial to their learning (Clow, 2013; Corrin & de Barba, 2014; Teasley, 2017). Research into student dashboards, to date, has tended to focus on measuring an increase in grade or a decrease in attrition in cohorts of students who have had access to a dashboard (Arnold & Pistilli, 2012). Other studies have sought students’ opinions about what 10 Evaluating Students’ Interpretation of Feedback in Interactive Dashboards","PeriodicalId":308864,"journal":{"name":"Score Reporting Research and Applications","volume":"444 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-08-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126366041","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Communicating Measurement Error Information to Teachers and Parents","authors":"Diego Zapata-Rivera, P. Kannan, R. Zwick","doi":"10.4324/9781351136501-6","DOIUrl":"https://doi.org/10.4324/9781351136501-6","url":null,"abstract":"","PeriodicalId":308864,"journal":{"name":"Score Reporting Research and Applications","volume":"10 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-08-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125455307","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}