Journal: Research in Science Education (Journal Article)
Author: Helen Georgiou
DOI: 10.1007/s11165-023-10129-2
Published: 2023-09-19
Impact Factor: 2.2; JCR: Q1 (Education & Educational Research)
Are We Really Falling Behind? Comparing Key Indicators Across International and Local Standardised Tests for Australian High School Science
There has been a strong narrative in Australia of falling attainment in high school science, much of it informed by results from international standardised tests such as the Programme for International Student Assessment (PISA), which show a year-on-year decline in the scientific literacy of Australian 15-year-old students. These results have been used to justify significant policy and curriculum reform, despite the known limitations of PISA and a lack of corroborating evidence for this decline from other tests. In this paper, results from standardised tests administered in Australia are compared to build a fuller picture of attainment for high school science students. The analysis combines data compiled from existing reports with new analyses. Drawing on the latest (2018/19) reports from PISA, the Trends in International Mathematics and Science Study (TIMSS), and the National Assessment Program for Scientific Literacy (NAP-SL, an Australian test of science literacy), together with data shared by the NSW Department of Education on the ‘Validation of Assessment for Learning and Individual Development’ (VALID) test for the years 2015, 2016, 2017, and 2018, this paper offers the most complete picture of student attainment in science to date. Results show that the tests disagree both on cohort achievement over time and on the distribution of attainment across different ‘proficiency levels’. These findings suggest caution when using key results from these tests to inform policy and pedagogy.
About the journal:
2020 Five-Year Impact Factor: 4.021
2020 Impact Factor: 5.439
Ranking: 107/1319 (Education) – Scopus
2020 CiteScore 34.7 – Scopus
Research in Science Education (RISE) is highly regarded and widely recognised as a leading international journal for the promotion of scholarly science education research that is of interest to a wide readership.
RISE publishes scholarly work that promotes science education research in all contexts and at all levels of education. This intention is aligned with the goals of the Australasian Science Education Research Association (ASERA), the association connected with the journal.
You should consider submitting your manuscript to RISE if your research:
Examines contexts such as early childhood, primary, secondary, tertiary, workplace, and informal learning as they relate to science education; and
Advances our knowledge in science education research rather than reproducing what we already know.
RISE will consider scholarly works that explore areas such as STEM, health, environment, cognitive science, neuroscience, psychology and higher education where science education is foregrounded.
The scholarly works of interest published within RISE reflect and speak to a diversity of opinions, approaches and contexts. Additionally, the journal’s editorial team welcomes a diversity of form in relation to science education-focused submissions. With this in mind, RISE seeks to publish empirical research papers.
Empirical contributions should:
Be theoretically or conceptually grounded;
Be relevant to science education theory and practice;
Highlight limitations of the study; and
Identify possible future research opportunities.
From time to time, we commission independent reviewers to undertake book reviews of recent monographs, edited collections and/or textbooks.
Before you submit your manuscript to RISE, please consider the following checklist. Your paper:
Is no longer than 6000 words, including references;
Is sufficiently proofread to ensure strong grammar, syntax, coherence and good readability;
Explicitly states its significant and/or innovative contribution to the body of knowledge in your field in science education;
Is internationalised, in the sense that your work has relevance beyond your context to a broader audience; and
Makes a contribution to the ongoing conversation by engaging substantively with prior research published in RISE.
While we encourage authors to submit papers of no more than 6000 words, in rare cases where the authors make a persuasive case that a work makes a highly significant original contribution to knowledge in science education, the editors may choose to publish longer works.