Experts look for utility in NAPLAN

After a year off, NAPLAN is back. Many experts are asking why, but there could be some interesting findings outside of the very similar data that each year's testing has yielded.
May 12, 2021

After a year off, NAPLAN is back. Many experts are asking why, but some concede that there could be value in the data if the right questions are asked.

A comparison between results this year and previous years might be used to determine whether schooling is as effective as it could be.

Professor Jihyun Lee of UNSW Sydney, an expert on large-scale assessments such as NAPLAN and PISA, says, "NAPLAN data for this year will create a valuable opportunity to evaluate schooling.

"If NAPLAN results have gone down this year, it will suggest schooling effectiveness because the disruption last year indeed had negative impact on student learning.

"If NAPLAN results have gone up, Australia may need to consider more rigorous educational policies promoting student-initiatives and parental involvement in the mainstream schooling.

"However, NAPLAN results may turn out to be similar to previous years. Like the average height or weight, population-based testing results do not tend to change much from year-to-year. Thus, similar results should not be taken as evidence of school ineffectiveness."

Professor Pasi Sahlberg, deputy director of UNSW Sydney’s Gonski Institute, isn’t as positive when asked about NAPLAN’s return.

"NAPLAN is desperately outdated and needs to be replaced," he says.

"Debates about standardised testing in many countries have been recharged during the COVID pandemic. Even before that some of the advanced education systems were reconsidering their old national student assessment policies and replacing bureaucratic census-based tests with more intelligent student assessment models that include sample-based tests for system monitoring purposes combined with classroom-based assessment and school-led evaluations.

"In this respect, NAPLAN is desperately outdated and needs to be replaced by a better national assessment system that serves specific purposes that are clearly and purposefully defined. This new national assessment system should also have more elements that would serve student learning and wellbeing, teaching in classrooms, and school improvement."

A criticism of the testing is that it does not assess individual children with high precision. Associate Professor David Curtin of Flinders University, formerly of the Australian Council for Educational Research and the National Centre for Vocational Education Research, says, "School factors only account for 20 per cent of achievement and interventions need to realise that.

"NAPLAN does not assess individual children with high precision. NAPLAN lacks precision for reporting individual student achievement, but when aggregated, does have adequate precision for monitoring purposes.

"School factors account for approximately 20 per cent of the variation between students and non-school factors approximately 80 per cent. Education policy interventions that focus on schools and teachers taps the 20 per cent and ignores the more difficult 80 per cent factors.

"Most variation in student achievement is attributable to non-school factors therefore schools with low average achievement should not be blamed for it and schools with high average achievement should not take the credit for it."

Dr Katina Zammit, deputy dean of the School of Education at Western Sydney University, adds that NAPLAN won’t do anything to improve the education system.

"Achievement in NAPLAN is not going to improve Australia’s education systems, its international standing or instructional practices.

"The OECD's Programme for International Student Assessment (PISA) measures 15-year-olds’ ability to use their reading, mathematics and science knowledge and skills to meet real-life challenges. NAPLAN doesn’t do this so we can’t compare NAPLAN’s version of student achievement with international standards or use it as a benchmark for international rankings. The results for students from low socio-economic backgrounds will be lower than those in more advantaged areas.