This year marks the end of an era for me as my youngest child has just been through NAPLAN testing for the last time. His results were a lot better than I expected, based on his description of what he wrote in response to the writing prompt. But it turns out that I needn’t have given them a second thought anyway – according to the ‘What impact will the results have on my child’s future?’ section of the official letter that accompanied his report, “…NAPLAN tests are ‘low stakes’ for students as there are no rewards or punishments for the result a student achieves” (ACARA, 2015). What a relief – I can sleep soundly again!
Because if you believe the media and our politicians, NAPLAN tests are anything but low stakes. In the weeks since this year’s NAPLAN statistics were released, there has been a barrage of incendiary headlines about the testing and what it means – “NAPLAN: NSW ‘10 to 15 years’ behind the world’s best” (Smith, 2015), “Teachers sound alarm on NAPLAN, Gonski” (Henebery, 2015), “NAPLAN: Shakespeare would have failed the Year 9 literacy test” (James, 2015) and “Child abducted by aliens gets perfect NAPLAN score”. Obviously I made that last one up, but the hysteria surrounding NAPLAN, and what it might and does mean, is very real.
How did we get to a point where NAPLAN has become the measure of a school’s worth? The short answer is politics. The intent behind NAPLAN was ostensibly honourable – for the first time we would have a national benchmarking tool to help our disparate education systems identify areas for school improvement focused on students’ literacy and numeracy achievements. It was conceived by the Howard Government, with then Federal Education Minister Julie Bishop obtaining agreement from the states and territories for common testing, and while the first assessments were developed in 2007, it was the Rudd Government that administered them in 2008. The concept of standardised testing was not new – it already existed in different forms across the jurisdictions – but this was the first time that students across the country were assessed using the same instruments.
The original report released by the Australian Curriculum, Assessment and Reporting Authority (ACARA) in the same year assured us that, “By locating all students on a single national scale, which maps the skills and understandings assessed, each scale provides significantly more information about the literacy and numeracy achievement of students than was previously available” (ACARA, 2008). And at a system level, this was undoubtedly true. The report revealed startling differences in achievement levels from one jurisdiction to another, refocusing attention in states such as Queensland on the critical task of improving student performance. As the ACARA website puts it, “At the system level, the NAP provides education ministers with information about the success of their policies and resourcing in priority curriculum areas.”
From the beginning, however, at an individual student level, and even to some degree at a school level, the results have had limited value. In large part, this was, and still is, due to the time lag of around four months between when students sit the tests and when the results become available. While the indicative achievement levels of students may still be broadly relevant, one would hope that after more than a term the specifics of what students do or do not know would no longer be current. I would also contend that the general achievement levels of specific students come as no surprise to teachers, parents, and most likely the students themselves, all of whom are well aware, without NAPLAN telling them, who is likely to perform well in a particular area and who is likely to struggle. In this respect too, the value of the results is limited.
One of the other major limitations of NAPLAN is that most schools cater for either Foundation to Year 6 or Years 7 to 12, and therefore the data in any one year is relevant to only two of their six or seven year levels. It might give a broad picture of how the school is performing in the areas assessed, but NAPLAN does not take into account the natural variance in cohorts from one year to the next. So while over time the trend data might help a school identify a weakness or a strength in numeracy, for example, care must be taken not to make schoolwide decisions based on the results of a single cohort in a single year. Indeed, with the growing trend towards effective data collection and use, schools that have robust assessment programs across the board will already be aware of any areas requiring attention; NAPLAN simply confirms this.
It was with the launch of the My School website in 2010 that the hype surrounding NAPLAN really escalated. Championed by then Education Minister Julia Gillard, the website promised transparency and unprecedented information to help parents make informed decisions about their children’s education.
From the beginning it proved controversial, with media outlets rushing to construct league tables showing the best and worst schools in the country. Of course, even with the complicated Index of Community Socio-Educational Advantage (ICSEA) used to compare ‘like’ schools, what was already obvious to most was made more obvious: the highest-performing schools generally came from areas with higher socio-economic profiles, and vice versa. Anecdotal reports indicate that the impact of the adverse media attention on struggling schools was in some cases devastating. The My School rankings failed to recognise what any good teacher is at pains to acknowledge – improvement – and schools that had started from a very low base were still identified publicly as low-performing, even if they had made remarkable gains.
Parent interaction with the website is interesting to contemplate. While the red and green colour coding against ‘like’ schools is designed to make interpretation of a school’s performance simple, in reality a series of complex factors underpins these results. This is evident on the My School website itself, which, in addition to a general fact sheet (available in 21 community languages) and an FAQ page, offers no fewer than 15 other reports and explanatory sources in its ‘More information’ section. These include detailed guides to understanding student gains and the ICSEA, as well as information about interpreting school financial information and NAPLAN results. That the website is trying to give a broader picture of schools than a NAPLAN snapshot is laudable, but it is doubtful that the majority of the site’s users have more than a rudimentary understanding of what they are reading.
High-stakes tests can be defined in the following way: “In general, ‘high stakes’ means that test scores are used to determine punishments (such as sanctions, penalties, funding reductions, negative publicity), accolades (awards, public celebration, positive publicity), advancement (grade promotion or graduation for students), or compensation (salary increases or bonuses for administrators and teachers)” (Concepts, 2013). My School, along with the increased media focus on NAPLAN results, is in large part responsible for pushing NAPLAN into the high-stakes realm.
Theoretically, NAPLAN testing for individual students should, as ACARA asserts, be low stakes. In reality, there is mounting evidence that the tests are causing undue stress for students. A Sydney Morning Herald article in April this year, one of several similar pieces, reported that many psychologists are experiencing higher-than-usual demand in the lead-up to NAPLAN testing as “students as young as eight are getting so anxious they are vomiting” (Bagshaw, 2015). Schools that put undue focus on NAPLAN preparation, rather than devoting their energy to evidence-based teaching and learning practices, are surely sending a message that what happens in these tests really matters.
The burgeoning market for NAPLAN preparation materials is also testimony to the fact that, despite the good intentions, NAPLAN is being seen by parents and students as more than an everyday indication of where students and schools are at. Certainly the assessments have punishments and accolades attached to them in the form of negative or positive publicity for schools, and requests by secondary schools for the NAPLAN results of prospective students can add to the pressure on individuals, even if these are not necessarily used as a basis for entry. Possibly some of the hysteria stems from modern parenting styles that seek to protect children from any form of adversity, but even so, the anxiety that NAPLAN can cause its key stakeholders – students, parents and schools – indicates that the tests have evolved beyond the realm of low stakes.
One of the glaring conundrums with NAPLAN, and one that causes if not stress then at least exasperation for teachers, is that it is not linked to the Australian Curriculum. Instead, the tests are tied to a series of ‘Statements of Learning’, “a description of knowledge, skills, understandings and capacities that all students in Australia should have the opportunity to learn” (Curriculum Corporation, 2005). It would be far easier for teachers to feel confident that they are covering all the required content for NAPLAN in the normal course of teaching if the assessments reflected what they are mandated to teach. Despite promises over the years to rectify this, these two fundamental segments of our education system remain disconnected.
There are, however, some enhancements planned for NAPLAN, most notably the move to offering the assessments online. The online version seeks to deliver adaptive testing: after an initial set of common questions, students will be offered questions better tailored to their current abilities. These targeted tests should produce more accurate data that can be used to drive teaching. The other main advantage, of course, is that results will be available much faster, so that there is greater potential for diagnostic use at an individual, class and school level.
Most of this seems to be good news, but for me the worrying part is the expectation that writing will be marked by computer. A few years ago, in response to the writing prompt ‘Choose a hero who you think deserves an award’, my daughter wrote in great detail about NAPLAN markers. She discussed how they have to attend long training sessions with unappetising lunches, how they have to know the correct use of the word ‘myriad’, and finished by urging the person marking her paper to turn around and give themselves a pat on the back.
In short, she knew her audience! And she scored very well. I wonder if a computer could appreciate the complexity of what she achieved, introducing irony and humour in a way that takes an insider to understand. There is little doubt that a computer could more accurately and quickly assess the quantifiable aspects of writing, such as spelling errors, but I doubt even a very well-programmed machine could appreciate the subtleties of human communication.
And it is the human element that is the most critical in education. New research shows that the relationships teachers are able to build with students play an essential role in student success. These connections are built not through test preparation, but through an appreciation of each child as a whole: what they have to offer not just in terms of academic achievement, but as a human being. On top of the official explanation of the NAPLAN 2015 Student Reports that came with my son’s results this year was a letter from his school, the Plenty Campus of Ivanhoe Grammar, that captures the importance of this perfectly:
“We would like you to see the results of these tests as a small part of the full picture of your Ivanhoe child. What they do not assess is what makes your child special and unique. They do not know that your child plays a musical instrument, kicked the winning goal, joined the choir, grew from a ‘D’ to a ‘C’ in Maths, is kind and respectful, tries hard and reaches high …So these results are just one snapshot of what your child could do at one particular moment in time with regard to specific tasks.”
The undue attention given to NAPLAN at the individual and school level by politicians and the media loses sight of this fundamental truth. It undermines the potential value of NAPLAN as simply a barometer for our nation’s education system and as one of many tools that may help identify the next steps in a school’s or a student’s learning journey.

Further reading
ACARA. National Assessment Program – Literacy and Numeracy: Achievement in Reading, Writing, Language Conventions and Numeracy. 2008. Web. 20 Sept. 2015.
ACARA. NAPLAN 2015 Student Reports. 2015. Print.
Bagshaw, Eryk. ‘NAPLAN: Parents and Teachers Urged to Calm Students Down’. Sydney Morning Herald 2015. Web. 20 Sept. 2015.
Concepts, Liberty. ‘High-Stakes Test Definition’. The Glossary of Education Reform. N.p., 2013. Web. 20 Sept. 2015.
Curriculum Corporation. Statements of Learning for English. Carlton South: Curriculum Corporation, 2005. Print.
Henebery, Brett. ‘Teachers Sound Alarm on NAPLAN, Gonski’. The Educator 2015. Web. 12 Sept. 2015.
James, David. ‘NAPLAN: Shakespeare Would Have Failed the Year 9 Literacy Test’. The Age 2015. Web. 12 Sept. 2015.
Smith, Alexandra. ‘NAPLAN: NSW “10 to 15 Years” Behind the World’s Best’. Sydney Morning Herald 2015. Web. 12 Sept. 2015.