September in Australia marks the end of the cold winter and brings the promise of better things to come. As the die-hard sports fans feverishly anticipated the long-awaited finals, the die-hard educationalists waited patiently for the release of the National Assessment Program – Literacy and Numeracy (NAPLAN) results, their four-month-long vigil finally at an end.

The 2011 NAPLAN results, as is the custom, were launched amid a flurry of media releases, interviews and statements from the key players on the education field. The media centre website of the Federal Minister for School Education, Peter Garrett, proudly proclaimed that “Nine in ten Aussie kids meet national literacy and numeracy standards”, and many of the state Education Departments ran similar stories lauding their achievements. Statistics, however, while appearing straightforward, can be as difficult to get a firm grasp of as a football in a hailstorm. What follows is an examination of the spin and what NAPLAN results really mean to some of the key players in education.

A cursory look at NAPLAN scores around the country, as revealed in the preliminary 2011 NAPLAN Summary Report, is generally encouraging. On a national level, the 2011 results for Year 3, for example, show an increase in the percentage of students reaching the minimum benchmark compared with the 2008 results in all categories except Writing, which, of course, can be explained by the change in genre this year – more on that later. Mr Garrett is quick to point out these ‘gains’ in his media release, highlighting that “since the NAPLAN tests began we can now see that there have been statistically significant improvements nationally in Year 3 reading, spelling, grammar and punctuation, and Year 5 numeracy.”

Looking at the figures from another angle, however, reveals a whole new story. Most reviews of the NAPLAN data to date have focused on comparing the results of a particular year level, such as Year 3, in each area at each year of testing. While this may give a picture of how literacy and numeracy are progressing on a broad scale, it is a comparison of entirely different cohorts of students. Every school knows that ‘Year 3’ is not a homogeneous breed of student that should yield the same results every year, but that each cohort is unique, presenting its own strengths and challenges. With a sample size as large as Australia’s, it is conceivable that the variance in groups from one year to the next is not as great as within a specific school; nonetheless, it still seems that analysis of year level trend data is not as meaningful as comparing the achievements of the same group of students over time, so let’s do just that.

In 2009, 93.7 per cent of Year 3 students nationally achieved the minimum benchmark for Reading. In 2011, the figure for Year 5, the Year 3s of 2009, is 91.5 per cent, a drop of 2.2 percentage points. This result is not unique to Reading; a comparison of Year 3 2009 results with Year 5 2011 results shows a decrease in the percentage of students achieving the national minimum standard in all areas except Numeracy, which shows a modest gain of 0.4 percentage points. The most dramatic drops in Reading are seen in the Northern Territory (5.9 percentage points) and South Australia (3.6 percentage points), but the trend is evident right across the board. The Year 7 to Year 9 comparison data at a national level shows an even more marked decline over the two-year period, with lower percentages of students reaching the minimum standard in every area.
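To make the cohort comparison concrete, the arithmetic can be sketched in a few lines of Python. The Reading figures are the ones quoted above; the helper function itself is purely illustrative and not part of any official NAPLAN tooling.

```python
# Cohort tracking: compare the percentage of the same cohort meeting the
# national minimum standard at two test points, two years apart.

def cohort_change(pct_earlier: float, pct_later: float) -> float:
    """Change in percentage points for one cohort across two tests."""
    return round(pct_later - pct_earlier, 1)

# Year 3 in 2009 -> Year 5 in 2011, national Reading (figures from the text)
print(cohort_change(93.7, 91.5))  # -2.2 percentage points
```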

Against the trend, a greater proportion of the Year 5 students of 2009 achieved the national minimum standard as Year 7s in 2011 in all categories except Writing. It is an interesting phenomenon that replicates the pattern of 2010, when the Year 5s of 2008 showed improvement almost across the board by the time they reached Year 7, but a smaller proportion of the Year 3 and Year 7 students from 2008 met the minimum benchmarks in 2010 as Year 5 and Year 9 students in almost every category. There are many possible reasons for this. Is the test content better calibrated to some year levels than others? Are the standards appropriate for each NAPLAN testing year? Is there a natural slowing of learning at some stages in a child’s development? It is easy to speculate, but much harder to arrive at any definitive conclusions.

The introduction of persuasive writing to replace narrative in the NAPLAN Writing assessment has brought with it a whole new set of challenges. The Australian Curriculum, Assessment and Reporting Authority’s (ACARA) National Assessment Program website makes it clear that Writing results for 2011 are not comparable with previous years: “Because there is a difference in the way the narrative and persuasive tasks function across year levels, a new and additional NAPLAN scale specifically for persuasive writing has been developed. As this is a different scale, persuasive writing scores should not be compared with narrative writing scores from previous years.” Even so, the fact that only 84.6 per cent of Year 9 students reached the national minimum standard is cause for alarm. It is interesting to note, however, that 95 per cent of Year 3 students and 92.1 per cent of Year 5 students did achieve the minimum benchmark for their year levels. Many teachers were concerned about the ability of younger students to manage the persuasive writing genre, and given that older students have more experience with nonfiction writing, it is surprising that the results of the Year 3 and 5 students were so much more positive. ACARA does point out that these results are preliminary, so perhaps some tweaking of the benchmarks still needs to occur.

Since its beginning in 2008, NAPLAN’s political profile has grown at a rate any aspiring prime minister would envy. Yet, like a politician, NAPLAN seems to say different things depending on whom you talk to. As a counterpoint to Mr Garrett’s “Nine in ten Aussie kids meet national literacy and numeracy standards”, the Shadow Minister for Education, Christopher Pyne, declares in a press release: “After four years of Labor shouting from the roof tops that they would revolutionise education, the NAPLAN results show the basic literacy and numeracy of the Australian school students tested has gone backwards in many areas.” The two statements are perhaps not mutually exclusive, but they certainly represent two different agendas.

State and territory perspectives are also varied. South Australia’s Education Minister Jay Weatherill’s NAPLAN news release describes the state’s results as “relatively consistent with previous years” and says of SA students that “in most measures they are performing at a similar level to their interstate peers.” However, Mr Weatherill concedes that the news is not all good and instead focuses on the measures being put in place to lift student performance. Most notably: “Mandatory minimum teaching times in maths, science and literacy also started in South Australia this year, with Year 4 to 7 students now spending a minimum of two hours a week on science, five hours on maths and five hours on literacy. Year 3 students are spending at least 90 minutes on science, five hours on maths and five hours on literacy.”

The ACT’s Minister for Education and Training Andrew Barr is conspicuously silent on NAPLAN results for a territory that performs so well against the rest of the country. Not so reticent is Victoria’s Minister for Education Martin Dixon, whose press release declares that “Victorian students top the nation in literacy and numeracy tests” – a bold claim from a newly elected representative, considering that Victoria, while in the top three for most categories, bested the ACT and NSW only in Year 5 Numeracy and Year 7 Writing.

The recently elected New South Wales government’s media statement, “National literacy and numeracy results released”, was much more low-key, perhaps reflecting a reluctance to give too much credit to the previous government for NSW’s ranking in the top three jurisdictions in all NAPLAN testing areas. Minister for Education Adrian Piccoli instead drew attention to the current government’s Literacy and Numeracy Action Plan, “including the injection of an additional 900 teachers over five years to boost literacy and numeracy.”

In Tasmania, where the percentage of students attaining the national minimum standard grew in seven categories but decreased in a further nine, the Minister for Education and Skills, Nick McKim, is quick to point out the complexities involved in interpreting NAPLAN data. “If there was a magic wand to improve these figures, it would have been waved years ago,” he explains in a media release. He highlights the move to a four-term year in 2013, extra funding for early years programs and continued funding for the Raising the Bar Closing the Gap program as key strategies to improve student outcomes in the future.

Just how much NAPLAN is shaping our educational direction is a question of concern to many. A current Queensland University of Technology (QUT) study is examining the effect of the testing on teachers and the curriculum. Though it will run over three years, the study, led by QUT’s Professor Barbara Comber, has already found that NAPLAN is having a negative impact on teacher morale, especially in terms of a perceived reduction in autonomy. There are also early indications that teachers are narrowing the curriculum in the early part of the year, right up to test time in May, at the expense of areas such as science and the arts. Astonishingly, ACARA appears to share this view, asserting via the National Assessment Program website: “The new NAPLAN Writing genre was introduced to avoid a narrowing of the curriculum through a disproportionate focus on writing narratives at the expense of other genres.” With the publication of schools’ data on My School, the potential for teacher bonuses to be tied to NAPLAN results and an increasing tendency to use NAPLAN for political leverage, one can reasonably speculate that the situation is unlikely to improve in the foreseeable future.

Despite this, the critics of NAPLAN appear to far outnumber the supporters. The vocal anti-NAPLAN crowd raises objections for a plethora of reasons and from a variety of motives. Independent Schools Queensland Executive Director David Robertson contends in the organisation’s NAPLAN media release: “Test results will never paint a complete picture of the standard of education and the quality of outcomes being achieved by children in independent schools across the entire state.”

Melbourne Graduate School of Education academic Margaret Wu argues: “Teachers and parents should be aware that a student’s NAPLAN score on a test could fluctuate by about ±12% ... remember that NAPLAN results are based on just one single test of limited test length. A sample of 40 questions is not sufficient to establish, with confidence, the exact numeracy proficiency of a student. The same caution applies to all subject areas tested.”
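Wu’s ±12% figure comes from her own analysis of test reliability, but the underlying intuition can be illustrated with a much cruder model. The sketch below assumes – purely for illustration, and not as Wu’s actual method – that each of the 40 questions is answered correctly with some fixed probability, so the percentage-correct score behaves like a binomial proportion.

```python
import math

# Crude illustration of why a 40-item test gives noisy individual scores.
# Assumption (a simplification, not Wu's model): each item is answered
# correctly with probability p, independently of the others.

def score_standard_error(p: float, n_items: int = 40) -> float:
    """Standard error of the percentage-correct score, in percentage points."""
    return math.sqrt(p * (1 - p) / n_items) * 100

se = score_standard_error(p=0.5)  # worst case: about 7.9 points
print(f"1 SE: ±{se:.1f} points; 95% interval: ±{1.96 * se:.1f} points")
```

Even this toy model puts a 95 per cent interval at roughly ±15 percentage points for a student scoring near the middle of the range – the same order of magnitude as Wu’s caution.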

The NSW Minister for Education, Adrian Piccoli, laments: “I remain concerned that while NSW has retained its excellent NAPLAN participation rates, other states and territories are not participating at the same level. This skewed participation affects the mean score comparisons between states, as any student who is exempted is not included when the mean results are calculated.” And researchers Elizabeth Grant and Fiona Mueller criticise the quality of the tests themselves, as well as issues such as “the longstanding absence of a nationally agreed approach to teaching grammar and punctuation.”
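Piccoli’s point is easy to demonstrate with invented numbers. In the sketch below, two jurisdictions have identical underlying achievement, but one exempts its two lowest scorers; every figure is hypothetical, chosen only to expose the mechanism.

```python
# Hypothetical illustration: identical cohorts, different exemption rates.
# Exempted students are simply left out when the mean is calculated.

scores = [380, 420, 460, 500, 540, 580]  # invented scale scores, low to high

mean_all = sum(scores) / len(scores)  # jurisdiction that tests everyone
tested = scores[2:]                   # jurisdiction exempting its two lowest
mean_tested = sum(tested) / len(tested)

print(mean_all)     # 480.0
print(mean_tested)  # 520.0: looks stronger purely through lower participation
```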

The team in support of NAPLAN, however, seems the stronger one at the moment. Imperfect though it may be, NAPLAN – and the manipulation of its results to suit particular needs – is here to stay. The truth is that we do need some sort of measure by which to assess the success of our education system. But while the AFL and rugby grand final winners leave the season victorious, education is not a game, and there are no winners and losers ... are there?

References

ACARA (2011). NAPLAN Summary Report 2011. Retrieved 21 September 2011 from http://www.nap.edu.au/_Documents/National%20Report/2011_NAPLAN_Summary_Report.pdf

Garrett, Peter (2011). Nine in ten Aussie kids meet national literacy and numeracy standards. Retrieved 20 September 2011 from http://www.deewr.gov.au/Ministers/Garrett/Media/Releases/Pages/Article_110909_114913.aspx

Grant, Elizabeth and Mueller, Fiona (2010). NAPLAN fails test. Retrieved 24 September 2011 from http://www.onlineopinion.com.au/view.asp?article=10784

Independent Schools Queensland (2011). NAPLAN Results Show Queensland Students Have Once Again Improved Significantly. Retrieved 24 September 2011 from http://www.aisq.qld.edu.au/files/files/Communications/media_releases/Media%20Release%20-%20NAPLAN%20Results%20Improve%20Significantly.pdf

McKim, Nick (2011). Literacy and Numeracy Report. Retrieved 23 September 2011 from http://www.education.tas.gov.au/dept/news/latest/literacy-and-numeracy-report

Piccoli, Adrian (2011). National literacy and numeracy results released. Retrieved 23 September 2011 from https://www.det.nsw.edu.au/about-us/news-at-det/media-centre/media-releases/national-literacy-and-numeracy-results-released

Pyne, Christopher (2011). NAPLAN reveals Revolution faltering. Retrieved 23 September 2011 from http://www.liberal.org.au/Latest-News/2011/09/09/NAPLAN-reveals-Revolution-faltering.aspx

Queensland University of Technology (2011). Education expert finds NAPLAN affecting learning and teacher morale. Retrieved 23 September 2011 from http://www.qut.edu.au/about/news/news?news-id=37098

Weatherill, Jay (2011). NAPLAN 2011: Results underline importance of education reforms. Retrieved 23 September 2011 from http://www.premier.sa.gov.au/images/stories/mediareleasesSEP11/naplan.pdf

Wu, Margaret (2009). Interpreting NAPLAN Results for the Layperson. Retrieved 24 September 2011 from http://www.appa.asn.au/images/news/naplanforlayperson20091022.pdf