It’s official: our education system has become well and truly politicised. It snuck up on us gradually. First we had state-based achievement tests to monitor our progress. Next, the tests became national so each state and territory could be assessed against common criteria. All this seemed harmless, helpful even, until the final indignity was forced upon us: the publication of these results in a way that allows comparison of schools with little contextual information.

Maybe it was all part of the government’s economic stimulus plan, because it has certainly spawned a whole industry of test-result-enhancing companies. In the last week, I have received four brochures and one email – not to mention having my browsing of the education section of a leading city newspaper interrupted by an advertisement – all promising to improve the National Assessment Program – Literacy and Numeracy (NAPLAN) results of my school and/or child.

Within schools, there is a growing obsession with NAPLAN results, fuelled not by concerns about pedagogy and helping children reach their potential, but by how the results will look on My School. At a recent maths network meeting, I was dismayed, but not surprised, to hear one teacher comment that although she knows NAPLAN is only one test on one day of the year, it is still what her school is judged on, so time must be dedicated to improving scores, even though she knows this is not the right approach to education.

In fact, the “right approach” to education may be changing, and if we become too heavily mired in test-driven competition, we might well miss an opportunity to develop world-leading educational approaches for the 21st century.

One person who subscribes to this view is Mark Treadwell, an independent education consultant with a particular interest in using our knowledge of brain processes to find optimal teaching and learning practices. According to Treadwell, we are at a unique point in history. Just as the printing press revolutionised education in the 1400s, the internet has changed the way we communicate and access information. “Lifelong learning is now critical because the amount of knowledge and new understanding that is being discovered is growing exponentially. We cannot possibly know what students need to have knowledge around, understand, be able to do and be able to apply their knowledge and understanding to in five years’ time, let alone 20 years’ time,” he explains.

This perspective represents what Treadwell terms a “paradigm shift” for education. Teachers are no longer the keepers of knowledge, since knowledge is now freely available on the internet. Instead, they must become facilitators of learning, guiding students to develop the skills and understandings they will need to be successful citizens in the modern world.

Which brings me back to the testing issue. Used diagnostically, tests like those taken as part of NAPLAN can help teachers identify fundamental gaps in knowledge for individual students or for a class as a whole. The NAPLAN writing criteria, for example, form a well-thought-out tool that can inform teaching because they accurately reflect the complex skills that help writers convey messages effectively. Analysing the results can lead teachers to provide engaging learning opportunities that encourage students to explore what constitutes high-quality writing, with a view to improving the content and structure of their own work.

Publishing test results in a way that allows the media, the government, or anyone else with an interest to compare and rank schools is, however, an entirely different proposition. Rob Vingerhoets, an experienced primary school teacher, principal and curriculum coordinator who is now in high demand as a consultant helping schools improve their maths teaching, has strong views on the subject. “To either praise or condemn schools, teachers and students on the strength of a one-off, what you see that day is only what you get that day type test, is simply a very risky thing to do,” he asserts.

Vingerhoets suggests that NAPLAN-style annual assessment should contribute only about 10 per cent to a student’s overall evaluation. “Regular, ongoing type assessments should form about 65 per cent of the total assessment picture, while periodic assessments (for example, at the end of each term) should form about 25 per cent,” he says. This kind of information obviously gives a more dynamic and balanced reflection of student achievement, one that is perhaps not nearly as amenable to publication and comparison, but is certainly both valuable and accurate.

Any number of factors can influence a school’s NAPLAN scores in a given year. Not least of these is the cohort of children, who might be exceedingly bright or unfortunately slow. Currently there is – and arguably should be – a natural variance in the year-to-year results of a school because each year there is a different group of children being tested. But the red-green colour coding of the My School website has neither sympathy for, nor understanding of, children as individuals.

Students sit their first NAPLAN tests in Year 3, when they are typically just eight years old. The reputation of a school is a heavy burden to place on such young shoulders. As the stakes associated with NAPLAN results rise, a damnable by-product of “transparency” in education has been the erosion of confidence in our teaching methods.

In his work with schools in the state, Catholic and independent sectors, Rob Vingerhoets says he has seen “too many teachers being hit over the head with the club called NAPLAN results to know that data-driven, respond-directly-to-the-analysis-of-test-scores teaching is not the way to go.” With a wealth of experience that includes a stint assisting teachers and students in New York City, Vingerhoets knows that good teaching is essential if the deep learning our national curriculum aspires to is to occur. “I firmly believe that if we get classroom practice right the data or results will look after themselves,” he emphasises.

It is certainly true, in theory, that if we draw on sound pedagogy and provide rich learning opportunities for our students, good test results should follow. In practice, the hype around My School has pushed many education providers into a state of panic, narrowing the curriculum to maximise scores rather than learning.

Mark Treadwell cites an interesting hypothesis first put forward by the American Robert K. Branson. Branson’s research suggests that we have reached the upper limit of performance based on book technology, evidenced by the fact that overall test scores around the world have plateaued since the early 1960s. In other words, we have attained the maximum efficiency possible using the knowledge-based style of teaching and learning that has been our mainstay since the advent of the printed book.

Treadwell contends that part of the reason for this is the way our brains are structured. “Only seven per cent of the cells in your brain manage rote learning (neurons) and 76 per cent of them manage the process of forming concepts (astrocytes). Education systems have had to deal with rote learning as it underpins much of the emergent reading and writing process and there is no way around this. However, curriculum in general should be focusing on leveraging the far more efficient concept learning system within our brain...”

The implications of recent research into the brain’s capacity to conceptualise rather than memorise are huge for education. Tests are often a particularly limited form of assessment, relying largely on rote learning. In building a culture that values test results over true understanding and the ability to apply learning in different contexts, we are ignoring not only a whole range of attributes that matter in life outside school, but the very way our brains are wired to operate.

Mark Treadwell believes that learning should be centred on inquiry, which develops key concepts that students will need to be successful lifelong learners. “Inquiry learning is about working as a team to build a knowledge base, to research and apply that knowledge and in the process build understanding and then apply that understanding to a range of different contexts in order to build a conceptual understanding of the big ideas that underpin that learning.”

Inquiry learning is not a new idea. Most Victorian primary schools are working to develop or refine an inquiry approach that addresses important issues such as sustainability, global citizenship and connectedness. But to facilitate learning experiences that truly reflect inquiry practices takes time, a commodity that is in short supply when improving test results is driving the curriculum. We seem to be at a point where two of the main agendas in education – engagement and accountability – are at odds with each other.

If you have ever watched Rob Vingerhoets teach, you will know that engagement is one of the keys to his success. “We can’t just keep teaching kids the way we have for the last number of decades. We have an obligation to engage them in their learning – this means effective use of technology and more importantly, involving them in work that is meaningful, relevant, challenging and enjoyable, not just busy.”

The federal government is certainly not insensible to the changing needs of today’s learners. Its $100 million commitment to connecting schools to the National Broadband Network through the Digital Education Revolution programme shows how seriously it is taking the challenge of capitalising on technology. Indeed, the Department of Education, Employment and Workplace Relations website proclaims that “High speed broadband is the foundation on which information technology can be integrated into our schools, making a new approach to learning and teaching possible.”

Despite the impressive investment in technology infrastructure, the push for accountability through test results is endangering the development of dynamic pedagogy that responds to student needs rather than to the pursuit of higher scores. According to Mark Treadwell, “The difficulty with accountability in education is that no two children have the same capacity for learning, nor do they have the same capacity for learning in the same areas. Benchmarking schools against each other via league tables would only work if every student had the same learning potential.”

In its present form, the My School website does just that. The score allocated to a school in each of the measured areas does not reflect the diverse personalities, backgrounds and talents of the students, nor does it reward innovation and creativity in teaching and learning practices. Worst of all, it doesn’t even meaningfully compare the progress of individual students.

This year will be the first time that longitudinal NAPLAN data will be available on roughly the same cohort of children – the Year 3s of 2008 (the first year of NAPLAN) are now in Year 5, and the Year 7s of 2008 are now in Year 9. Comparing the 2008 and 2010 data might give some insight into the value a school has added for a particular year group of students; however, even this will not be a reliable indicator. Students come and go from schools, and the smaller the school, the greater the impact this has on average results. Throw in the additional complication of selective schools (particularly in NSW, children often move to feeder schools for selective high schools in Year 5, which improves the results of the schools the brighter students swap to, with the opposite effect on the schools they leave) and the significance of comparative data becomes questionable.

It will also be difficult to know what represents acceptable improvement when comparing 2008 Year 3s with 2010 Year 5s. While the distinctive red-green indicators of My School will give an idea as to whether or not a school has improved against its similar school grouping, the aggregated scores reveal little, since the expectations and hence the final scores increase as a child progresses through school.

Yet accountability and sound teaching practice do not have to be incompatible. Both Mark Treadwell and Rob Vingerhoets are firm supporters of the notion of accountability but suggest alternatives to test scores. “As soon as a kid walks through my classroom door I feel accountable for the social, emotional and academic welfare of that child,” Vingerhoets stresses. His maths planning blueprints include setting pre-unit tasks to effectively assess students’ needs before a topic is started. “Use mixed ability teaching for the bulk of the unit, look for links and connections in the maths teaching, facilitate the learning in the classroom and you’re on your way,” he advises.

According to Treadwell, “Most of the testing regimes have been summative and consequently punitive.” He advocates assessment that balances summative, diagnostic, formative and self-reflective elements, with the information merged and presented electronically to parents on an ongoing basis. “Learning management systems should be able to take existing data collection processes and present this to parents/caregivers in a format that allows them to be fully aware of the added value that the school, in conjunction with the parents/caregivers and the learner, are working towards.”

Many schools have developed their own form of digital or paper portfolios as a record of student learning. This type of assessment is valuable because it chronicles each student’s learning journey, comparing a child’s results only with where he or she started. Communicating this on a large scale to reflect the effectiveness of a school would, however, be next to impossible.

So we are stuck, for now, with a system of comparing schools that is hopelessly inadequate in the face of the complex business of education. Schools that choose to play along and expend time and energy on raising NAPLAN scores will perhaps look better on My School, but at what cost? With the government’s publicly popular “back-to-basics” battle cry in the background, we are heading towards dangerous territory.

Will you participate in the political publicity game, or trust that, as Rob Vingerhoets suggests, “the best way to improve test scores is to provide engaging lessons...”?