Are your students deeply engaged in maths, and how do you know? In education, engagement is a term used to describe how involved students are in an activity. You might think your students are engaged in a maths task when they appear to be busy working and are ‘on-task’. But true engagement is much deeper – it is ‘in-task’ behaviour, where students are thinking hard and seeing connections between maths concepts and different ways of solving problems across a variety of maths representations and topics.

A maths program developed by the South Australian Department for Education, Thinking Maths, aims to help students do just that. It seeks to improve teachers’ pedagogical content knowledge, particularly effective questioning techniques that encourage students to think about their strategies as they work through maths problems. The program involves 30 hours of face-to-face professional learning for Years 6–9 maths teachers over three school terms, focusing on an inquiry-based teaching approach to build effective questioning techniques and tasks with multiple entry and exit points that cater to different students’ needs. Teachers then commit to applying program ideas in their maths lessons between professional learning sessions.

“I tried the perimeter and area of foot task due to the excellent facilitators’ instruction – they made it clear to put the task into practice straight away.” – Primary school teacher.

On 5 September 2018, Evidence for Learning published the results of a randomised controlled trial of Thinking Maths through its Learning Impact Fund, which identifies, funds and evaluates programs aiming to raise the academic achievement of children in Australia. The evaluation of Thinking Maths was conducted independently by the Australian Council for Educational Research (ACER) from February to October 2017. The trial involved 158 schools, over 7,600 students and 300 teachers in South Australia, making it one of the largest randomised controlled trials (RCTs) in education in Australia. It tested whether students whose teachers received the Thinking Maths professional learning program made greater maths learning gains than a similar comparison (control) group.

Schools are busy places – and evidence can help focus our efforts where it will make the most difference to students. The Learning Impact Fund’s trials take the pressure off teachers and leaders and allow them to focus on applying the best evidence to support student learning. This kind of research, when studied together with schools’ local evidence and the wider evidence base, has great capacity to raise the profession’s use of evidence.

What does the evidence say?  
Effect sizes help determine the magnitude of an intervention’s impact, alongside its statistical significance. Evidence for Learning translates effect sizes into an estimated months’ impact indicator to help teachers understand how many additional months’ progress students can be expected to make as a result of an approach being used in schools. The trial was a mixed-method evaluation (combining qualitative and quantitative methods) and included a process evaluation to understand program fidelity, cost-effectiveness and conditions for success.

Months’ impact
The study found that, overall, Thinking Maths had a small positive impact on student achievement, equivalent to one month’s learning progress. This finding, however, was not statistically significant, meaning we need to treat it with some caution.

What is interesting in the findings is the differential impact between primary and secondary schools. Thinking Maths was effective for primary schools, with an impact of two months’ learning gain for primary students, but it was not effective for secondary schools, where students made two fewer months of learning progress. Barriers such as timetabled lessons, common tests and less accommodating curriculum structures made it more challenging to implement Thinking Maths in secondary classrooms.

Thinking Maths had the largest effect on maths teachers’ pedagogical knowledge, especially for primary school teachers. Students reported that teachers were more likely to give extra help when needed, ask questions to check understanding and challenge their thinking.

Students were more likely to gain higher maths outcomes in primary classes where the teacher reported strong pedagogical and content knowledge.

The increased pedagogical content knowledge, however, did not fully translate into student learning outcomes: only a small positive impact was found on students’ cognitive engagement, and no impact on students’ maths metacognitive strategies. There was also a small increase in students’ maths anxiety, particularly among secondary students. As this is a teacher professional learning program, the flow-on effect to students may take longer to appear than the short post-test timeline allowed. It is promising, however, that there is strong evidence of an impact in primary schools.

Thinking Maths is rated as a low-cost program, estimated at $A149 per student per year (E4L Guidance on Cost Evaluation, Evidence for Learning, 2018). This estimate includes training and materials ($A1070 per teacher, or $A43 per student) and the significant cost of five Temporary Relief Teacher (TRT) replacement days ($A2650 per teacher, or $A106 per student). The estimate assumes training is delivered to teachers of an average class size of 25 students.
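The per-student figures follow directly from dividing the reported per-teacher costs by the assumed class size of 25. A quick sketch of that arithmetic (variable names are illustrative):

```python
# Per-teacher costs reported for the Thinking Maths program (A$)
TRAINING_PER_TEACHER = 1070   # training and materials
TRT_PER_TEACHER = 2650        # five Temporary Relief Teacher (TRT) replacement days
CLASS_SIZE = 25               # assumed average class size

training_per_student = TRAINING_PER_TEACHER / CLASS_SIZE    # 42.8, reported as ~$A43
trt_per_student = TRT_PER_TEACHER / CLASS_SIZE              # 106.0
total_per_student = training_per_student + trt_per_student  # 148.8, reported as ~$A149

print(round(training_per_student), round(trt_per_student), round(total_per_student))
```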

Evidence security
The Thinking Maths trial received a high evidence security rating of four padlocks out of five (one being the lowest and five the highest).

The number of padlocks gives us a sense of how confident we can be in the results (Evidence for Learning, 2018). Each Evidence for Learning trial undergoes rigorous review and transparent reporting throughout the course of the evaluation. The one month’s learning progress finding and the four-padlock rating from this trial tell us the evidence is based on high-quality research.

Context – were the schools in the trial similar to my school?
Thinking Maths was developed by the South Australian Department for Education. The project involved 158 schools in South Australia, most located in metropolitan (63%) and rural (30%) areas. Schools were distributed equally across low, mid and high socioeconomic areas.

What should schools consider from here?
Maths has often been taught as a set of fixed rules and relationships to be learned and applied – where there is one answer to every maths question. In inquiry-based learning, as in the Thinking Maths program, teachers are encouraged to use questions, problems and scenarios to help students learn by asking their own questions and conducting their own inquiry into different ways of solving maths problems. Schools should consider these questions when reflecting on the results:

Thinking Maths is promising for primary schools – what can my school learn from this?
The program delivered two months’ learning progress for primary students, which is promising. Primary schools should consider Thinking Maths as a cost-effective program that increases teachers’ maths pedagogical content knowledge.
Inquiry-based learning is often very challenging, placing new demands on teachers to shift mindsets and classroom cultures towards a more student-centred approach. Teachers often require a wider repertoire of practices, including designing quality tasks with multiple entry and exit points, and the skill to ask the right questions – which requires a high level of content knowledge from teachers.

Schools considering adopting or using the program should consider three things:

  • Is my school’s curriculum structure supportive of this strategy?
  • How will teachers be supported when they implement the program in-between and after Thinking Maths professional learning sessions?
  • How will in-school evaluation assess overall success?

Taking the approach into the classroom requires deep thinking about how the program connects to students’ outcomes. When implementing Thinking Maths strategies, schools should consider how teachers collaborate to plan lessons that develop students’ metacognition (awareness and understanding of one’s own thought processes), one of the strategies used to build teachers’ and students’ effective questioning skills. The research suggests three ways to help students move towards a solution to their maths problem:

  • Plan a strategy to undertake the task (How have I done these problems before?)
  • Monitor the use of the identified strategy to check progress (Did this strategy help me understand the task?)
  • Evaluate the overall success (I need to think about how we have done these before and choose the best strategy.)

“Questioning more often and not jumping in so quickly to help students out. The facilitators have helped me focus on student aversion to problem solving and the need to work on questioning and being ‘less helpful’.” – Primary school teacher.

What do the findings mean for secondary schools?
One of the reasons for the difficulty in implementing Thinking Maths was that secondary classroom structures are often less accommodating and more test-based. Compared to primary classrooms, secondary classrooms tend to encourage more student independence, and often adopt more direct teaching to cover demanding maths concepts. The evaluation found the effects of the intervention narrowed as students progressed through the secondary years from Year 8 to Year 9, with Year 9 students showing the smallest post-test maths attainment gains from the intervention.

Although the program was not effective for secondary schools in its current form, this does not mean the teaching strategies Thinking Maths promotes are not beneficial for learning. Both primary and secondary teachers reported increased pedagogical content knowledge, with some secondary teachers highlighting the need for more time to plan and implement the strategies for their contexts.

“The way I now pitch lessons to create more wonder and thinking by students rather than the teacher doing all the thinking.” – Secondary teacher.

Secondary schools considering implementing Thinking Maths should think about how its strategies would be useful in teaching maths, and how their school structures could support the process.

In a recent meta-analysis, Lazonder and Harmsen (2016) synthesised 72 studies to compare the effectiveness of different types of inquiry-based guidance across age groups. They found that the largest overall effects of guidance were on learning activities (d = 0.66, 95% CI [0.44, 0.88]), performance success (d = 0.71, 95% CI [0.52, 0.90]) and learning outcomes (d = 0.50, 95% CI [0.37, 0.62]). This research tells us that inquiry-based learning works best when teachers provide appropriate tools and guidance to support students within a framework for solving a problem.
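One way to read those confidence intervals: none of them includes zero, so each effect is statistically significant at the 5% level, and the approximate standard error of each estimate can be recovered from the interval width. A minimal sketch, assuming the usual normal approximation (CI half-width = 1.96 × SE):

```python
# Effects reported by Lazonder and Harmsen (2016): (d, CI lower, CI upper)
effects = {
    "learning activities": (0.66, 0.44, 0.88),
    "performance success": (0.71, 0.52, 0.90),
    "learning outcomes":   (0.50, 0.37, 0.62),
}

for name, (d, lo, hi) in effects.items():
    se = (hi - lo) / (2 * 1.96)  # standard error under a normal approximation
    significant = lo > 0         # a 95% CI excluding zero implies p < .05
    print(f"{name}: d = {d}, SE ~ {se:.2f}, significant: {significant}")
```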

Rather than focusing on the lack of evidence for secondary schools, schools should be empowered to consider if inquiry-based learning could be an intervention that they might pilot and innovate in their school. Based on this evaluation, we know that school structure, time and curriculum are factors that schools should think about to provide the best conditions for success to implement these teaching strategies.

Essentially, the Learning Impact Fund builds evidence that helps educators in their daily decision-making. Evaluations of this rigour enable our education systems, and others, to help teachers have confidence in their teaching strategies and, ultimately, to make sure all students make the best possible progress.

High-quality, well-conducted RCTs can show whether an educational intervention is effective (Hutchison & Styles, 2010). The Thinking Maths findings are the first of three independent evaluations to be released over the next three months.

Evidence for Learning 2018, E4L Guidance on Cost Evaluation, Evidence for Learning, available at
Evidence for Learning 2017, The Learning Impact Fund, available at
Hutchison, D, Styles, B 2010, A guide to running randomised controlled trials for educational researchers, NFER, Slough.
Lazonder, AW & Harmsen, R 2016, ‘Meta-analysis of inquiry-based learning: Effects of guidance’, Review of Educational Research, vol. 86, no. 3, pp. 681–718.