
Artificial Intelligence has moved faster than education policy, faster than curriculum reform, and faster than many schools ever anticipated. For many educators, AI arrived in classrooms not as a carefully planned innovation, but as an urgent disruption - raising immediate questions about plagiarism, assessment validity, authenticity, and the future role of teachers.
At St Peter Claver College, we decided early that pretending AI did not exist was not a viable option. Instead, we chose to engage with it deliberately, ethically, and transparently. Since Term 4, 2024, our Year 8-12 students have been part of a Copilot 13+ trial, with Microsoft 365 Copilot deployed to staff in early 2025. What we have learned since then has fundamentally reshaped the way we think about assessment, feedback, differentiation, and student agency.
At the National Education Summit Australia - AI in the Classroom Conference in Brisbane, I will be sharing what this journey has actually looked like in practice - not the marketing version, not the hypothetical future, but the realities of empowering classrooms with Copilot today.
Our Context: Real Students, Real Constraints, Real Decisions
St Peter Claver College is a Brisbane Catholic Education secondary school (Years 7-12) with over 1,120 students, an ICSEA of 1011, and a 1:1 College-managed laptop program operating fully within a Windows and Microsoft environment.
Like many schools, we were already deeply invested in teaching quality, assessment integrity, and student wellbeing when generative AI tools became mainstream. The arrival of Copilot forced us to confront some hard truths:
• AI was already being used by students - just not ethically or transparently
• Existing assessment tasks varied widely in their susceptibility to misuse
• Staff confidence and understanding were highly uneven
• Simply “banning AI” was neither realistic nor educationally sound
Rather than asking “How do we stop this?”, we reframed the question to “How do we teach students to use AI well, responsibly, and in ways that genuinely enhance learning?”
A Necessary Shift: From Policing to Purpose
The introduction of Copilot required what I can only describe as an almost complete U-turn in our language and approach to AI.
Before rollout, staff concerns were familiar and understandable:
• How do we catch students using AI to write entire assignments?
• What does authentic work even look like now?
• Can I turn this off in my classroom?
• How do I monitor AI use?
These concerns were not dismissed - they were addressed head-on.
Before any student access, we undertook:
• A systematic audit of Year 7-10 assessment tasks to identify AI vulnerability
• Exploration of the Perkins, Furze, Roe & MacVaugh (2024) AI Assessment Scale
• Surveys and informal polling to understand actual student behaviour
• Close collaboration with Brisbane Catholic Education ICT and Learning Technologies staff
• Clear communication with parents focused on privacy, safeguarding, and ethical use
Most importantly, we recognised that assessment design, not detection software, would be the decisive factor.
Bringing Staff With Us - Not Dragging Them
Our staff launch was deliberately framed around permission, professionalism, and trust.
We:
• Debunked common myths about AI
• Clearly articulated what was and was not acceptable
• Demonstrated AI limitations and inaccuracies - live
• Explicitly gave staff permission to trial, fail, and learn alongside students
• Challenged avoidance, while protecting professional autonomy
Curriculum Leaders were positioned as lead agents of change, with dedicated time to:
• Explore Copilot classroom use
• Rethink assessment design
• Build confidence before leading departmental conversations
• Share emerging quality practice across faculties
This approach mattered. AI adoption is not a technical problem - it is a cultural one.
Copilot in Practice: What Students and Teachers Are Actually Doing
Today, Copilot is embedded across teaching and learning in ways that prioritise thinking, not shortcuts.
Some examples from our classrooms include:
• Religion & Ethics students using Copilot’s Visual Creator to support sacred storytelling tasks
• Students across subjects using Copilot for feedback against ISMGs and standards elaborations
• Science students interrogating the limitations of AI-generated responses, rather than accepting them
• Students breaking complex concepts into manageable chunks to support understanding
• Copilot as an exam revision and content mastery assistant
• Teachers using M365 Copilot to:
  o Align resources to Australian Curriculum and QCAA objectives
  o Analyse student data to identify learning trends
  o Reduce administrative load and reclaim time for pedagogy
Crucially, Copilot is positioned as a thinking partner, not a replacement for thinking.
The Reality Check: What Hasn’t Magically Disappeared
We are honest about the challenges.
Yes:
• Some students still use other AI tools outside our environment
• Some under-13 students access AI elsewhere
• Some students still attempt to submit AI-generated work
• We do not catch every instance
But what has changed is the nature of the conversation. When issues arise, they are now framed around learning, ethics, and responsibility, not just compliance and punishment.
What We’ve Learned So Far
After more than a year of intentional work, several insights are clear:
• Promoting ethical AI use did not open the floodgates
• There has been no noticeable rise in plagiarism penalties
• Students remain cautious - sometimes overly cautious - even when AI use is approved
• Using Copilot for ideation and feedback is leading to higher-quality student responses
• There is strong evidence of increased metacognition
• Copilot is a powerful differentiation tool
• Assessment design and cycle review remain the most effective safeguards
AI does not weaken learning - poor task design does.
Rethinking Assessment in an AI-Equipped Classroom
One of the most significant shifts has been our move towards an evolving assessment model, including:
• A two-lane approach (assessment for learning and assessment of learning)
• Increased use of portfolio and contextualised assessment
• On-demand writing tasks
• Response authoring over multiple lessons
• Use of Teams assignment templates to support version tracking
Early success in Science, HPE and Media suggests this approach is not only viable, but desirable.
Why This Session Matters
My session, Empowering Classrooms with Copilot 13+, is not about selling a tool or promising silver bullets.
It is about:
• What actually changes when AI is introduced thoughtfully
• How to shift staff culture without losing trust
• What ethical AI use looks like in real classrooms
• How assessment can evolve without sacrificing rigour
• How teachers remain central - not sidelined - in an AI-enabled future
If you are grappling with similar questions, tensions, or uncertainties, I invite you to join the conversation in Brisbane.
Because while AI may be changing education rapidly, teachers still teach, motivate, and inspire in ways AI never will. What Copilot can do is act as the interconnector - between knowledge, understanding, and human expertise - enabling us to deliver more personalised, thoughtful, and impactful learning for every student.
I look forward to sharing the journey - and the lessons - with you.
Shane is speaking at the Brisbane AI in the Classroom Conference on ‘Empowering Classrooms with Copilot 13+: Leading AI Integration at Claver’. View the Full Conference Program Here.