The Complete Guide to Using AI in the Education Industry in New Zealand in 2025
Last Updated: September 13th 2025
Too Long; Didn't Read:
AI is widely used in New Zealand education - 69% of teachers use it weekly - but GenAI is prohibited in NCEA external assessment. Automated scoring marked over 55,000 responses (May 2025), returning results 3.5 weeks earlier. Schools need professional learning, clear policies and Māori data‑sovereignty safeguards.
AI is no longer a distant policy question in Aotearoa - it's in lesson planning, marking conversations and students' pockets: recent analysis reports 69% of New Zealand teachers use AI weekly while many students use tools with little guidance, so practical action is vital.
The Ministry of Education's generative AI guidance for New Zealand schools urges schools to check AI outputs, protect personal data and make clear policies (noting GenAI is not permitted in NCEA external assessment), while EdTechNZ's review argues the sector still needs tailored governance and teacher support - especially where Māori data and cultural integrity matter (EdTechNZ analysis on AI governance in New Zealand education).
For school leaders and teachers after hands-on skills, Nucamp's 15-week AI Essentials for Work bootcamp (registration and details below) offers prompt-writing and practical AI use across workplace functions to build confidence and classroom-ready practice.
| Attribute | Information |
|---|---|
| Description | Gain practical AI skills for any workplace; learn AI tools, prompts and apply AI across business functions |
| Length | 15 Weeks |
| Courses included | AI at Work: Foundations; Writing AI Prompts; Job Based Practical AI Skills |
| Cost | $3,582 (early bird); $3,942 afterwards - paid in 18 monthly payments, first due at registration |
| Syllabus | AI Essentials for Work bootcamp syllabus |
| Registration | Register for the AI Essentials for Work bootcamp |
Table of Contents
- What is Generative AI and LLMs? A beginner's explainer for New Zealand educators
- Practical classroom use-cases in New Zealand: tutoring, feedback and lesson planning
- Assessment, academic integrity and NCEA: New Zealand guidance and classroom practice
- Building an AI policy for your New Zealand school: governance and privacy
- AI literacy and professional learning for New Zealand educators
- Risks, equity and cultural considerations in New Zealand - Māori data sovereignty and bias
- Tools, pilots and local innovation in New Zealand education
- Practical checklist and NZ-ready sample prompts for teachers
- Conclusion: Next steps for New Zealand schools, leaders and teachers
- Frequently Asked Questions
Check out next:
Find your path in AI-powered productivity with courses offered by Nucamp in New Zealand.
What is Generative AI and LLMs? A beginner's explainer for New Zealand educators
Generative AI - and the large language models (LLMs) that power many chatbots and text generators - are best thought of as super‑powered “autocomplete”: systems trained on huge datasets to produce new text, images, audio or video from a prompt, able to speed up lesson planning, draft feedback and scaffold ideas but also able to “hallucinate” plausible‑sounding errors; in classroom terms it's like a 24/7 study buddy that can confidently tell tall tales unless checked.
New Zealand guidance is clear: treat these tools as aids, not arbiters - always review outputs, avoid entering personal or sensitive data, and recognise cultural and language gaps (models are often weak on Mātauranga and Te Reo Māori).
Practical starting points for teachers are available from the Ministry of Education's generative AI guidance for schools and Ako Aotearoa's beginner's guide to AI literacy for educators, which explain what these tools do, classroom risks (privacy, bias, age limits) and why the teacher's professional judgement remains central - especially because NZQA and the Ministry emphasise that GenAI has limits for assessed work.
| Term | Quick definition / classroom note |
|---|---|
| Generative AI | Creates new content from prompts; useful for ideas and drafts but can be inaccurate. |
| LLMs (e.g., ChatGPT) | Large models that generate text like an advanced autocomplete; verify facts and watch for bias. |
| Key NZ guidance | Check outputs, avoid personal data, and note GenAI is not permitted in NCEA external assessment. |
'Generative artificial intelligence' is a term that covers a range of tools that have been trained using huge sets of data to create new content.
Practical classroom use-cases in New Zealand: tutoring, feedback and lesson planning
AI is already showing up in everyday Kiwi classrooms as a practical toolkit for tutoring, feedback and lesson planning: teachers use chat-based tools to kickstart ideas, build tailored units and even generate near‑instant formative feedback so students don't have to wait until the weekend for marking - one Wellington teacher developed a system that grades NCEA‑style answers against rubrics and directs students to improvements, freeing up teacher time for higher‑value coaching (see the RNZ report on AI use in NZ schools).
Used well, AI can act as a 24/7 study buddy, an “ultimate feedback machine” for quick drafts, or a team‑charter coach that helps groups clarify roles and goals; practical, classroom‑ready prompts and step‑by‑step use cases are laid out in the AI for Education guide, which offers shareable prompts teachers can trial with students.
Cautionary guardrails matter: keep a human‑in‑the‑loop to check hallucinations, respect privacy and cultural integrity, and remember GenAI isn't allowed in NCEA external assessment - the goal is smarter, equitable teaching supported by AI, not replacement of the teacher.
"Whenever I find myself stuck on an assignment and I really can't get my brain to think of what to write or where to start I just put in a prompt and spark some ideas."
Assessment, academic integrity and NCEA: New Zealand guidance and classroom practice
Assessment and academic integrity remain front and centre as schools and kura adapt to GenAI: NZQA and the Ministry of Education both stress that any school with consent to assess must include the acceptable use of AI in an authenticity policy, and that GenAI is explicitly not permitted in NCEA external assessment, so clear local rules and student education are essential - see NZQA's guidance on the acceptable use of Artificial Intelligence and the Ministry's Generative AI guidance for schools.
Practical classroom strategies recommended by NZQA include milestone check‑ins, observed progress, source acknowledgement, follow‑up questioning and careful use of detectors (with caution about false positives), all designed to ensure assessment evidence is genuinely a student's own work.
At the same time NZQA is trialling responsible, human‑centred uses of AI to strengthen assessment practice: its August 2025 report on Embracing AI in student assessments describes an automated text scoring tool that marked over 55,000 Year 10 writing responses in May 2025 - returning results 3.5 weeks earlier while experienced human markers quality‑assured more than a third of those scores - illustrating how AI can speed up feedback without replacing teacher judgement.
For schools, the “so what?” is simple: combine robust policy, student education about academic integrity, and human‑in‑the‑loop checks so AI supports assessment quality rather than undermining it.
| Metric | Value / note |
|---|---|
| May 2025 writing assessments | Over 55,000 responses marked with automated text scoring |
| Time to return results | 3.5 weeks earlier than previous year |
| 2024 pilot | AI tool tested on 36,000 writing samples (found comparable to human markers) |
| Human quality assurance | Experienced human markers double‑checked over one third of May 2025 results |
| Annual AI interactions | Over 250,000 student and customer interactions with NZQA's AI tool each year |
| NCEA external assessment | Use of GenAI is not permitted |
Building an AI policy for your New Zealand school: governance and privacy
Building an AI policy for a New Zealand school starts with simple, practical decisions that protect tamariki, staff and the integrity of qualifications: boards are responsible for a clear authenticity policy that specifies the acceptable use of AI (NZQA requires this for schools with consent to assess), the Ministry of Education urges schools to “talk about it and make a policy” that covers purpose, scope, data privacy and professional learning, and Ako Aotearoa recommends aligning strategy with Te Tiriti and local values so governance is culturally safe.
Practical must‑haves include explicit rules on GenAI in assessment (GenAI is not permitted in NCEA external assessment), data‑minimisation and warnings against entering personal or sensitive information, plus classroom routines - milestones, observed progress, source acknowledgement and follow‑up questioning - to authenticate student work rather than relying solely on detectors.
Treat the policy as a living document: pair clear rules with staff PLD, procurement checks, and whānau communication so the school can seize AI's educational benefits while keeping privacy, equity and assessment authenticity front and centre.
| Policy element | What to include (source) |
|---|---|
| Authenticity policy | Include acceptable use of AI for assessment; required for schools with consent to assess (NZQA) - NZQA guidance on AI in national assessment |
| Data privacy | Prohibit entering personal/sensitive data; follow Privacy Commissioner expectations and Ministry guidance - Ministry of Education generative AI guidance |
| Assessment integrity | State GenAI limits (not permitted in NCEA externals) and use authenticity checks: milestones, observed progress, follow‑up questions (NZQA) |
| Governance & PLD | Align with Te Tiriti, fund staff capability, create living policy documents and procurement checks (Ako Aotearoa) - Ako Aotearoa “Leading AI responsibly” policy guide for education leaders |
“Whose voice, vision, or values are currently shaping our decisions about AI?”
AI literacy and professional learning for New Zealand educators
AI literacy and professional learning for New Zealand educators must be scaffolded, practical and culturally aware: the Scaffolded AI Literacy (SAIL) Framework, developed by the University of Canterbury, academyEX and AUT from a Delphi study of 17 experts, offers exactly that - a four‑level ladder (Know and Understand; Use and Apply; Evaluate and Create; +Beyond AI Literacy) mapped across six categories (Impacts; What AI Is and How It Works; Cognitive Skills; Applied Skills; Social, Cultural and Ethical Issues; Risks and Mitigations) so schools can design PLD that moves teachers and students from basic recognition of AI to critical evaluation and creation.
Practical resources from the SAIL project, including an AI Literacy Design Analyser to check course materials against level 1, make classroom rollout manageable for busy kaiako; explore the full framework at the SAIL report and project pages for examples and tools to adapt.
Pairing SAIL with culturally sustaining PLD pathways - alongside existing Ministry‑funded professional development delivered by institutions like the University of Canterbury and community frameworks such as Tapasā - helps ensure AI capability grows in ways that are locally grounded and equitable, turning abstract policy into classroom tasks teachers can trial next term.
| Component | Summary |
|---|---|
| Levels | Know & Understand; Use & Apply; Evaluate & Create; +Beyond AI Literacy |
| Categories | Impacts; What AI Is; Cognitive Skills; Applied Skills; Social/Cultural/Ethical Issues; Risks & Mitigations |
| Developed by | University of Canterbury, academyEX, AUT (Delphi study of 17 experts) |
| Practical tool | AI Literacy Design Analyser to assess course materials (level 1) |
Preparing learners at all levels to engage constructively with Artificial Intelligence.
Risks, equity and cultural considerations in New Zealand - Māori data sovereignty and bias
Algorithms in Aotearoa risk amplifying historic injustice unless schools, vendors and policy teams treat Māori data as taonga:
researchers warn of “colonising bias” where automated systems reproduce racial, gendered and economic harms, so algorithmic fairness must sit alongside Te Tiriti‑centred governance and meaningful Māori participation; the concept of Māori algorithmic sovereignty (MASov) reframes this by insisting Māori authority (Rangatiratanga), relationships (Whakapapa), reciprocity and redress are embedded across design, data storage and use - including free, prior and informed consent and tikanga‑based protections, not just technical fixes.
Practical steps for education include insisting on onshore storage and co‑designed governance, auditing models for nested bias, and funding Māori capacity to lead AI work; concrete frameworks and principles are laid out in the Māori algorithmic sovereignty paper and related guidance (see the Māori Algorithmic Sovereignty principles paper (Data Science Journal)) and national reviews of Māori data governance.
A vivid wake‑up call: government reporting shows NZ agencies use 26 cloud providers (with Microsoft listed by 37 entities and Amazon AWS by 20), a reminder that where data lives is as consequential as how models are built - schools and kura should therefore pair authenticity and equity checks with procurement, PLD and whānau engagement so AI supports learning without eroding cultural integrity (Māori Algorithmic Sovereignty principles paper (Data Science Journal), State of the Nation Māori Data Sovereignty report (Taiuru)).
| Key risk / metric | Evidence from NZ reports |
|---|---|
| Cloud providers used by NZ Government | 26 |
| Microsoft (government entities using) | 37 |
| Amazon AWS (government entities using) | 20 |
| Local government Māori Data definitions implemented (end 2024) | 0 / 78 |
| CRIs with Māori Data definitions (end 2024) | 0 / 7 |
Tools, pilots and local innovation in New Zealand education
New Zealand classrooms are becoming a lively testbed for tools and pilots that aim to turn AI from a buzzword into practical help: classroom teachers are experimenting with free chatbots (chiefly ChatGPT and Google Gemini) and smaller, locally minded platforms like TeacherGPT that promise NZ curriculum alignment and stronger privacy controls, while research from NZCER maps how teachers use AI mainly for lesson planning, assessment design and personalised materials and urges better funded, privacy‑protected access to premium LLMs (professional learning and prompt-writing PD resources for New Zealand schools and the NZCER study give practical entry points).
Pilots are varied - from school hackathons where students built AI-for-good prototypes (one Westlake Girls' team used AI to speed up disaster coordination) to early marking trials that promise big time savings but also flag reliability and equity challenges - so leaders are rightly cautious about scaling up without robust PLD and governance.
The Ministry's guidance reminds schools to check outputs, avoid feeding personal data into models and embed AI rules in authenticity policies, while sector commentary calls for a national strategy that funds licences, supports teacher capability and keeps cultural integrity front and centre.
The clear “so what?” is this: small, well‑governed pilots plus teacher training can turn scattered experimentation into school‑wide, safe innovation that actually benefits ākonga (New Zealand Ministry of Education generative AI guidance for schools, NZCER generative AI report on teacher use of AI, TeacherGPT practical guide for meeting the Ministry's AI guidelines).
| Measure | Evidence |
|---|---|
| Teachers using AI weekly | 69% (EdTechNZ summary) |
| Teachers without school-funded premium AI | ~75% rely on free tools (NZCER) |
| Teachers wanting more training | 85% (NZCER) |
| Students aware / using AI | ~90% heard of AI; over half have used it (NZCER) |
| Potential grading time reduction | Up to 90% in some AI marking studies (Matadaresearch summary) |
Practical checklist and NZ-ready sample prompts for teachers
Practical, NZ‑ready checklist: start every AI lesson or assessment with clear rules (is this activity AI‑allowed? note that GenAI is not permitted in NCEA externals, and schools with consent to assess must adopt an authenticity policy - see NZQA's guidance). Require students to keep prompt logs and a short note on how they edited AI output (Massey's AI use framework asks for records of tools, prompts and revisions), and trial prompts together in class before setting them as homework so everyone learns how to spot hallucinations and bias. A quick, teacher‑friendly prompt to try is role‑setting - “You are my Year 10 economics tutor: explain X, ask me 2 checking questions, then suggest one application task” - which nudges the AI to teach and probe for understanding.
Pair these steps with explicit teaching on privacy (don't enter personal data) and a short in‑class rubric for students to show what they retained after using AI. For practical prompt banks and classroom examples see the Ministry's Generative AI guidance and the curated AI for Education prompt collection for teachers.
| Checklist item | Action | Source |
|---|---|---|
| Authenticity policy | State acceptable AI use for assessments; milestones & observed progress | NZQA guidance on generative AI and authenticity in NCEA assessments |
| Prompt logs & disclosure | Require appendix listing AI tools, prompts and edits | Massey University guidance on AI usage, prompts, and detection |
| Class trial & template prompts | Test prompts in class, refine, then release as homework templates | New Zealand AI for Education practical prompts and classroom examples for teachers |
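The prompt‑log routine in the checklist above lends itself to a very simple record format. As a minimal sketch, here is one way a class could keep logs in a shared CSV file - the field names and the `prompt_log.csv` filename are this example's own illustrative assumptions, not a format prescribed by Massey, NZQA or the Ministry:

```python
import csv
import os
from datetime import date

# Illustrative prompt-log fields: which tool was used, the prompt entered,
# and a short note on how the student edited the AI's output.
LOG_FIELDS = ["date", "student", "tool", "prompt", "edits_made"]

def append_log_entry(path, student, tool, prompt, edits_made):
    """Append one prompt-log row to a CSV file, writing the header if the file is new."""
    new_file = not os.path.exists(path)
    with open(path, "a", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=LOG_FIELDS)
        if new_file:
            writer.writeheader()
        writer.writerow({
            "date": date.today().isoformat(),
            "student": student,
            "tool": tool,
            "prompt": prompt,
            "edits_made": edits_made,
        })

# Example entry using the role-setting prompt pattern from the checklist.
append_log_entry(
    "prompt_log.csv",
    student="Student A",
    tool="ChatGPT",
    prompt="You are my Year 10 economics tutor: explain inflation, "
           "ask me 2 checking questions, then suggest one application task",
    edits_made="Rewrote the explanation in my own words; kept one checking question",
)
```

A spreadsheet or paper appendix works just as well; the point is that every assessed piece of work carries a record of tools, prompts and revisions that a teacher can follow up on.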
"I asked ChatGPT to turn my lesson plan bullet points into a first draft - it saved me 30 minutes and gave me a new idea for an activity."
Conclusion: Next steps for New Zealand schools, leaders and teachers
Now is the moment for schools, leaders and kaiako to move from debate to delivery: update and publish clear authenticity and data‑privacy rules that reflect the Ministry's generative AI guidance, fund targeted professional learning and small, well‑governed pilots that protect Māori data sovereignty, and make human‑in‑the‑loop assessment the default so teachers remain mentors not screen supervisors; practical action can be simple and local - a staged policy review, mandatory prompt‑logs for assessed work, a term‑long pilot with school‑approved tools and whānau consultation - and should be coordinated with national signals such as the Government's AI Strategy and sector calls for tailored guidance.
For leaders who need immediate staff capability, consider short, applied courses that teach prompt design, risk mitigation and classroom-ready workflows (for example Nucamp's Nucamp AI Essentials for Work bootcamp), and lean on sector analysis and resources to design equitable rollouts (see the EdTechNZ analysis of AI in education and the New Zealand Ministry of Education generative AI guidance).
In short: protect learners and cultural integrity, build teacher confidence with funded PLD and prudent pilots, and treat AI policy as a living document so New Zealand classrooms can harness AI's promise without sacrificing authenticity or wellbeing.
| Next step | Action / resource |
|---|---|
| Policy & assessment | Adopt Ministry/NZQA‑aligned authenticity rules and prompt logs - see Ministry guidance: New Zealand Ministry of Education generative AI guidance |
| Professional learning & pilots | Fund short PLD, run small pilots with whānau input - guided by sector reviews such as EdTechNZ analysis of AI in education |
| Practical upskilling | Give teachers hands‑on prompt and workflow training (e.g. Nucamp AI Essentials for Work: Register for the Nucamp AI Essentials for Work bootcamp) |
“If we don't teach children to question AI, they will learn by trial and error.”
Frequently Asked Questions
What is generative AI / large language models (LLMs) and how should New Zealand teachers use them in the classroom?
Generative AI and LLMs are systems trained on large datasets to produce new text, images, audio or video from a prompt - think of them as super‑powered autocomplete. In classrooms they can speed up lesson planning, draft feedback, scaffold ideas and act as a 24/7 study buddy, but they can also "hallucinate" plausible‑sounding errors and display cultural or factual gaps (often weak on Mātauranga and Te Reo Māori). New Zealand guidance recommends treating these tools as aids, not arbiters: always review outputs, avoid entering personal or sensitive data, include a human in the loop for verification, and embed school rules about acceptable use.
Is generative AI allowed in NCEA assessment and how should schools manage academic integrity?
GenAI is explicitly not permitted in NCEA external assessment. Schools with consent to assess must adopt an authenticity policy that specifies acceptable AI use (NZQA requirement). Practical authenticity measures include milestone check‑ins, observed progress, source acknowledgement, follow‑up questioning and careful, cautious use of AI detectors (they can give false positives). NZQA has also trialled responsible uses of AI: in May 2025 an automated text scoring tool marked over 55,000 Year 10 responses, returning results 3.5 weeks earlier while experienced human markers quality‑assured more than one third of those scores - illustrating how human‑in‑the‑loop systems can speed feedback without replacing teacher judgement.
What governance, privacy and Māori data sovereignty steps should school leaders take when adopting AI?
Boards should publish clear, living AI policies covering purpose, scope, data privacy and professional learning; the Ministry urges schools to "talk about it and make a policy." Key actions: prohibit entering personal/sensitive data, minimise collected data, require procurement checks and vendor commitments on onshore storage where possible, align policy with Te Tiriti and local values, and fund Māori participation and capacity. Consider Māori algorithmic sovereignty principles (Rangatiratanga, Whakapapa, reciprocity, redress). National evidence shows NZ agencies use 26 cloud providers (Microsoft used by 37 entities, Amazon AWS by 20), and by end‑2024 many local and research organisations had not yet implemented Māori data definitions (e.g., 0/78 local govt, 0/7 CRIs), underscoring the need for co‑designed governance and onshore data protections.
What professional learning, tools and statistics should schools consider when planning AI upskilling?
AI literacy should be scaffolded and practical. The SAIL Framework offers four levels (Know & Understand; Use & Apply; Evaluate & Create; +Beyond AI Literacy) across six categories to design PLD. Local pilots and sector resources (Ministry, Ako Aotearoa, NZCER) are useful starters. Current sector metrics: around 69% of NZ teachers use AI weekly, roughly 75% rely on free tools, 85% want more training, and about 90% of students have heard of AI with over half having used it. For hands‑on courses, short applied programmes (for example a 15‑week Nucamp programme) teach prompt design and classroom workflows - typical published course details include: length 15 weeks and example pricing of $3,582 (early bird) or $3,942 thereafter with options like 18 monthly payments and the first payment due at registration - though schools should confirm current fees directly with providers.
What practical classroom checklist and sample practices should teachers use to adopt AI safely?
Start every AI activity by asking: is AI allowed for this task (remember GenAI is not permitted in NCEA externals)? Required routines: 1) publish authenticity rules and notify whānau; 2) require prompt logs and a short appendix noting tools, prompts and edits (Massey and sector frameworks recommend this); 3) trial and refine prompts in class before assigning as homework; 4) use milestone check‑ins, observed progress, source acknowledgement and follow‑up questioning to verify learning; 5) teach privacy rules (do not enter personal/sensitive data) and how to spot hallucinations and bias. A classroom‑ready prompt example: "You are my Year 10 economics tutor: explain X, ask me 2 checking questions, then suggest one application task."
You may be interested in the following topics as well:
Discover the impact of GPU-accelerated research infrastructure on faster model training, unlocking efficiencies for tertiary research and development.
Understand how e-learning content authors can transition from slide factories to designers of authentic, workplace-integrated learning experiences.
See how rapid formative feedback gives students timely, scaffolded edits and revision steps at scale without replacing teacher judgment.
Ludo Fourrage
Founder and CEO
Ludovic (Ludo) Fourrage is an education industry veteran, named in 2017 as a Learning Technology Leader by Training Magazine. Before founding Nucamp, Ludo spent 18 years at Microsoft, where he led innovation in the learning space. As Microsoft's Senior Director of Digital Learning, he led the development of the first-of-its-kind 'YouTube for the Enterprise'. More recently, he delivered one of the most successful Corporate MOOC programs in partnership with top business schools and consulting organizations, including INSEAD, Wharton, London Business School, and Accenture. With the belief that the right education for everyone is an achievable goal, Ludo leads the Nucamp team in the quest to make quality education accessible.

