Tags: teaching · differentiated instruction · writing · Australia · AI in education · NAPLAN · teacher workload

Every Student Is at a Different Level. Why Does Writing Class Treat Them the Same?

Australian teachers know every child writes at a different level — but class sizes and time make personalised writing instruction nearly impossible. Here's what the research says, and how AI is starting to change the equation.

kidswriting.ai · 18 April 2026


Walk into any Year 5 classroom in Australia and ask the students to write a persuasive piece on a NAPLAN-style prompt. You'll see something remarkable in its variation. One child writes three confident paragraphs with a clear position and compelling evidence. Another writes four lines. A third writes fluently but jumps between ideas with no structure. A fourth has strong ideas but can't yet control their sentences.

All four are in the same class. All four will receive the same lesson on Monday.

This is the central problem with writing instruction in Australian schools — and it's not the teacher's fault. It's structural.


The Gap Between What Teachers Know and What They Can Do

Every experienced writing teacher knows their students aren't a homogeneous group. They know Sarah needs to work on vocabulary, that Liam has strong ideas but poor text structure, and that Mei's mechanics are excellent but her arguments lack depth.

The research backs up what teachers observe. A 2026 study on differentiated writing instruction found that individual variation in writing proficiency is significant even within a single year group, and that adapting instruction to meet diverse learner needs is essential — not aspirational, but necessary — for genuine improvement.

The problem isn't awareness. It's capacity.

Australian teachers work an average of 44–55 hours per week, well above the OECD average. TALIS 2024 data shows workloads have intensified in recent years, with marking and lesson preparation cited as the biggest time sinks. In that context, asking a teacher to prepare genuinely differentiated writing instruction for 28 students — targeting individual gaps, providing tailored feedback, tracking criterion-level progress — is asking for something that requires more hours than exist in the day.

So teachers do what any reasonable professional under time pressure does: they plan for the middle of the class and hope the extremes catch enough to benefit.


The "Class Lesson" Problem

AERO's School Writing Instruction Framework, released in 2025, represents the state of the art in evidence-based Australian writing instruction. It emphasises explicit teaching, staged scaffolding, quality feedback, and using assessment data to inform targeted teaching.

It is excellent. And largely unimplementable at the individual level in a class of 30.

AERO's framework is designed for whole-school implementation — systematic approaches to lifting writing outcomes across a cohort. What it cannot do, by design, is solve the problem that Liam's weakest criterion is Text Structure while Sarah's is Vocabulary, and that a lesson addressing one does almost nothing for the other.

A February 2025 study examining NSW Stage 1 narrative writing units found that centralised curriculum materials tend toward "strong framing" — uniform delivery that limits opportunities for personalisation. The researchers concluded that without critical adaptation by individual teachers, these materials risk perpetuating writing instruction that is well-designed on average but poorly matched to the specific students in front of you.

And critical adaptation is exactly what teachers don't have time for.


What Actually Works — When You Can Do It

The evidence on what makes writing instruction effective at the individual level is clear. Three things matter most:

1. Criterion-level feedback, not general comments

"Good effort" and "work on your structure" are not useful feedback. What moves students is specific, criterion-level information: which aspect of structure is weak, why it's weak, and what to practise next. Research from a 2024 University of Kansas study on AI feedback tools found that students who received immediate, specific feedback on individual writing criteria showed measurable improvements in performance — particularly in organisation and content development.

The problem: writing this kind of feedback for 28 essays takes hours. Teachers often skip it, simplify it, or write it once for the class.

2. Targeted practice matched to the student's specific gap

A student who struggles with sentence control needs different practice than one who struggles with vocabulary. Giving them both the same worksheet — the same lesson — means at best one of them benefits and the other doesn't.

Self-Regulated Strategy Development (SRSD), one of the most extensively researched writing programs used in Australian schools, explicitly builds in individual goal-setting and targeted strategy instruction based on what each student can already do. Studies consistently show large effect sizes when it's implemented with fidelity.

The problem: SRSD done properly is intensive. It requires knowing each student's current level, setting individual goals, and tracking progress at the criterion level across time. With 28 students, that's 28 individual learning plans.

3. The revision loop — marking, fixing, remarking

Research on writing development consistently finds that the ability to revise based on feedback is one of the highest-leverage skills a student can develop. Students who receive feedback and act on it, then receive feedback on their revised work, improve dramatically faster than those who receive feedback alone.

The problem: in a typical classroom, an essay might be marked once per term. There isn't the time — or the human capacity — to run a rapid feedback loop across an entire class.


Why This Is Getting Worse, Not Better

Class sizes in Australian primary schools are larger than the OECD average. Teacher shortages are worsening in key states. Workload intensification is accelerating.

Meanwhile, writing outcomes are declining. AERO's writing instruction work was itself prompted by NAPLAN data showing a measurable decline in student writing performance between 2011 and 2021. Reversing that trend requires something more than better professional development for an increasingly stretched workforce.

This is the structural bind. The pedagogy has improved. The evidence base is strong. The will is there. What's missing is the capacity to deliver personalised instruction at scale.


Where AI Changes the Equation

Until recently, the only way to give 28 students individualised criterion-level feedback on their writing was to pay 28 tutors or work 80-hour weeks.

AI changes that arithmetic.

The emerging evidence on AI writing feedback is cautiously positive. A 2025 systematic review found that AI-generated feedback can produce significant improvements in student writing — particularly in organisation, vocabulary, and sentence control — when it is specific, criterion-level, and tied to explicit teaching. Importantly, the research notes that AI feedback works best when it functions as a complement to teacher instruction, not a replacement for it.

That aligns with what teachers actually need. Not a tool that replaces their professional judgment, but one that handles the scaling problem — giving every student specific, actionable feedback on their individual weaknesses, rapidly enough to support a revision loop.

What this looks like in practice:

  • A student submits a NAPLAN persuasive essay
  • They receive immediate feedback on each criterion: Audience, Text Structure, Persuasive Devices, Vocabulary, Sentence Structure, Cohesion
  • The system identifies their weakest criterion — say, Cohesion
  • It provides a personalised lesson targeting exactly that gap
  • The student practises, revises, and resubmits
  • The teacher can see, at a glance, where each student in the class is struggling — without marking 28 essays individually

This doesn't solve every problem. Human feedback remains richer than AI feedback, particularly for higher-order qualities like voice and argument. The AI gets the rubric criteria right about 88% of the time, not 100% — teacher review still matters. And there are real risks in over-reliance if students use AI as a shortcut rather than a coach.

But it does solve the specific structural problem that makes personalised writing instruction impossible at scale: the mismatch between what 28 students need and what one teacher can provide.


What This Means for Australian Schools

The question facing Australian schools right now isn't whether to personalise writing instruction — the research is unambiguous that it matters. The question is how.

For most schools, the answer has been "gradually, imperfectly, when we can." Differentiated groupings when time allows. Targeted feedback for the students who need it most. Unit plans written for the middle, with extension tasks for the top and scaffolds for those who need more support.

That's not good enough — and teachers know it. They're not failing students through negligence. They're operating within structural constraints that make genuine personalisation at scale impossible without technological support.

AI writing tools represent the first realistic path out of that bind: not by replacing teachers, but by doing the one thing teachers cannot do — providing criterion-level, personalised feedback and targeted practice to every student, every time they write.

The research says this works. The demand is there. The tool exists.

Now the question is whether Australian schools move quickly enough to use it.


Kids Writing AI provides rubric-based writing feedback aligned to NAPLAN, NSW Selective, VIC Selective, HSC, VCE, and General writing — plus personalised lesson pathways targeted to each student's weakest criterion. Educators can use Class Insights to see criterion-level data across their whole class, generate TLC-aligned lesson plans from marking data, and assign writing tasks directly to students. Start free at kidswriting.ai.
This article was researched and written by the Kids Writing team with AI assistance for structure and drafting. All facts, exam criteria, and recommendations are based on published official sources.
