Numbers must add up for faith in NAPLAN

PERHAPS it’s because we’ve all been to school that we all have such strong opinions on education.

Whether it’s the debate over private versus public schools, what should and should not be included in the curriculum, or whether Higher School Certificate students are put under too much pressure, few areas of public life can guarantee the same passionate responses.

This week it’s another old chestnut back in the headlines – NAPLAN.

The National Assessment Program – Literacy and Numeracy tests were introduced a decade ago and not a year has gone by since without some controversy.

The perennial argument is over the value of publishing schools’ NAPLAN figures each year, which has inevitably led to the creation of “league-style” ranking tables.

Supporters say that any information is good information, and that parents deserve to know not only how their school is performing but also how it is performing in relation to others in their city or region.

Detractors say the rankings merely pit the “haves” against the “have nots”, simply reflecting the discrepancy in resources from school to school rather than providing any meaningful insight into the relative quality of teaching.

This year’s controversy, though, centres on the actual running of the exams.

One in five schools across the country allowed their students to do the NAPLAN tests online as part of a move to all-online testing down the track.

As the results were collated, though, concerns were raised over the differences in marks between schools that took the tests online and those that used traditional pen and paper.

It appeared students who did the online tests were at a significant advantage, though even those findings have been disputed within the sector.

From the outside it’s impossible to know just how accurate any conclusions might be, but there is no doubting the significance of the debate.

The best schools rely on NAPLAN data to tailor their teaching, so it’s vital that those statistics can be trusted.

Any discrepancies in the figures erode trust in the data, erode trust in the system, and erode confidence in the schools’ responses.

NAPLAN is nothing without the data that comes from it. If this mess isn’t sorted out, then we may as well drop it altogether.