My thoughts on the new NAPLAN reports

Jul 12, 2023

NAPLAN results are dribbling through at the moment, so I wanted to share some thoughts on the new reporting system from a parent and educator perspective.

As you may know, the 10 bands that were previously used to report student achievement in NAPLAN have been simplified into 4 proficiency 'levels'. These are categories that have been created to help parents better 'understand' the results.

The 10 bands were always, in my opinion, challenging for parents (and teachers) to understand. They placed all students sitting NAPLAN (students in Years 3, 5, 7 and 9) on the same scale, with each year level reported across 6 of those bands. This was confusing for parents. The 4 proficiency levels now relate only to one year level. In the past we could see if a Year 5 student was working at a Year 7 level; now we can only see that they are exceeding Year 5 expectations.

However, the language is open to interpretation. The 'Strong' and 'Exceeding' levels mean students are where they should be in relation to their year level, while 'Developing' and 'Needs additional support' mean they are performing below year level expectations. Yet if we look at the 'Strong' category, I think it can give parents an inaccurate sense of where their child is sitting. It appears that a child can be 'Strong' yet still be below the National average (scroll down and watch the video on this page from ACARA, which shows a sample of the reports). I wouldn't consider a child sitting below the National average to be 'strong' in Numeracy or any area.

Interestingly, ACARA have ‘moved the goal posts’ this year by setting new standards for achievement. Because the standards have changed, we can no longer compare student results from any of the previous NAPLAN cycles (2008-2022) with the 2023 results. There are a few reasons for this change.

One is that the tests are now completely online. This means they are adaptive (I know for many this was the case last year also), which should make the student results more accurate. Instead of all Year 3 students being presented with the same items, the items adapt according to the answers students give. For example, if a student answers some 'hard' items correctly in a 'testlet', they will be presented with more challenging questions to more accurately determine their level of understanding. In theory this should increase the accuracy of the NAPLAN results, particularly for students at the higher and lower ends of the scale.
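For readers who like to see the branching idea in concrete terms, here is a minimal sketch of how a testlet-based adaptive test can route students. This is purely an illustration, not ACARA's actual test engine: the testlet names, thresholds, items and scoring rules below are all invented for the example.

```python
# A simplified, invented illustration of multistage adaptive (branching) testing.
# It is NOT ACARA's engine: testlet names, thresholds and items are made up.

def score_testlet(answers, answer_key):
    """Count how many items in a testlet were answered correctly."""
    return sum(a == k for a, k in zip(answers, answer_key))

def next_testlet(score, num_items):
    """Choose the next testlet based on performance on the current one."""
    proportion = score / num_items
    if proportion >= 0.8:      # answered most items correctly -> harder questions
        return "harder testlet"
    if proportion <= 0.3:      # struggled with this testlet -> easier questions
        return "easier testlet"
    return "similar-difficulty testlet"

# Example: a student answers 5 of 6 items in the first testlet correctly,
# so they are routed to more challenging questions next.
student_answers = ["A", "C", "B", "D", "A", "B"]
answer_key      = ["A", "C", "B", "D", "A", "C"]
score = score_testlet(student_answers, answer_key)
print(score, "->", next_testlet(score, len(answer_key)))  # 5 -> harder testlet
```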

As such, it is better to have a new scale on which accurate comparisons can be made. The two-month difference in timing also needs to be taken into account: students sat NAPLAN in March this year rather than in May. If true comparisons are to be made, we need to compare apples with apples. Thus the results have been reset.

While I understand from a psychometric perspective why this needs to happen, it raises a real issue for schools (more so than parents). For two years schools won't be able to calculate the 'value added'. Value added is where we look at the gains our students have made over the two years since they last took NAPLAN. When schools look at the 'value added', we can compare the gains and see if our instruction/teaching has helped students and year level cohorts to make the expected level of improvement over the two years. This is a much better measure than comparing year level cohorts. For example, the 2023 Year 3s at a school may be completely different to the 2024 Year 3 cohort (particularly since COVID), so comparing these is not as useful as comparing the 2023 Year 3s with the same students when they sit NAPLAN again in Year 5 in 2025.
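If it helps to picture the 'value added' idea, here is a toy example of comparing a matched cohort's growth against an expected gain. Every number here is invented purely for illustration (including the 'expected gain' figure); none of them are real NAPLAN scale values or official growth benchmarks.

```python
# Toy illustration of 'value added': compare the gain the same students made
# between Year 3 and Year 5 against an expected gain. All numbers are invented.

EXPECTED_GAIN = 80  # hypothetical expected scale-score growth over two years

# Matched scores for the same students: (Year 3 score, Year 5 score two years later)
matched_scores = [(410, 505), (395, 470), (430, 520), (450, 515)]

gains = [year5 - year3 for year3, year5 in matched_scores]
average_gain = sum(gains) / len(gains)

print(f"Average gain: {average_gain:.1f}")
print("Above expected growth" if average_gain >= EXPECTED_GAIN else "Below expected growth")
```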

At the end of the day, student performance will still be their performance. The new levels created by a panel of experts are an arbitrary benchmark to measure against: some may say it is set too high, others too low. It is open to interpretation. For me, the data that is most useful to parents and schools is where students fit in relation to the National average and the range of achievement of the middle 60% of students in their year level (the light shaded rectangle on the student report). This tells you where the student is in relation to other students across Australia. Whether the achievement level of Australian children is where it should be (i.e., compared to other countries) is a discussion for another day!

As a parent I have so far been through 6 NAPLAN episodes with my children (actually 5, as Miss 11 ‘missed out’ in Year 3 when the tests were cancelled due to COVID restrictions). If they continue to be tested in Years 3, 5, 7 and 9, I have another 10 to go!

I don't usually bother showing my children their NAPLAN results. In my experience, the bigger deal you make of NAPLAN, the higher stakes it becomes in their minds (cue maths/test anxiety discussions). For me, it is just another tool that schools and governments use to gather data to inform decisions around areas of need. I would love NAPLAN to be a pop quiz that just occurs on a random day (I realise logistically this is not possible!)... there is too much pressure and hype around it. At its core it is a tool to gather data on how the Australian education system is travelling.

For me, NAPLAN is important as it provides a window into the achievement of Australian students. Now that NAPLAN is online and adaptive (students take different paths through the assessment and are therefore presented with different items), it is less useful for schools. Previously we could look at common areas of need, for example: all of our students answered Item 34 incorrectly... what does this mean for our instruction of this content (for example, fractions)? But now, as students complete different items, common misconceptions at a school level are more difficult to determine.

NAPLAN can provide us with an important piece of data to determine where students are at in relation to State and National benchmarks. Without being able to make comparisons to previous data, the 2023 results are simply 'point in time' data. Considering that point in time was March and we are receiving the results in July, the data set is effectively obsolete in terms of its teaching value. It is therefore more useful for governments and sectors (and for parents to know where their child was in relation to children across Australia in their year level in March) than for schools and teachers.

This week I encourage you to have a chat with someone about the changes to NAPLAN. What do you think? What will they mean for your school? The more we engage in pedagogical discussions with our colleagues, the more informed and reflective we become in terms of our assessment literacy skills.

Have a great week!

Want to learn more from Dr Ange Rogers? Click here to find out about her 'Quality Place Value Assessment in Years 3-6 Mini Course'
