The insanity of NAPLAN


Eleven years ago the Australian government set itself two main goals for the coming decade.  The first was that schooling across Australia would be both excellent and equitable, so that a regional school out the back of Bourke offered the same quality of education as one in ritzy Rose Bay.  The second, more ambitious goal was that young Australians would become successful learners, confident and creative individuals and, my personal favourite, active and informed citizens.  It was also around this time that politicians decided standardised testing would show clearly which schools were producing excellent learners and which were not. NAPLAN was born in this maelstrom.

NAPLAN is Australia’s version of a standardised national testing scheme.  The rationale behind standardised testing is that the data it produces will show educators how effective the policies and practices implemented within schools actually are.  The approach is used internationally to track education trends and to promote conversation about best practice (Jackson et al., 2017); PISA, for example, is used around the world to identify shifts in education. Because assessments are evidence of learning, a standardised test offers a picture of what is taught and learned in classrooms across the nation (Jackson et al., 2017). The results are meant to be used as a tool to direct teaching policy.  What is not supposed to happen is for the results to be used to marginalise and discriminate against poorly performing schools and their struggling teachers. Unfortunately, that is precisely where standardised testing has blown up in everyone’s face.

 

NAPLAN, also affectionately known as the Devil’s tool by some disgruntled teachers of my acquaintance, was the brainchild of the Howard government and was enacted under Rudd.  Arriving close on the heels of the new national curriculum, it was intended to determine which schools were successfully addressing literacy and numeracy outcomes and which schools needed more assistance.  After all, while ideally we would like every student to come to school from a home where books and breakfast are the norm, the reality is decidedly not Utopian. If education is to be grounded in equity, then some sort of measurement is needed to determine which schools fall short of the prescribed ‘line in the sand’ so that extra funding and assistance can be directed to the schools that require it.  This funding model is unlike that of the USA, where school funding is tied to local property taxes and more affluent areas therefore receive MORE money than lower socio-economic ones (Biddle & Berliner, 2002). For a country that insists it does not have a class system, it is doing rather well at perpetuating one.

ACARA flexed its new muscle in 2008 and assured the eager masses that NAPLAN would place all Australian students on a single scale of measurement, mapping their skills and understandings across their schooling years (Fachinetti, 2015).  Testing was already occurring within the states, so a national testing system seemed appropriate. With eight state and territory education systems and a proportionally small population, it seemed logical to have one system to determine which states and electorates were performing well against the new national curriculum and which states, and more specifically which schools, needed additional funding.   Instead, the advent of NAPLAN only served to heighten competition between students and between schools. Schools with low NAPLAN results were often demonised by the media, which frequently led to them losing student numbers and resources and becoming institutions of failure (Zyngier, 2011).

The arrival of the MySchool site only exacerbated an already tense situation.  Instead of delivering the promised transparency for parents, it proved controversial and downright destructive to many schools already struggling with teaching and learning practices (Fachinetti, 2015).  The league tables oversimplified learning outcomes, reducing them to red or black. They did not identify the schools where great improvement had occurred; they only highlighted who won. Quite frankly, the whole idea runs counter to the tenets of the 2008 Melbourne Declaration, whose first goal is to provide excellence and equity in schooling.

Educationally, NAPLAN is supposed to be low stakes, in that test scores are meant to be used to identify and improve teaching and learning practices rather than as a means of reward and punishment.  Fachinetti (2015) describes how the nationwide testing program has evolved into a high-stakes test in which students are pressured to perform by often well-meaning parents and teachers. Teachers are railroaded into teaching explicitly for the test, rather than for holistic learning, in order to maintain or improve school scores.  Parents, intimidated by MySchool results and/or societal pressures, send their precious moppets off for NAPLAN tutoring.  The surfeit of preparation booklets in the supermarkets is just a snapshot of how seriously general society now takes this test.  Sadly, numerous high schools across the country even request NAPLAN scores as part of their application process.

NAPLAN has completely failed to achieve its target.  Ideally, the data could be used to improve teaching practice, but because the results were not released until some four months after the tests, it was often too late to implement changes.  Granted, the new online system will see results appear more quickly, but online testing comes with its own baggage. The single most infuriating aspect of NAPLAN, though, is that it is not connected to the curriculum.  It truly boggles the mind that a NATIONAL standardised testing scheme does not actually check whether the NATIONAL curriculum is being implemented properly across the stages.  So why on Earth do we force our kids and our schools to sit this test?  Ah yes. For funding. ACARA has a lot to answer for.


References

ACARA. (2008). National Assessment Program – Literacy and Numeracy. Retrieved from https://nap.edu.au/_resources/2ndStageNationalReport_18Dec_v2.pdf

Biddle, B., & Berliner, D. (2002). A research synthesis: Unequal school funding in the United States. Educational Leadership, 59(8), 48–59. Retrieved from http://www.ascd.org/publications/educational-leadership/may02/vol59/num08/Unequal-School-Funding-in-the-United-States.aspx

Fachinetti, A. (2015). A short personal and political history of NAPLAN. Education Today, 4, 20–22. Retrieved from http://www.educationtoday.com.au/_images/articles/pdf/article-pdf-1126.pdf

Jackson, J., Adams, R., & Turner, R. (2017). Evidence-based education needs standardised assessment. The Conversation. Retrieved from https://theconversation.com/evidence-based-education-needs-standardised-assessment-87937

MCEETYA. (2009). MCEETYA four-year plan 2009–2012. Retrieved from http://scseec.edu.au/site/DefaultSite/filesystem/documents/Reports%20and%20publications/Publications/National%20goals%20for%20schooling/MCEETYA_Four_Year_Plan_(2009-2012).pdf

Munro, J. (2017). Support for standardised tests boils down to beliefs about who benefits from it. The Conversation. Retrieved from https://theconversation.com/support-for-standardised-tests-boils-down-to-beliefs-about-who-benefits-from-it-86541

Zyngier, D. (2011). Unfair funding is turning public schools into ‘sinks of disadvantage’. The Conversation. Retrieved from https://theconversation.com/unfair-funding-is-turning-public-schools-into-sinks-of-disadvantage-751