What is actually being tested by NAPLAN?

By Elizabeth Grant and Fiona Mueller - posted Monday, 22 March 2010

What exactly do the National Assessment Program - Literacy and Numeracy (NAPLAN) tests of language conventions achieve? In a recent interview, the Australian national curriculum project leader, Professor Barry McGaw, reflected that “We do have in Australia now a common assessment of literacy and numeracy, but that’s a common assessment designed in the absence of a common framework. With a national curriculum, we have the prospect of a common framework to shape the tests.”

An analysis of the 2008 and 2009 NAPLAN papers reveals contradictory messages, which are indeed evidence of the absence of a common framework.

In particular, the tests of language conventions lack coherence and order. Although all of the spelling questions, which constitute approximately 50 per cent of each test, are placed together, albeit not in a separate section, the items that test grammar and punctuation do not appear to follow any logical pattern. How are teachers expected to teach and their students expected to learn from test papers whose nature and purpose are not clear?


Are all schools encouraged to undertake a forensic analysis of the tests and to discuss strategies for teaching the syntactical points? What support material will be provided that clearly identifies the areas to be covered in future tests and the ways in which these may relate to the national curriculum?

Notably, the tests’ own use of language is problematic. In the 2008 practice tests, specific terminology was used in the questions relating to spelling, punctuation and grammar (e.g. In the first sentence, is the word “a” “a noun”, “a definite article” or “an indefinite article”?).

However, in the 2008 and 2009 papers, only the questions relating to spelling and punctuation used the relevant terminology (e.g. Which of the following has the correct punctuation? OR Where should the commas be inserted?). Where questions related to grammar, no explicit terminology was used. Instead, the rubrics simply asked “Which word(s) correctly complete the sentence?”

What was the rationale for this shift?

The questions in the 2009 tests are clearer than those in the 2008 version, at least with regard to which points of grammar or punctuation are being tested. However, there are items in both the 2008 and 2009 papers that are extremely unclear in that they appear to be testing multiple points simultaneously or are written in a way that relies on native speaker intuition rather than a sound grasp of how the English language works. In addition, the range of questions is extremely limited.

Our work with employers, schools and tertiary institutions has enabled us to compile a list of the errors that students are most likely to make, and which frequently require remedial study. At the top of this list are the run-on sentence and the sentence fragment. In the 2008 and 2009 NAPLAN papers, the former is difficult to find, and the latter is not tested at all.


The practice questions for 2008 gave an indication of the format of the tests, but provided little guidance on the specific grammatical concepts to be examined. In one practice question, which was the same for Years 3, 5, 7 and 9, the answer depended on native speaker intuition. The question was “Do you have ______ pet?” Possible answers were “a”, “if”, “he” and “she”. What is actually being tested in such a question - the indefinite article, a subordinating conjunction, or a personal pronoun? The indefinite article appears to be the target, yet only one of the options is grammatically plausible, so the item can be answered by simple elimination. It does not require students to distinguish between correct and incorrect use of the indefinite article.

Similarly, in the Year 5, 2009 paper, item 31 asked students to complete the sentence “Jo likes to listen to music ______ she is cleaning her room”. Possible answers were “even”, “after”, “while” and “during”. “Even” could take the role of an adjective, verb or adverb. The option “after” acts either as a preposition or an adverb, “while” is an adverb, and “during” is a preposition. How does this test a student’s understanding of grammar?

In the Year 3, 2009 paper, item 43 asked the students to complete the sentence “The boy put on his shoes ______ he tied his laces”. Possible answers were “next” (an adverb), “and so” (two coordinating conjunctions), “because” (a subordinating conjunction) and “and then” (a conjunction followed by an adverb). In such questions, the advantage is likely to lie with the student who has had the greatest exposure to such everyday phrases.

About the Authors

Elizabeth Grant BA, Grad Dip (TESOL), MA (TESOL) worked with the Department of Foreign Affairs and Trade for over 20 years before moving to Seoul and then Shanghai to teach English as a Second Language. Since 2002, she has been based in Canberra, co-ordinating and teaching English language and communication skills programs for university students. In 2005, she participated in a major research project to investigate undergraduates’ perceptions of the extent to which their experience of English in K-12 prepared them for their tertiary courses. Liz’s professional experience in Europe, Asia and Australia has made her very aware of the value of language awareness training for both native and non-native speakers of English.

Dr Fiona Mueller is a teacher of English and foreign languages and a former Head of ANU College at the Australian National University. In 2016-2017, she was Director of Curriculum at the Australian Curriculum, Assessment and Reporting Authority (ACARA). She is particularly interested in the history of education, international education, single-sex schooling and K-12 curriculum design.


This work is licensed under a Creative Commons License.
