
Putting NAPLAN literacy testing to the test

By Elizabeth Grant and Fiona Mueller - posted Monday, 2 August 2010


The 2010 NAPLAN tests of language conventions reveal inherent design flaws. As was the case in the 2008 and 2009 papers (see our On Line Opinion article of March 2010), many items fail to show the relationship of grammar to the conventions of punctuation. This fundamental weakness makes the tests considerably less effective and credible as measuring instruments and as teaching tools.

The major weakness of the tests is that they lack order and coherence. As one Head of English has noted, “They are a random collection of questions on grammar, punctuation, meaning, idiom and instinctive understanding.”

The Year 3 test contained 25 spelling words. It addressed the language conventions in this order: subordinating conjunction; tense; reflexive pronoun; pronoun; capitalisation; punctuation; sentence fragment; comparative v superlative; countable v uncountable nouns; capitalisation; commas; compound sentence; word order; demonstrative pronoun; commas; capitalisation; apostrophe; meaning; modal with a past participle; adjective; apostrophe.


The Year 5 test contained 25 spelling words. It tested the following grammar and punctuation items in this order: reflexive pronoun, meaning, comma, direct speech, spoken v written usage, comparative v superlative, capitalisation, commas, subject-verb agreement and spelling, use of brackets, defining phrase (tense), punctuation, brackets, apostrophe, modal verb, subjunctive mood, compound sentence (with coordinating conjunction), apostrophe, comparative and superlative, subject-verb agreement (with neither-nor), knowledge of conjunctions, main clause (plus independent clause creating complex sentence).

The range of questions is limited. The tests do not address the many errors that characterise students’ written work throughout their schooling, and which are most detrimental to fluency, such as the run-on sentence and the sentence fragment. Some items appear to be testing multiple points simultaneously. Other questions are written in ways that rely on native speaker intuition, or common sense and logic, rather than a solid grasp of how the English language works. Many items test usage rather than conventions. The type of language used to frame the questions is inconsistent, sometimes referring to a part of speech by its appropriate name, and at other times asking simply for the correct “word/s”. The random nature of the tests is reflected in the inclusion of a question on the subjunctive in Year 5.

The use of grammatical terminology is also problematic. With regard to punctuation, the only specific terms are commas and apostrophes. The only parts of speech that are mentioned are adjectives and conjunctions. In every other item, students are simply asked to choose “which word” completes the sentence. If students are expected to learn and to use the metalanguage in other subjects such as mathematics, music and geography, why is this not the case in English?

The instructions are frequently confusing. In item 37 of the Year 5 paper, students are asked “Which words and punctuation correctly complete this sentence?” The sentence reads: “I collect model cars and ________________ do you collect, Jasper?” Felicity asked.

Possible answers are:

a) shells what
b) shells. What
c) shells, what
d) shells? What


The answer is b). However, this creates two sentences, not one, making the test question inaccurate.

A further inconsistency can be found in item 42 in the Year 5 test. The question requires students to complete the statement “I have lost my bag but my keys are in my pocket _______________.” Possible answers are:

a) luckily I can still drive home
b) it is lucky I can still drive home
c) so luckily I can still drive home
d) because it is lucky I can still drive home

The answer is c). However, this response contains an error as the language convention requires a comma before the coordinating conjunction “so”. The tests are inconsistent in their application of this rule. For example, in item 28 of the Year 3 test, it is clear that the test writers understand that a compound sentence using the conjunction “so” is preceded by a comma (It was a secret, so he promised to keep it to himself.). Why is the rule not applied in the Year 5 test item? How should teachers guide their students in this regard?

Item 31 of the Year 3 paper asks students to identify the correct punctuation for the sentence “________________ they caught something big.” Possible answers are:

a) one day
b) one day,
c) One day
d) One Day,

All of these answers are wrong. The correct answer should be “One day,” because “one day” is an introducer, which must be followed by a comma.

Similarly, the rule is not applied properly in item 27, in which students are asked “Which words correctly complete this sentence?”: Tomorrow we ___________________ to the park. “Tomorrow” is an introducer, so it should be followed by a comma. Should teachers ignore this convention?

Item 47 of the Year 5 test asks “Which group of words can all be conjunctions?” The choices are:

a) his, hers, its, theirs
b) after, before, while, then
c) plays, speaks, drinks, eats
d) may, must, should, might

The obvious answer is b) because in a) the words are all pronouns, in c) they are verbs, and in d) they are modal verbs. However, the word “then”, which is included in b), is an adverb, not a conjunction. If two main clauses are joined by “then”, the word “then” must be preceded by a conjunction such as “and” or “but” or “because”. Otherwise, a run-on sentence is created, which is grammatically quite incorrect, but a common structure produced by students. Importantly, the other words in b) can perform the function of an adverb, a preposition or a conjunction. The word “then” can only act as an adverb. Have the item writers perhaps confused “then” with “than”? The latter is also a preposition and a conjunction, and is also frequently misused.

According to a recent article in Education Review (May 2010), every test item “is reviewed by every state and territory as well as experts in indigenous education, experts in education for students from language backgrounds other than English, students who have a visual impairment and other experts in teaching and learning for students with disabilities.” The statement says that “The purpose of this review is to make sure that the final set of test items that are printed and delivered to schools are accessible to the widest possible proportion of students.” Numerous items in the tests do not make their pedagogical purpose clear, or appear to be testing points of grammar that are likely to be most accessible to students from native English speaking backgrounds and/or families that are able to encourage more sophisticated language usage.

In the Year 5 test, item 41 asks students “Which word or words correctly complete the sentence?” The sentence is: “It is requested that all phones ___________ turned off during the show.” The choices are:

a) be
b) being
c) are being
d) have been

The answer is a). Is this question testing the use of the subjunctive? How many students in Year 5 would be aware of the correct application of this mood? How would students be taught this? It is interesting to note that the word “are” is not one of the options, even though the indicative is common usage, particularly in spoken English.

The major drawback of these tests, however, is the random selection of aspects of grammar and punctuation. Whereas all of the spelling questions are placed together, there appears to be no logical order to the other questions. For both Year 3 and Year 5 students, for example, a question testing the comparative versus the superlative is followed by a question asking students whether or not to capitalise a proper noun. The lack of order means that teachers and their students are unlikely to be able to use the tests as teaching tools, because there is no systematic assessment of particular aspects of grammar and punctuation. How is this, then, a collection of “rich diagnostic data”?

Some test items use misleading terminology. In the Year 3 test of language conventions, for example, item 34 asks “Which sentence is correct?” Possible choices are:

a) The children on the oval playing football.
b) The children playing football on the oval.
c) The children were playing football on the oval.
d) The children who were playing football on the oval.

The answer is c). This question must be testing students’ knowledge of what constitutes a complete sentence because three of the four responses are sentence fragments. Yet, the question itself asks “Which sentence is correct?” There is only one sentence here.

The new Australian curriculum places a strong emphasis on the development of skills in editing and proofreading. The structure of the NAPLAN tests of language conventions gives students little opportunity to demonstrate such skills. For example, in the items that require students to insert commas to corral words, phrases or clauses, a number of possible locations are given. Why? In editing their own work, students should have the capacity to identify not only which punctuation is required, but where it should be placed in a sentence.

According to one lecturer in language education, “In order for students to be able to discuss specific grammatical concepts and language use within a text, it is advantageous to provide them with a standard grammar vocabulary. The language for talking about, and describing, language is referred to as ‘metalanguage’. Developing students’ metalanguage will better equip them to engage in text and grammar analysis and dialogue, leading to the improvement of the structural aspects of their written texts. In order to develop students’ metalanguage, the teacher needs to use the terminology consistently and regularly, whenever text grammar discussions take place” (‘Grammar knowledge and students’ writing’, Curriculum Leadership Journal, Vol 5, Issue 24, July 2007).

The Education Review article by ACARA also claims that “Teachers use the results of common testing to identify weaknesses and strengths in individual students and within groups. This feedback enables teachers to focus their teaching and learning programs in the future.” If the ultimate goal is to support the development of students’ skills in the application of language conventions, there is no room for error in the collection of what ACARA calls “rich diagnostic data”. The design and content of the tests must be clear, accurate and useful. This is not the case at present.

In establishing a stand-alone test of language conventions, the education authorities are identifying spelling, grammar and punctuation as essential aspects of language acquisition. A valid test instrument for students in this area must be absolutely clear in its structure and purpose. It must set high standards, complement the expectations of the reading and writing tests, and provide the best possible guidance for teachers.

An analysis of the 2010 NAPLAN tests of language conventions for Years 7 and 9 will be provided in a subsequent article.

About the Authors

Elizabeth Grant BA, Grad Dip (TESOL), MA (TESOL) worked with the Department of Foreign Affairs and Trade for over 20 years before moving to Seoul and then Shanghai to teach English as a Second Language. Since 2002, she has been based in Canberra, co-ordinating and teaching English language and communication skills programs for university students. In 2005, she participated in a major research project to investigate undergraduates’ perceptions of the extent to which their experience of English in K-12 prepared them for their tertiary courses. Liz’s professional experience in Europe, Asia and Australia has made her very aware of the value of language awareness training for both native and non-native speakers of English.

Dr Fiona Mueller is a teacher of English and foreign languages and a former Head of ANU College at the Australian National University. In 2016-2017, she was Director of Curriculum at the Australian Curriculum, Assessment and Reporting Authority (ACARA). She is particularly interested in the history of education, international education, single-sex schooling and K-12 curriculum design.


This work is licensed under a Creative Commons License.
