Why ICSEA fails our schools

By Mike Williss - posted Thursday, 22 April 2010


ICSEA is the Index of Community Socio-Educational Advantage constructed by the Australian Curriculum, Assessment and Reporting Authority (ACARA). The latter was tasked by the Rudd Government with constructing the controversial My School website.

Let me state my bias about ICSEA. The only honest thing about it is the word “community”.

Gillard prates about ICSEA being the mechanism that allows the public comparison of “like schools” (since refined to “statistically similar schools”).

Each and every Australian school now has an ICSEA value. The mean value is 1000. Schools above this are declared to be more advantaged; those below, less advantaged.

But ICSEA is not an accurate assessment of school similarity.

School data is not used to construct ICSEA values.

The data comes exclusively from what the Australian Bureau of Statistics (ABS) calls Census Collection Districts (CCDs). Each CCD is the group of approximately 220 households assigned to a single ABS data collector at census time. The ABS averages the data of these 220 households to construct four indexes of socio-economic data. Altogether, 35 pieces of data (or variables) are used across these four indexes.

ACARA discarded 20 of those variables, arguing that whilst they correlated with economic disadvantage, they did not assist in determining educational advantage. The remaining 15 were judged to be significant in relation to educational advantage, although one was subsequently dropped for being “below statistical significance”.

A regression analysis was then used to devise a mathematical equation into which the 14 variables are fed. Only at that stage were two additional variables (the percentage of Aboriginal and Torres Strait Islander enrolments, and a measure of CCD “remoteness”) applied to the value. The first of these is really the only piece of school data that is built into the ICSEA values.
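
To make the shape of that calculation concrete, here is a minimal sketch, assuming a simple linear combination: regression weights applied to the retained CCD census averages, the two later adjustments, and a rescaling so the national mean sits at 1000. The variable names, weights and rescaling below are illustrative assumptions, not ACARA's actual equation.

```python
# Purely illustrative sketch of the construction described above. The variable
# names, weights and rescaling are hypothetical; ACARA's actual equation and
# coefficients are not reproduced here.

def icsea_raw(ccd_vars, pct_indigenous, remoteness, weights, adjustments):
    """ccd_vars: the ~14 retained census variables, each already an average
    over the roughly 220 households in the CCD.
    pct_indigenous, remoteness: the two adjustments applied at the later stage."""
    score = sum(weights[name] * value for name, value in ccd_vars.items())
    score += adjustments["indigenous"] * pct_indigenous
    score += adjustments["remoteness"] * remoteness
    return score

def rescale_to_mean_1000(raw_scores):
    """Shift every school's raw score so the national mean is 1000
    (the article gives only the mean, not the spread ACARA uses)."""
    mean = sum(raw_scores) / len(raw_scores)
    return [1000 + (s - mean) for s in raw_scores]
```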

Thus, ICSEA values, for all intents and purposes, are measures of quite small communities. That is why ACARA is at least honest in stating that it is an index of communities, not an index of schools.

I repeat, it is not ISSEA. It is not an Index of School Socio-Educational Advantage.

However, that is what Gillard tells parents that it is.

She has stated on more than one occasion, and the ACARA website is based on this myth, that the ICSEA values for the first time allow the comparison of statistically similar schools.

This requires some further explanation.

Most people are aware of some of the glaring anomalies in lists of statistically similar schools compiled by ACARA and published on My School. Adelaide’s Sunday Mail revealed the odd pairing of elite Prince Alfred College with rural East Murray Area School, with accompanying pictures of the former’s feudal bluestone castle and the latter’s shabby transportable. The Prince Alfred College principal was at a loss to explain the pairing, and at an even greater loss to explain why his college was not statistically similar to its traditional rival, neighbouring St Peters College.

What the paper did not pick up was the less glaring, but equally damaging, pairing of St Peters College with southern suburbs’ Blackwood High.

Blackwood High is nestled in the southern foothills, surrounded by a community that includes the reasonably affluent alongside the somewhat socially stretched.

According to its principal, 56 per cent of students eligible to enrol at Blackwood do not do so; they come from families wealthy enough to send them to private schools. Each CCD in the vicinity of Blackwood High is a microcosm of social diversity. The sole supporting mother lives a street away from the wealthy businessman and his professional partner. Their household incomes are averaged. The children of the former take into Blackwood High the same average of ABS data that the children of the latter take down the hill and into Scotch College. The household income of students at Blackwood thus tends to be overstated; conversely, the household income of students at the various private colleges who come from the Blackwood CCDs tends to be understated.
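
A tiny worked example, using invented household incomes, shows how the averaging works against a school like Blackwood: every household in the CCD feeds into the same figure, whether or not its children actually walk through the public school's gate.

```python
# Invented incomes for one hypothetical CCD: three modest households and three
# wealthy ones who send their children to private colleges.
ccd_household_incomes = [38_000, 42_000, 45_000, 150_000, 180_000, 210_000]

ccd_average = sum(ccd_household_incomes) / len(ccd_household_incomes)

# Only the three modest households actually enrol at the local public school.
public_school_incomes = ccd_household_incomes[:3]
actual_average = sum(public_school_incomes) / len(public_school_incomes)

print(f"CCD average fed into ICSEA:          ${ccd_average:,.0f}")      # $110,833
print(f"Actual average at the public school: ${actual_average:,.0f}")   # $41,667
# The same inflated CCD average also follows the wealthier children to their
# private colleges, understating the advantage of those schools.
```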

The schools are not, as Gillard has stated in her many media interviews, compared through ICSEA on the basis of real parental income. ACARA does not deny this; it acknowledges it in its own technical paper explaining ICSEA on the My School website. I repeat, it is acknowledged in the word “community” in ICSEA.

The damage that is done to the Blackwoods of the public school system is that they have higher ICSEA values than they should have and are therefore grouped and compared with schools that serve students with greater educational advantage. The acne of pinks and reds that festoons the NAPLAN results for Blackwood against the so-called similar schools on its My School page is an indictment of Julia Gillard and of the damage that she is doing.

Gillard has justified her misleading ICSEA by saying that when she came to office, she did not have access to data that identified the existence of socio-economically disadvantaged schools. Later she conceded that she did have it for Catholic and private schools. She could easily have obtained it for public schools from her State and Territory ministerial counterparts.

The South Australian education department established an Index of Educational Disadvantage (IoED). It was created in 2000 to replace a previous index, the Weighted School Card Index. The IoED places schools into one of seven categories, with category one being most disadvantaged and category seven being most advantaged.

The SA IoED uses only four components and they contribute approximately equally to the overall score.  Two are drawn from ABS CCDs (household income and parental education and occupation, again, averaged across the families in the CCD). Two are school-based data sets: Aboriginality and student mobility.
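
Under the assumption of roughly equal weights noted above, the IoED's structure can be sketched as follows; the scaling of each component and the category cut-offs are invented for illustration only.

```python
# Minimal sketch of the IoED structure described above: two CCD-based components,
# two school-based components, combined roughly equally and bucketed into seven
# categories (1 = most disadvantaged, 7 = most advantaged). The 0-to-1 scaling
# and the cut-offs are invented; the department's actual scoring is not shown here.

def ioed_category(income, parent_ed_occ, aboriginality, mobility):
    """Each component is assumed pre-scaled to 0..1, higher = more disadvantaged."""
    disadvantage = (income + parent_ed_occ + aboriginality + mobility) / 4
    band = min(int(disadvantage * 7), 6)   # 0 (least) .. 6 (most disadvantaged)
    return 7 - band

print(ioed_category(0.95, 0.90, 0.9, 0.9))  # -> 1 (most disadvantaged)
print(ioed_category(0.10, 0.15, 0.0, 0.1))  # -> 7 (most advantaged)
```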

Because the two indexes are constructed from different variables, there is some overlap and some mismatch between the IoED and ICSEA.

My point is that Gillard did not need ICSEA to get a list of public schools with socio-economic disadvantage from SA.

Gillard still justifies My School and ICSEA, however, for identifying “struggling schools we didn’t know about” (TV interview with Barrie Cassidy, Insiders Program, April 11, 2010). She refers to “110 schools benefiting out of our $2.5 billion of new money and new reforms to help schools that are struggling.”

Actually, these 110 schools share in $11 million, or around $100,000 each. The money is welcome. It is probably not enough for any school wanting to assist students one-on-one with tutoring and mentoring. And in any case, it still does not identify schools where the most socially and educationally disadvantaged students are enrolled.

The schools were selected because their NAPLAN results across all year levels were lower than both the national average for all schools and the average for their statistically similar schools.

Not one Category 1, 2 or 3 IoED school in SA, that is, a school in the three most disadvantaged categories, was identified for the additional funding. Four were Category 4 (mid-range), one was Category 5 and one was Category 7 (most advantaged). I have compared the NAPLAN results of these schools with a sample of Category 1, 2 and 3 schools, and the latter all had lower scores.

The Deputy Prime Minister has not had to explain to parents of students at SA's remote Aboriginal schools, or to parents of students in depressed parts of the northern and southern suburbs, the statistical magic that obscures their much greater educational disadvantage from those dispersing much-needed direct additional funding from the Commonwealth.

If a statistical formula could be devised that measured educational advantage of students at individual schools, it would perhaps start with the greatest measure of what constitutes an educational head start: the time spent by parents reading to and with young children.

Gillard knows this: “Mum had made sure that both Alison (sister) and I could read and write before we went to school. So we got a flying start” (interviewed on Australian Story, 6 March 2006).

The percentage of students, Birth to Year 3, who had parents or older siblings read to them for at least two hours per week would be one of my starting variables.

So would class size, although Gillard prevaricated about this when asked by one astute journalist (Madonna King, ABC Brisbane 12 August 2008) who gamely pursued the matter across several questions.

So would the percentage of teachers teaching outside of subject areas for which they are trained. (Nor is this taken into account in the new Standards for the Teaching Profession with its categories of graduate, proficient, highly accomplished and leader - although these will no doubt find their way onto My School.)

And to get a tad esoteric - how about the ratio of total teachers to full-time equivalents (FTEs) per school? The ratio at Blackwood is a low 1.06 to one, whereas at St Peters College it is 1.28 to one, which I suspect is a measure of the greater employment of specialised teaching staff at the latter.

Or the ratio of students to FTE non-teaching staff? At Blackwood it is a high 53.4 to one, whilst at St Peters it is a low 18 to one. Leaving aside the curation of the cricket pitch at St Peters, perhaps students at that school have an educational advantage in that their teachers are better supported in the classroom by teaching assistants, or carry less of a burden and distraction in the form of administrivia, leaving them free to concentrate on their core task of teaching.
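
For readers who want the arithmetic behind those two ratios spelled out, here is a small sketch; the headcounts are invented, chosen only to land close to the figures quoted above.

```python
# Illustrative staffing-ratio arithmetic with invented headcounts.

def teacher_headcount_to_fte(total_teachers, teaching_fte):
    """Above 1.0 suggests more part-time or specialised teaching staff."""
    return total_teachers / teaching_fte

def students_per_support_fte(students, non_teaching_fte):
    """Fewer students per non-teaching FTE means better-supported teachers."""
    return students / non_teaching_fte

print(round(teacher_headcount_to_fte(74, 70), 2))    # 1.06  (Blackwood-like)
print(round(teacher_headcount_to_fte(115, 90), 2))   # 1.28  (St Peters-like)
print(round(students_per_support_fte(1068, 20), 1))  # 53.4  (Blackwood-like)
print(round(students_per_support_fte(1080, 60), 1))  # 18.0  (St Peters-like)
```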

Should student mobility be a variable? It is for SA's IoED, but is not for ACARA, despite Gillard deeming it to be such a huge influence that it justifies the creation of a “discrete student identifier” (ID number) to track students across their various schools on My School.

How about really using parental income, instead of an average of 220 families’ incomes?

And let’s not forget the variable dropped by ACARA from ICSEA because it “did not correlate highly enough with student achievement”, namely, “the percentage of people who do not speak English well”! That obviously has no correlation with the four sets of NAPLAN results that are based exclusively on English literacy!

So, ICSEA does not establish a basis for comparing like schools.

It does not use school data.

It is a community index and it should not be used to compare, rank and judge schools.

The sooner Gillard takes ACARA back to the interactive whiteboard to start all over again, the better.


 

About the Author

Mike Williss is a teacher of Chinese in South Australia. After 32 years in the classroom, he now works for the Australian Education Union in South Australia.


This work is licensed under a Creative Commons License.
