ON LINE OPINION - Australia's e-journal of social and political debate


Educational sexism in Queensland

By John Ridd - posted Friday, 26 April 2013


The very low standard of education in Queensland is well known: feeble performances in NAPLAN, the statement by the Australian Council for Educational Research (ACER) that standards have declined by 'two years of learning', and woeful performances in the Trends in International Mathematics and Science Study (TIMSS) should convince most people. For the few who do not accept that fact, I suggest they download and read the ACER research document entitled 'A shared challenge' (ACER 2009). For an easier read, I suggest my Through measurement to knowledge.

A further issue is the commonly heard accusation that the assessment system favours girls over boys. If that accusation were proven, immediate and drastic remedial action would have to be taken by Parliament to remedy what would be illegal, institutionalised sex discrimination.

The following analysis, comparing Year 12 Overall Position (OP) results with Queensland Core Skills Test (QCST) data, demonstrates an anti-male bias in an objective way.


Throughout Years 11 and 12 the students perform various 'tasks' set within each school. Those tasks may be formal exams (very rare or non-existent in some subjects), or assignments or, in the case of the sciences, tasks called 'Extended Experimental Investigations' and 'Extended Response Tasks'. Each task is awarded a letter or a number of letters: A, B, C, D or E; numbers are not used. At the end of Year 12 each school, for each subject, examines the results for each student and arrives at a set of final results, again given as a letter.

However, a result from a 'hard' subject taken by a strong group of students, Maths B for example, cannot be compared directly with a result from a subject taken by predominantly weak students, Maths A for example. Being in the middle of a strong group is probably as good as, or better than, being near the top of a weak group. Bear in mind that OP stands for Overall Position: it is a dog-eat-dog affair, so the strength of the opposing dogs really matters. The problem of different group strengths is solved by (a) the setting of the QCST, the Core Skills Test, which must be taken by all students who wish to be awarded an OP; and (b) a set of statistical analyses.

No analysis is possible using letters; numbers must be used. When all of the assessment is completed (using only letters), the school awards each student, for each subject, a single numerical 'result'. That number can be seen as a performance indicator and is the number used in the OP calculation. The QCST results are also numerical. It is now possible to standardise each result for each student and hence produce a rank order of all the students across the State. The highest performers get an OP 1, then OP 2 and so on down to OP 25. OP results are crucial because they determine what courses each student can take at each university. If a university course can fill all of its available places with students who have an OP of 5 or better, then a student with an OP of 6 cannot enter. Getting the OP right is crucial, so systematic bias would be appalling.

I approve of the scaling system using the QCST; the methodology is sound and robust. But, and it is a big but: what if the school results are wrong in some way? GIGO: Garbage In, Garbage Out.

The QCST and its application cannot alter the rank order of results in a subject. If Josephine beats Joseph by getting a higher performance indicator number from the school, then after scaling against the QCST she will still beat him, although the 'gap' may have grown or shrunk. She will get a better OP than Joseph irrespective of their individual performances on the QCST, because the test only provides data on Josephine and Joseph's group, not on individuals.
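The two properties just described, anchoring on the group's QCST performance and preservation of the within-group rank order, can be sketched in a few lines of Python. This is an illustrative model only, not the QSA's actual statistical procedure; scale_group is a hypothetical helper that linearly rescales a group's school results onto that group's QCST mean and spread.

```python
# Illustrative model only: a linear rescale of one subject group's school
# results onto the group's QCST mean and spread. Individual QCST scores are
# never consulted, so the within-group rank order cannot change.
from statistics import mean, stdev

def scale_group(school_scores, group_qcst_scores):
    """Rescale a subject group's performance indicators to the
    group's QCST mean and spread (hypothetical sketch)."""
    m_s, s_s = mean(school_scores), stdev(school_scores)
    m_q, s_q = mean(group_qcst_scores), stdev(group_qcst_scores)
    return [m_q + (x - m_s) / s_s * s_q for x in school_scores]

# Josephine's school result beats Joseph's, and still does after scaling:
josephine, joseph, third = scale_group([70, 60, 50], [180, 170, 160, 150])
assert josephine > joseph > third
```

Whatever statistical machinery the QSA actually uses, any group-anchored rescaling of this kind shares the key feature in the text: Josephine's lead over Joseph survives the scaling.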

The QCST is stated by the Queensland Studies Authority (QSA) to be 'An achievement test' that is 'Grounded in the Queensland Senior Curriculum'. I intend to compare QCST results with OP results for females and males. The QCST results, although numerical, are given to the students in the usual A, B, C, D and E structure.


First, for 2012, the QCST results, as numbers and percentages, were:

 

             Female              Male
QCST      Number      %      Number      %
A           2025  14.00        2328  20.61
B           4194  29.00        3558  31.50
C           5496  38.00        3688  32.65
D           2696  18.64        1692  14.97
E             53   0.37          30   0.27

Obvious points are:

  • The number of males is far fewer than the number of females; many males have dropped out of the tertiary-relevant subjects.
  • Males beat females 20.61% to 14.00% for As, and 52.11% to 43.00% for the combined A/B. Across all students, 47% were ranked A or B combined.
  • The male distribution is skewed upwards, the female distribution downwards.
  • Even allowing for the strong possibility that many of the 'drop-out' males would have scored poorly on the QCST, there is no doubt that, among the students who actually took the QCST and hence received an OP, the males outperformed the females by a very substantial margin.

The crucial Overall Position (OP) results, also for 2012, were:

              Female                        Male
OP    Number      %   Cum. %     Number      %   Cum. %
 1       324   2.20     2.20        378   3.29     3.29
 2       516   3.50     5.70        401   3.50     6.79
 3       631   4.28     9.98        435   3.79    10.58
 4       709   4.80    14.78        504   4.39    14.97
 5       773   5.24    20.02        493   4.30    19.27
 6       895   6.06    26.08        545   4.75    24.02
 7       835   5.66    31.74        630   5.49    29.51
 8       937   6.35    38.09        612   5.33    34.84
 9       939   6.36    44.45        669   5.83    40.67
10       927   6.28    50.73        674   5.87    46.54
11       969   6.57    57.30        666   5.80    52.34
12       917   6.21    63.51        725   6.32    58.66
13       924   6.26    69.77        710   6.19    64.85
14       849   5.75    75.52        706   6.15    71.00
15       731   4.95    80.47        673   5.87    76.87
16       668   4.53    85.00        605   5.27    82.14
17       606   4.11    89.11        556   4.85    86.99
18       481   3.26    92.37        457   3.98    90.97
19       416   2.82    95.19        392   3.42    94.39
20       304   2.06    97.25        278   2.42    96.81
21       224   1.52    98.77        191   1.66    98.47
22       107   0.72    99.49         93   0.81    99.28
23        59   0.40    99.89         59   0.51    99.79
24        17   0.12   100.00         18   0.16   100.00
25         2   0.01   100.00          3   0.03   100.00

An inspection of the two data sets is illuminating. 20.61% of the males got an A on the QCST; looking down the OP results for males, we have to go down to between OP 5 and OP 6 to reach 20.61%. Only 14.0% of the females got an A on the QCST, and looking down the OP results for females we reach 14% between OP 3 and OP 4. This indicates that females are getting a final result about two OP rungs better than that of males of similar ability as measured by the QCST.

Taking the cohorts awarded QCST results of A and B combined (roughly the upper half), the males' 52.11% occurs between OP 10 and OP 11, whereas the females' 43% occurs between OP 8 and OP 9. Again the males are disadvantaged by about two OP rungs compared with females of similar ability as indicated by the QCST.
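Readers who wish to check this band-matching can reproduce it mechanically from the cumulative percentages in the 2012 OP table above. A minimal Python sketch (op_band is my own illustrative helper, not anything used by the QSA):

```python
# Cumulative percentages for OP 1-25, taken from the 2012 OP table above.
male_cum = [3.29, 6.79, 10.58, 14.97, 19.27, 24.02, 29.51, 34.84, 40.67,
            46.54, 52.34, 58.66, 64.85, 71.00, 76.87, 82.14, 86.99, 90.97,
            94.39, 96.81, 98.47, 99.28, 99.79, 100.00, 100.00]
female_cum = [2.20, 5.70, 9.98, 14.78, 20.02, 26.08, 31.74, 38.09, 44.45,
              50.73, 57.30, 63.51, 69.77, 75.52, 80.47, 85.00, 89.11, 92.37,
              95.19, 97.25, 98.77, 99.49, 99.89, 100.00, 100.00]

def op_band(cumulative, pct):
    """Return the first OP band (1-25) whose cumulative percentage
    reaches `pct` per cent of the cohort."""
    return next(op for op, c in enumerate(cumulative, start=1) if c >= pct)

print(op_band(male_cum, 20.61))    # A-grade males reach OP 6
print(op_band(female_cum, 14.00))  # A-grade females reach OP 4
print(op_band(male_cum, 52.11))    # A+B males reach OP 11
print(op_band(female_cum, 43.00))  # A+B females reach OP 9
```

The two-rung gap in the text is simply the difference between the matching male and female bands.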

Both of these results suggest that the system discriminates against males by about two OP rungs. That is a huge difference in outcomes; vast numbers of males will now miss out on entry to some courses. Note that the OP calibration system using the QCST is both sound and reliable, so the problem must lie with the assessment structures within the schools. Unless someone can convince me to the contrary, I conclude that we presently have statewide, systematic sex discrimination on a huge scale. Of interest, therefore, is whether or not that discrimination has always existed; such information might lead to finding the cause(s) of the problem.

The data used above came directly from the QSA website. The oldest data of the same variety is for 1992. Applying the same technique to that year's data, the outcomes were:

  • The excess of females over males was there again, but much reduced (F: 14,821; M: 13,107).
  • 17.50% of males and 14.48% of females achieved an A on the QCST; 47.48% of males and 42.68% of females achieved an A or B. The differences are still noteworthy, but not as stark as was the case for 2012.
  • An inspection of the OP results shows that the males' 17.5% fell between bands 6 and 7, while the females' 14.48% fell between bands 5 and 6.
  • For the combined (A+B) QCST results, the males' 47.48% fell between bands 12 and 13, while the females' 42.68% fell between bands 11 and 12.
  • Apparent discrimination against males already existed, but at only about one OP rung of difference.

When working for my PhD, the topic being 'Participation in rigorous Maths and Physics…', I dealt inter alia with claims extant at the time that 'females were catching up with the males' in maths and the physical sciences. Those claims did not look at the female and male cohorts' QCST results.

Using the very detailed numerical QCST data which I obtained from the old Board of Senior Secondary Studies, I was able to 'predict' how many students would be awarded either a Very High Achievement or a High Achievement. That assumed that QCST results were, for large groups taken as a whole, a usable predictor of results in Physics or whatever. A brief summary of the outcomes of those calculations, compared to the actual results, all for 1992 (the year I used earlier), is:

 

 

                Female                 Male
            Predicted  Actual    Predicted  Actual
Physics           830     827         1712    1711
Maths C           548     588         1311    1276
Maths B          1909    1896         2206    2214
Chemistry        1282    1268         1718    1731

For the purposes of this article, the big points from that set of analyses are (a) that at that time (1992) QCST results were a very good predictor of actual subject results, and (b) that in 1992 there was no sign of discrimination for or against males in the maths and science subjects.
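The closeness of those predictions can be quantified from the table above by computing each prediction's percentage error (a sketch using only the figures already given):

```python
# Predicted and actual VHA/HA counts for 1992, from the table above,
# as (predicted, actual) pairs for females and males.
results = {
    "Physics":   ((830, 827),   (1712, 1711)),
    "Maths C":   ((548, 588),   (1311, 1276)),
    "Maths B":   ((1909, 1896), (2206, 2214)),
    "Chemistry": ((1282, 1268), (1718, 1731)),
}
for subject, ((fp, fa), (mp, ma)) in results.items():
    # Percentage error of the QCST-based prediction against the actual count.
    print(f"{subject}: female {100 * (fp - fa) / fa:+.1f}%, "
          f"male {100 * (mp - ma) / ma:+.1f}%")
```

Most errors come out around 1% or less; the largest is the female Maths C figure at about -7%.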

So, for the three analyses (OP 2012, OP 1992 and the maths/science subjects for 1992) we have:

  • 2012 OP results: severe anti-male discrimination of about two full OP bands.
  • 1992 OP results: anti-male discrimination of about one OP band. That discrimination must have arisen from the inputs of subjects outside maths/science.
  • 1992 maths/science subject results: no discrimination at all.

In terms of the cause of the evident sex discrimination, the 1992 results are interesting because we know that the maths/science results were unbiased while the overall OP was biased. Assessment in maths/science in those days was essentially all formal exams and tests, but the assignment fad had started to penetrate other subjects.

However, by 2012 things had changed: assignments, as opposed to formal examinations, had spread to all subjects, and the sexism was clear and major in size.

In an earlier OLO article on this topic (please look at the last page or so of that article) I referred to an old Parliamentary Inquiry, Boys: Getting it Right, which pinned the blame for male educational weakness mainly on over-verbosity. It is interesting to note that one of the recommendations from that Inquiry was: 'Assessment procedures for maths and sciences must, as a first requirement, provide information about students' knowledge, skills and achievement in the subject, and not be a de facto examination of students' English comprehension and expression.'

It is a measure of the overweening arrogance of education theorists and organisations such as the QSA that they not only ignore comments from the highest democratically elected body but continue on unmoved.

All the evidence points the same way: all assignments, under any name, must be eliminated throughout schooling in maths and science. They lower standards, encourage bad science (verbosity is very bad science), reduce the number of experiments that are done, and form part of the institutionalised sexism we see today.




About the Author

John Ridd taught and lectured in maths and physics in UK, Nigeria and Queensland. He co-authored a series of maths textbooks and after retirement worked for and was awarded a PhD, the topic being 'participation in rigorous maths and science.'


This work is licensed under a Creative Commons License.
