Take a good dose of robust ideology — a laudable conviction that every child should benefit equally from education. Add some imported rhetoric reflecting notoriously failed school accountability reforms from the UK and US, and a dash of imported expertise (New York City schools chancellor Joel Klein). Combine this with a good dollop of political nous and the usual paucity of relevant background and outcomes research for the education ministry portfolio.
Then beat it all up within a large budget, spicing with a generous quantity of obtuse statistics. Half-bake the final product and glaze it with a colourful presentation — and you have the recipe for Julia Gillard’s MySchool website.
Six months on, in the ongoing furore among education professionals over Gillard’s responsibility for the publication of NAPLAN test results on the MySchool website, the basic argument seems to have been lost in translation.
The website displays results of the National Assessment Program in Literacy and Numeracy (NAPLAN) averaged by school, and was launched on January 28 with great fanfare about transparency in school performance indicators. A filter of media confusion and indifference to the rationale behind the publication of these results has helped ensure that MySchool may now appear a resounding success.
Recently, software alterations made it a little harder for simplistic school league tables to be generated directly from the website for tabloid publication — a concession that perhaps, at last, showed some acknowledgement from the federal education ministry that there could be a problem. The problem, however, is that MySchool, now “tweaked” and promoted as supposedly comparing apples with apples and oranges with oranges, remains a lemon.
The front page of the July 3 SMH carried Julia Gillard’s appeal to voters to “judge me on how I do the job”, listing the MySchool website among her achievements as education minister. The article was placed alongside a report highlighting the fact that students from Asian backgrounds are dramatically outperforming students from English-only speaking households in selective high school entry tests, a fact that would be no surprise to most teachers.
Although the irony may be easily missed, the assumptions underlying MySchool comparisons sit at odds with many accumulated years of evidence that socio-cultural factors account for far more of the variance in averaged student test results than the quality of teaching purportedly highlighted through Gillard’s MySchool website.
A hallmark of federal Labor’s performance has been an unprecedented attempt to provide comparative measurements of previously unquantified information. In deference to our public right to informed choice, we were to be provided with comparative grocery prices and comparative petrol prices. It might then have seemed a small leap to provide parents with comparative information on “school performance”, presumably enabling them to make an informed choice about their children’s schools.
Undoubtedly, many parents would have welcomed the display of statistics that superficially seem to mean something valid and informative. To facilitate lay interpretation of NAPLAN results, all schools supposedly performing above average are distinguished by green-highlighted NAPLAN results (green for “GO”?). An Index of Community Socio-Educational Advantage (ICSEA) is claimed to identify “statistically similar schools” using data from the 2006 census, but can at best use only a small sample of potentially relevant variables (such as the proportion of Indigenous students, but not the proportion of Asian students). Moreover, the invalid assumptions underlying the ICSEA ranking severely skew NAPLAN comparisons in favour of already advantaged schools.
For example, during the Howard years, higher-performing students from higher-income families increasingly “followed the money” in the accelerating drift to private schools, so that now higher-income families choose private schools at double the rate of lower-income families. However, ICSEA ratings assume homogeneity of socio-economic status for all students within each (2006) census area and therefore assume a statistically “random” choice of school for any particular family.
By definition, only around half the schools on MySchool can sit above the average; and (surprise, surprise) overwhelmingly these are the socio-economically advantaged schools and those better able to select their student intake (such as “coached” and high-achieving Asian students). These advantages encompass many diverse variables, such as maternal nutrition, access to quality pre-school education and early supervised internet access.
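A back-of-the-envelope simulation makes the point. In the sketch below (all weights and numbers are invented for illustration, not drawn from NAPLAN data), school-average scores are assumed to be driven mostly by the socio-economic make-up of the intake and only weakly by teaching quality; the “green” above-average half then tracks intake advantage, not teaching:

```python
# Minimal sketch, assuming (for illustration only) that intake advantage
# weighs four times as heavily as teaching quality in school-average scores.
import random

random.seed(1)
schools = []
for _ in range(1000):
    ses = random.gauss(0, 1)       # socio-economic advantage of the intake
    teaching = random.gauss(0, 1)  # teaching quality, drawn independently of intake
    score = 0.8 * ses + 0.2 * teaching + random.gauss(0, 0.3)
    schools.append((ses, teaching, score))

mean_score = sum(s[2] for s in schools) / len(schools)
green = [s for s in schools if s[2] > mean_score]  # the "green for GO" half

print(f"{len(green)} of {len(schools)} schools sit above the mean")
print(f"mean intake advantage of 'green' schools: {sum(s[0] for s in green)/len(green):+.2f}")
print(f"mean teaching quality of 'green' schools: {sum(s[1] for s in green)/len(green):+.2f}")
```

Under those assumed weights, the schools coloured green show a large average intake advantage but only a marginal teaching advantage: the colour coding rewards the intake, not the classroom.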
Six months on, as most educational researchers predicted, given the contextual vacuum and lack of statistical transparency attending their publication, MySchool NAPLAN results have become a catalyst for misinformation and prejudice against the 50% of students from schools with largely red-highlighted results.
On ABC’s Q&A on August 6 last year, Julia Gillard stated: “let’s just get each school’s results out there for all to see, so that we can see which schools might need more help”. So far we have not seen more help. What we have seen is a media beat-up about a few teachers supposedly tweaking class results, school principals reportedly fearing the sack, tests being opened early and answers supplied, and many children with learning difficulties not sitting the tests.
NAPLAN test administration directives are to be tightened, which will do nothing to ease the growing paranoia among teachers and principals whose students perform, for whatever reason, below the Australian school average. Moreover, while Gillard’s ideology may hold merit, the assumptions reflected in her statement invite more than a little critical thinking. They seem to be that a school’s overall performance can be measured by its averaged NAPLAN student test results, and that under-performing schools can therefore be identified through lower-than-average NAPLAN test results.
The problem here is so obvious that it is often overlooked. Gillard understands that it is not whole schools but individual students who “perform” academically in literacy and numeracy tests, and that student results reflect a host of complex variables, only one of which is teacher performance. Apparently it is less obvious that the above assumptions are a quantum leap into a quagmire of potential confusion and misinformation for parents.
Essentially meaningless “lies, damned lies and statistics” commonly influence public perception and decision-making. This happens because, in a strange quirk of our collective Western psyche, we pervasively tend to ascribe intrinsic validity to comparative information when it is mathematically expressed. The trouble with MySchool is that the more meaningful the construct of “school performance”, the more complex and multidimensional it becomes, and therefore the less amenable to simplistic quantification. The comparative “performance” of public hospitals could hardly be validly measured through league tables of births and deaths per population demographic, not even for “statistically similar” hospitals.
Given some reductionist terminology, however, the definition of “school performance” has shed its complexity and thereby become statistically measurable. Before the launch of MySchool, Julia Gillard was studiedly careful in attempting to reassure all stakeholders that the federal government’s publication of NAPLAN test results was not to be used by media outlets to feed prejudicially presented, simplistic comparisons between schools (aka league tables, which, of course, then ensued). So what, then, are the results to be used for? Considerable paranoia among teachers in disadvantaged schools has been understandable, when nothing has been made clear about what school-averaged test results should be interpreted to mean, what they can be used for, or the assumptions behind their measurement.
A problem with simplistic school performance comparisons is the virtual impossibility of truly comparing like with like. Contrary to claims on the MySchool website, it is virtually impossible to include enough relevant variables to ascertain which school groupings are truly “statistically similar”. In fact, even for something as ostensibly measurable as comparative grocery prices, so many variables had to be controlled for to actually compare “apples with apples” and “oranges with oranges” that the Rudd government’s GroceryChoice website never got off the ground.
Were we simply to average the price of all individual grocery items in an Aldi store, then compare this average with one similarly obtained from a Woolworths store in an adjacent suburb, the public would hardly have been duped. The variables omitted obviously include the quality and quantity of each type of grocery item. Such a comparison, then, would be ludicrously meaningless were it portrayed as any sort of measurement of the relative “performance efficiency” of the two supermarkets, and would tell the consumer absolutely nothing about which one might be selling the best-priced Granny Smiths.
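To labour the point with a toy example (all prices and product lists invented), the averaged figure reflects each store’s product mix, not value on any particular item:

```python
# Toy illustration with invented prices: the store with the cheaper Granny
# Smiths can still show the *higher* average item price, because the average
# compares product mixes, not per-item value.
aldi = {"granny smiths (kg)": 4.50, "home-brand bread": 1.80, "milk (2L)": 2.40}
woolworths = {"granny smiths (kg)": 3.90, "artisan sourdough": 7.50,
              "organic milk (2L)": 5.20, "imported cheese (250g)": 12.00}

def average_price(store):
    return sum(store.values()) / len(store)

print(f"Aldi average item price:       ${average_price(aldi):.2f}")        # $2.90
print(f"Woolworths average item price: ${average_price(woolworths):.2f}")  # $7.15
# Yet Woolworths sells the cheaper apples ($3.90 vs $4.50) -- the averaged
# figure says nothing about where the best-priced Granny Smiths are.
```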
Yet something rather alarmingly akin to this supermarket comparison is exactly what Gillard’s department facilitated for comparing school performance, because the number of variables involved in even one individual student’s test results is far greater than the variables overlooked in this parody of a supermarket comparison. MySchool, in providing chalk-and-cheese comparisons, has particularly highlighted socio-economic disparities between Australian schools.
So what might provide a valid assessment of “school performance”? The Rudd government’s response to this question betrayed some ideological biases and restricted assumptions about the nature and purpose of education itself. An unfortunate default position, in which schools are perceived as “fact-learning factories”, has had a long and unfruitful history. The pendulum of educational ideology has regularly swung back towards the extremes of economic rationalism and education devolution.
It seems that the further the perspective moves from classroom realities, the more the political lens reflects a myopically constrained view of children’s education as something adequately expressed through their literacy and numeracy test results. “School performance” might instead be measured by weighted multidimensional factors reflecting the health of a school’s organisational climate, such as older students’ ratings of teachers’ merits, teacher job satisfaction, and truly representative parent participation. Academic performance could be better estimated through assessing “value added” by the school, such as by measuring the degree of improvement in individual students’ results over time.
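The arithmetic of such a “value added” measure is simple; the hard part is everything around it. A minimal sketch, with invented names and scores, of comparing each student against his or her own earlier results:

```python
# Minimal sketch of the "value added" idea: compare each student's own
# results across two test cycles, rather than averaging raw scores by school.
# All names and scores below are hypothetical.
year3 = {"amy": 410, "ben": 350, "chloe": 520, "dev": 300}
year5 = {"amy": 495, "ben": 460, "chloe": 560, "dev": 420}

gains = {name: year5[name] - year3[name] for name in year3}
average_gain = sum(gains.values()) / len(gains)

print("individual gains:", gains)                  # e.g. ben improves 110 points
print(f"average gain for the cohort: {average_gain:.0f} points")
# A low-SES school whose raw averages sit "below average" on MySchool could
# still show strong gains; the improvement, not the raw level, better
# reflects what the school itself has added.
```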
A vast array of other factors might also be measured to provide a better understanding of “school performance”. These might include, for instance, the percentage of students learning to play a musical instrument from scratch, or continuing on to a trade or further study, or not attempting suicide before they graduate. Within such a rich conglomeration of achievements, statistics such as averaged individual student improvement over time in literacy and numeracy test results could be given meaningful context.
Given enough resources, time and expertise, we can validly measure almost anything, provided it is a clearly defined and quantifiable entity, irrespective of whether or not it is actually worth measuring. However, schools are neither factories nor supermarkets, and little Johnnie brings a vast array of complex variables to his test-taking performance on any particular day. Only one variable of the many is how well his successive (i.e. past and present) teachers have taught him literacy and numeracy skills.
Other variables include the familial, cultural, societal, psychological, physical, nutritional, sensory and emotional factors impacting upon his academic and test-taking ability, which prove much more difficult to measure and are constantly changing. The best-performing schools therefore create and maintain an organisational climate in which teachers consistently work towards the achievement of each individual child’s highest potential, with awareness of all these factors informing every aspect of their pedagogy.
“School performance” thus defined would require far more complex assessment. Moreover, the best-performing schools would inevitably include some that fall well below the average on the NAPLAN tests. Conversely, by this definition, some of the worst-performing schools, where teachers are “coasting” on the school’s reputation or where competitiveness has prompted teachers to start “teaching to the tests”, would likely fall within the top MySchool rankings.
Perhaps the Rudd government should have gone to the top of the class for the sheer quantity of measurements pursued in the name of accountability and the rights of consumers to make informed choices. Unfortunately, the failure to listen to and consult with informed educators, combined with the failure of the teaching profession’s remedial class action to be heard and understood by most parents, tabloid journalists and voters, has left a loathsome legacy from Gillard’s stint as federal education minister for all those concerned about “our education future”. I rest my case — MySchool remains a lemon.
Elizabeth Lyons is a psychologist who works with children, teenagers, teachers and parents in the NSW public school system.