Private Schools Melbourne, Sydney and Perth – Regent Consulting

A head of a leading independent school told me some time ago that the yearly VCE results have a shelf life of about six weeks. It was an interesting observation with which I didn't entirely agree, but I knew what he meant. Several years on from that comment, the analysis, discussion and marketing importance of the results have all increased significantly. The media (in particular 'The Age' newspaper) devotes significant resources to analysing the results, and several schools employ PR experts to manage the publicity surrounding them. Many of our clients are very interested in a school's academic performance and, rightly or wrongly, both the VCE results and the NAPLAN performance of a school are closely scrutinised.

Most private schools are acutely aware of the impact results can have on enrolment and their brand, and you only have to take a cursory look at their websites to see the importance many of them place on these results. I actually believe the results now raise as many questions as they answer, given all the variables that need to be taken into account when trying to understand them.

Probably the biggest anomaly is the IB schools being lumped in with the exclusively VCE schools when the VCE results come out in mid-December (the IB results aren't released until early January). It must be remembered that some schools, such as Wesley and Geelong Grammar, have approximately 25–30% of their cohort undertaking the IB. This will obviously affect the outcomes they achieve in the VCE. Given the IB's requirement that students undertake a subject from each area (e.g. a language, mathematics) and the difficulty of the course, it is pretty much exclusively the domain of the more capable students. So these schools (and others) have their performance published, discussed and compared with schools that don't offer the IB and therefore have all their capable students included in their results. By the time the IB results are published in the first week of January, the whole picture is blurred and the moment has passed, so to speak, with a lot of people out of the country on holidays. It is like looking at your team's batting average with your four best batsmen not included.

In addition to the IB anomaly, there is the usual hysteria surrounding the results achieved by the selective government schools, namely MacRobertson Girls' High and Melbourne High. These schools are totally selective entry and start at Year 9 (other than a very small percentage who can gain entry at the discretion of the Principal, each student has to sit an examination and achieve a certain score to gain entrance). So let me get this right: every year the media, and these schools themselves, publicly and loudly trumpet the outcomes they achieve in the VCE, and, with at least 95% of the students handpicked via an entrance examination, we read article after article about their superior academic performance.

Well, I would argue that a school where virtually every kid is handpicked for their academic prowess should perform far better than any other school. Many private schools are on a par with some of their outcomes, despite only two private schools in Melbourne having entrance tests. I could mount a case that, in some respects, the results of these selective government schools are disappointing given the nature of the cohort. I just watch in amazement each year as the media celebrates the fact that super-smart kids, handpicked and screened, perform well academically. Oh, what I would have given in my teaching days to have every kid in the class on what is effectively a taxpayer-funded scholarship!

Another anomaly is that the percentages of scores over 40 published in the extensive media tables are unscaled scores. Gaining a study score of 40 or above in a subject equates to being in the top 8% of that subject, and it is one of the major benchmarks used when analysing the performance of schools. Harder subjects, however, are marked up (some considerably) in the scaling process, so a raw score of, say, 38 in Latin wouldn't be included in a school's published over-40 tally, even though, when scaled, it equates to a score close to 50 (incredibly), such is the weighting given to it. Meanwhile a raw score of, say, 41 in Business Management is scaled down to 38, yet it is included because the unscaled score is greater than 40. So the schools whose students do more of the difficult subjects effectively have fewer raw scores over 40, and the published figure isn't a true indicator of their academic performance.
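The counting quirk described above can be sketched in a few lines of code. This is a hypothetical illustration only: the subject names are real VCE subjects, but the raw and scaled score pairs are invented for the example and are not actual VTAC scaling figures.

```python
# Illustrative sketch: which subjects count towards a school's "over 40" tally
# depends entirely on whether raw or scaled scores are used. The numbers below
# are made up for the example.

def over_40_subjects(scores, use_scaled=False):
    """Return the subjects whose score (raw by default) is 40 or above."""
    return [name for name, raw, scaled in scores
            if (scaled if use_scaled else raw) >= 40]

# (subject, raw score, hypothetical scaled score) for a small imaginary cohort
cohort = [
    ("Latin", 38, 48),                # scales up sharply, but misses the raw cut-off
    ("Business Management", 41, 38),  # scales down, yet counts on the raw score
    ("English", 42, 44),              # modest scaling either way
]

print(over_40_subjects(cohort))                   # counted in the published tables
print(over_40_subjects(cohort, use_scaled=True))  # counted if scaling were used
```

Run as written, the two lists contain the same number of subjects but different ones: the published (raw) tally credits Business Management and English, while a scaled tally would credit Latin and English, which is exactly the distortion the paragraph above describes.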

So are the results of any value? They can be very useful for identifying patterns of performance, and it can be argued that they give a snapshot of the academic climate of a school, particularly at the senior end. I read them with caution, due to the anomalies I have outlined and my strong belief that the "worth" of a school is far more than the statistical outcomes achieved by its Year 12 cohort. I worry about what the pressure to achieve those outcomes can do to learning environments, and to the students who don't "add value" to those statistics. A teacher told me last week that as the exams approached at the end of Year 12, she would devote her scarce time to the kids sitting just around the 40 mark: the more of them she got over 40 the better, as that was how she was going to be judged. Too bad if your kid was in that class and on 32!