Why Assessment is Stupid

10 Oct 17 by Nick Carter, Director of Teaching and Learning

Last week I went to a two-day conference where all we talked about was student assessment.

If you’re not in the teaching business, I forgive you if your eyes just glazed over. To tell you the truth, when I get home after a tough week and flop onto the couch, the last book I’m likely to pull off the shelf is Authentic Assessment: A Collection, edited by Kay Burke Ph.D.

That’s a real book, by the way. I dare you to put it in a family member’s stocking at Christmas.

I’m only pretending to be cynical. We tend to talk about assessment quite a lot at Cranbrook – I seriously have a folder on my desk full of Cranbrook staff members’ formative assessment strategies (this job appeals to a select few). If you’ve forgotten, formative assessment is assessment that helps kids actually learn stuff (as opposed to measuring whether they did after it’s too late).

However, student assessment is something we often take for granted, perhaps because much of it is shaped by external forces like NESA, and many schools – despite their rhetoric about being future-focused – are pretty conservative in this regard. I heard a horror story over the weekend about another school that wanted its students to experience an Open-Ended Investigation in Science, so they sent the kids home with an old exam paper to complete. Seriously.

This conference reminded me of something, though, that I think I always knew, but I’d never allowed myself to say out loud.

Every decent teacher knows what good assessment looks like.

But…

Every decent teacher knows the HSC Examinations look nothing like it.

I promise I’m not trying to get fired.

Think about it. Good assessments have certain qualities in common: they let kids grow, work together, engage with the real world, and use errors purposefully. Now invert all these qualities. Are you seeing an HSC Exam hall, or is it just me?

Beyond this, there were three ideas from the conference that really challenged me. Some of this stuff is a bit controversial, so forgive me… I’m trying to provoke rather than offend.

  1. A lot of us – students especially – talk about the importance of ‘playing the game’ when it comes to the HSC. Skilling and drilling is one of the paths to success, right? I’ve lost count of how many times I’ve said “The more practice essays you write, the better you’ll do.” Almost as an afterthought, we then remind students that they are expected to experience one or two moments of profound truth and beauty over the course of the year. Surely – surely – we are obliged to flip this around. The research supports such a move. In Paul Ayres’ influential 2004 study of exceptional teaching, he found the teachers who got the best results out of students across the board tended not to teach to the exam at all. They focused much more on a passionate engagement with the ‘big picture.’ I think we end up on the right side of the dynamic here at Cranbrook. As a test, ask your child the following question tonight: “Would this stuff be worth learning if you weren’t going to be tested on it?” I’d love to hear your responses.
  2. John Hattie (arguably the most influential educational researcher alive) published an article in The Australian in June, where he said that the reason Australia wasn’t performing as well as other countries in the international PISA rankings was, essentially, because of schools like Cranbrook. For Hattie, the institutions that tend to attract the top-performing 40% of students (like fancy independent schools, he ventured) were letting the country down by allowing these kids to “cruise” rather than be continually challenged with higher and higher expectations. There’s much more to his argument, of course, and I’m paraphrasing. Nevertheless, is this true? Could anyone look at a Cranbrook student and accuse them of “cruising”? Could anyone look at Cranbrook and suggest it doesn’t expect a great deal of its boys? As I say, challenging stuff.
  3. Dr Jared Horvath stated – very passionately – that grade-based assessment began as a tool used by the military and, later, by the business world to organise personnel. He argued that this impulse to reify, quantify and rank ineffable things like intelligence was adopted unwisely by the education sector. He reckoned we shouldn’t trust education systems that operate solely to serve the job market, because real learning has nothing to do with keeping the wheels of capitalism spinning. Real education should seek only to blow the mind of the learner. I’m taking a little creative licence with that last bit. What do you think? Is Dr Horvath some raving, communist-leaning, ivory-towered academic, or is he onto something?

As always, please let me know what you think at ncarter@cranbrook.nsw.edu.au