assessment for learning

A few days back, Grant asked if I would follow up on my promise to write something on assessment. It would be great to get a discussion going around how & why we assess students, so after a bit of thought I decided to kick things off with the following post, derived from my own teaching portfolio document. (I rather feel that I need to be careful that too many of my posts don’t become Oracian in length! Not that there’s anything wrong with Orac’s posts! Quite the contrary.)

For all teachers, the $64 question is whether students are learning (and whether they’re learning what we would wish them to learn!). Assessment is the usual tool for finding this out, although it may have unintended consequences when the nature of the assessment task shapes what and how students learn. It took me a while to realise this – and it may be that many tertiary teachers still don’t realise it, perhaps because they’re focused on teaching the content of a particular discipline rather than on the best methods for doing so.

Students tend to focus on tests and final examinations, which are forms of ‘summative’ assessment; they give the assessor an indication of where the students are at by the end-point of a program or part thereof. The downside is that students may use techniques such as rote learning to prepare for these assessments, without necessarily taking the information on board for the long term. This is exacerbated when lecturers ask questions that simply test recall rather than in-depth understanding. Far better to ask a mix of questions, with some that can be answered through recall of facts sitting alongside those that require comprehension and critical thinking. Students who tend to use surface-learning approaches can attempt the recall-type questions, but the ‘deep’ questions encourage and reward deep-learning strategies. This mix of questions means that it’s possible to use summative assessment techniques to encourage a ‘desired’ style of learning and thinking, particularly if you let students know in advance the type of question that they can expect.

Now, if summative assessment gives you (& the students) a snapshot of where they’re at by the end of a paper, how can you use assessment to improve their learning along the way? By using a range of formative assessment strategies to build student capability, understanding, and confidence. 

Formative assessment takes many forms. The most obvious is probably written feedback on reports and essays – time-consuming to deliver, but far more useful to students than simply giving them a grade. UK educator Phil Race suggests giving feedback almost immediately and without a grade – because often the student will look at the grade and then pretty much ignore your carefully-crafted comments. Bridget & I try to do this with the essays our students write in first-year, by giving everyone some generic feedback on the issues that we know from experience will be very common. Then we don’t have to address all of that individually & can focus on the specific areas of each essay that are good or in need of improvement. Having a good marking rubric – provided to the students along with the essay topics – is a big help with this. In fact, having that rubric also means (says Phil) that you can get students to evaluate their own work. This may sound a bit counterintuitive but it’s a good way of encouraging them to reflect on the quality of what they’ve done.

Reviewing initial drafts can also help develop a range of process skills, although with a large class I doubt that teaching staff could actually look at them all! On the other hand, you can encourage students to give this sort of feedback to each other during tutorials; it’s a good learning experience for both the reviewer & the reviewee… Whatever way it’s done, while university assessment practices remain centred on written tests and exams, it’s really important to help students develop these skills. For example, extended essay-type answers are expected to show the writer’s understanding of key concepts and the ability to think critically about information from a range of sources. Yet science students fresh from the NCEA may not have these skills, because even ‘discuss’ questions require only relatively brief answers. So finding ways to provide meaningful formative feedback on essay assignments gives students valuable learning opportunities & also makes it more likely that they’ll develop the deep learning skills needed for real mastery of a subject.

I’ve written previously about other, in-class techniques that can provide students with immediate formative assessment on where they’re at with their understanding of a subject (here, and here, for example). Actually, the lecturer gets formative feedback too – if class responses to an item show a general lack of understanding of an issue, then that should be a pretty clear signal that I need to try a different approach 🙂 Over the years that I’ve been teaching I’ve increasingly incorporated some of these techniques, & one that both I & the students (judging from their comments, eg "I really like the little quizzes in lectures, the conversations, and the freedom to ask questions") find useful is in-lecture pop quizzes.

The way I use them, each quiz consists of one or a few questions that either examine students’ prior knowledge of a concept we’re going to discuss, or test their memory & understanding of concepts just covered. Students discuss their responses with each other & then I display the answers on screen & explain why I think a particular response is the correct one. (Quite often this will lead to further discussion.) There’s no pressure, no marks, but the class gets immediate feedback on where they’re at. Plus, the use of techniques like this can lead to greater student engagement and promote more active learning. 

As well as encouraging students to think more deeply and critically, teaching methods like this also help them to make connections between concepts and ideas, and with their existing knowledge framework. Sometimes this can be a bit uncomfortable, when you find that existing & new information simply don’t fit together & you have to do a bit of hard analysis of your viewpoint (the ‘troublesome knowledge’ that Michael Edmonds wrote about on Sciblogs NZ). And the evidence is there that learning to link concepts in this way does have a positive outcome for our students: while for ‘recall’ questions there was no difference between students who’d learned concept mapping & those who had not, for big-picture and interpretive questions there was a statistically significant improvement in pass rates for the concept-mapping group (Buntting et al., 2005). 

Of course, assessment is only part of a bigger picture. Whatever the assessment techniques you use, they have to fit within papers with a clear outline of their structure & content, so that students are aware from the start of the material they will be covering. (If you’ve read an earlier post on visualising a curriculum, you’ll know that this does come with a caveat.) They need to know how – and why – the course will be assessed. It’s also a good idea to spell out your expectations of the students, and what they can, in turn, expect from their lecturers. All these things work together to encourage students to develop an independent, deep-learning approach to their studies – & set them up for learning for life.

Next up – assessment & learning objectives…

C. Buntting, R. Coll & A. Campbell (2005). Using concept mapping to enhance conceptual understanding of diverse students in an introductory-level university biology course. Paper presented at the 36th annual conference of the Australasian Science Education Research Association.

2 thoughts on “assessment for learning”

  • Alison Campbell says:

    Hi Jim – yes, I do know about this technique & I’ve used it in a small way with my first-year bio students. Probably need to get a bit more organised & expand on it, I think, as I really want to get them to reflect on the quality of their own writing & this seems a good tool for developing that skill (via thinking about the writing of their peers).
