My first go with Votapedia

Yesterday I had my first attempt at using the Votapedia audience response system.  For those who don’t know it, it’s a web-based system that’s come out of CSIRO in Australia, and in the broadest of broad terms it gives a teacher (me) the facility to run a Who Wants to be a Millionaire? ask-the-audience question. Last year, I gave all my students cards (actually, bits of paper) with A, B, C, and D on them, so when I ask a question they can vote on it.  I use this as part of formative assessment in each lecture: first, so that I know whether my students have understood the point I was trying to make; second, so that the students know whether they’ve grasped it or whether they need to do some work on it; and third, as a tool for allowing peer discussion.

Ideally, I’d use audience clickers (as in WWTBAM), but they haven’t yet made an appearance at Waikato, so Votapedia is a cheap imitation.  Here, the user (me) sets up an account on Votapedia (www.urvoting.com) and enters his or her questions into the system through the web.  When ready, a question is ‘activated’, and the students get to see the question, the possible responses, and the phone numbers to ring to vote. The votes are tallied automatically and displayed nicely on a bar graph.

Sounds easy?  Well, here are my experiences. I’m not the only person to have blogged about this, e.g. see https://davidtjones.wordpress.com/category/ilecture/votapedia/ .

First, registering wasn’t totally straightforward, and I had to call in our IT people to advise, but got there in the end. Once I was a bona fide user, setting up the questions was a piece of cake. However, in the lecture room yesterday morning it all got a bit muddled.  First, the room happens to be in the basement of the world’s ugliest (now that the Tricorn Centre in Portsmouth has been demolished) reinforced concrete building. About a quarter of the class couldn’t get mobile phone reception. Then, it seemed that responses were very slow to come in: from the moment someone voted, it took many, many seconds before the response was noted by the system.  People were also confused about whether their vote had got through, because the call ‘drops’ and the voter hears a number-unobtainable tone.

But, perhaps most significantly, only about a third of the class actually voted, whereas with the A, B, C, D cards it was close to 100%. Of course, some of them couldn’t get reception, some of them might not have had their cellphone with them (what, students not have their phone glued to their hand?), and some might not have believed me when I said it wouldn’t cost them anything. Perhaps some just couldn’t be bothered, given that I had no real way of knowing whether they were voting or not. To get formative assessment to work properly, I need a decent chunk of the class to vote.

I’m going to give it another go, and then as a class we’ll make a decision on whether to stick at it or go back to the cards. From my point of view, if only a third of the class votes, for whatever reason, it won’t be worth the fuss.
