As a former classroom teacher and researcher in the Curriculum, Assessment & Pedagogy Research Group at the University of Glasgow, assessment never strays far from my field of consideration. Few would dispute that feedback is a critical aspect of effective assessment, but it takes time. Certainly, as an academic, I am still grappling with how best to manage my work and, in doing so, have recognised a number of quite inefficient processes. In this series of posts, I will describe some of the major changes I have made to my practice in trying to increase the quality of the feedback I give to students whilst simultaneously slashing the amount of time it takes me to administer. With the compilation of feedback being one of the most time-demanding aspects, this formed my core focus for channelling what resourcefulness and creativity I could muster. To assist in this, I embraced the notion of technology-enhanced feedback (and I will resist the temptation that Education seems to have of creating yet another acronym…).
This post is the first of three case studies of specific methods that you can employ, develop and improve to (hopefully) help you and your students. The first, explored here, is a system for automated exam feedback, built in response to a recent policy change within my own institution (my Sunday-morning hobby for a couple of weeks!). The second looks at the use of podcasting as a form of feedback, and the final post looks at how to set up and utilise video and screencast feedback. Following notable time savings on my part, and extremely positive feedback from students, it is my intention to embed these across all of my courses this coming session. Where possible, I will make files available to download for those who may not be as familiar with some of the more technical elements.
CASE 1: Automated Exam Feedback
Historically, and in almost all cases I can recall, exam feedback has amounted to a letter grade, but a recent change of policy in my own institution seeks to provide students with a greater level of information about how the exam went. In addition to the marking and verification processes themselves, this could prove to be quite a demanding task – especially if large numbers of students are involved. Though it may be possible to satisfy the requirement with a generic overview or snapshot sheet of the cohort's overall performance, forwarded to each student, it would clearly be desirable to have a level of personalisation. The task, then, is to create a scalable and cost-effective system that allows individual exam feedback to be sent to each student. The solution was a system which had no monetary cost or capital outlay, could handle a large number of students, compiled individual feedback in less than a second, and could automatically send that feedback to each student without any human intervention.
To ensure no additional cost, the system capitalises on the underlying power of Microsoft Excel. The Office suite as a whole allows a range of scripting and automation that few users even begin to take advantage of. A standard spreadsheet was set up listing one student per row, with each section or exam question as a successive column. As scripts are marked and totalled by question anyway, completing a spreadsheet up to this point would constitute existing practice.
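To make that layout concrete, here is a minimal Python sketch of the same structure – one student per row, one question per column – with all names and marks invented for illustration (the system itself lives inside Excel, not Python):

```python
# Hypothetical marks grid mirroring the spreadsheet layout:
# one student per row, one exam question per column.
students = ["Student A", "Student B", "Student C"]   # illustrative names
marks = [
    [8, 14, 11],   # Student A's marks for Q1, Q2, Q3
    [6, 12,  9],   # Student B
    [9, 15, 13],   # Student C
]

# Totalling by question for each student is just a row sum,
# matching the totals the marker already records when marking scripts.
totals = {name: sum(row) for name, row in zip(students, marks)}
print(totals)
```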
Next, a button was added and a VBA script developed to retrieve the results from each column, format them, and output them as a separate file for each student. The system is ‘scalable’ insofar as it automatically detects how many students have been included and how many questions have had results entered. Additional cells on the spreadsheet allowed contextual and descriptive information to be added, which was collated as part of the file-creation process.
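The VBA itself is not reproduced in this post, but the detection and file-creation logic might look something like the following Python sketch, where `None` stands in for an empty worksheet cell and all names and marks are invented:

```python
import os
import tempfile

# A stand-in for the worksheet: None marks an empty cell, so the
# script can detect how many students and questions were entered.
grid = [
    ["Name",      "Q1", "Q2", None],   # header row; next column left blank
    ["Student A",  8,    14,  None],
    ["Student B",  6,    12,  None],
    [None,        None, None, None],   # first empty row ends the data
]

# Detect extents the way a macro might: stop at the first blank cell.
header = grid[0]
n_questions = next(i for i, c in enumerate(header[1:]) if c is None)
rows = []
for row in grid[1:]:
    if row[0] is None:
        break
    rows.append(row)

# Emit one feedback file per detected student.
out_dir = tempfile.mkdtemp()
for row in rows:
    name, qmarks = row[0], row[1 : 1 + n_questions]
    lines = [f"Exam feedback for {name}", ""]
    lines += [f"{header[i + 1]}: {m} marks" for i, m in enumerate(qmarks)]
    with open(os.path.join(out_dir, f"{name}.txt"), "w") as f:
        f.write("\n".join(lines))

print(len(rows), "files written to", out_dir)
```

Because the loop stops at the first blank row and column, adding more students or questions requires no changes to the script – the same property the spreadsheet system relies on.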
As can be seen below, the feedback information in each of the files gives students specific data about how they performed on each question or topic area in relation to the number of marks available and to the class as a whole.
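The per-question comparison could be computed as in this illustrative Python fragment, where the marks available and the cohort's marks are all invented values:

```python
# Invented example: marks available per question and each student's marks.
available = {"Q1": 10, "Q2": 20}
cohort = {
    "Student A": {"Q1": 8, "Q2": 14},
    "Student B": {"Q1": 6, "Q2": 12},
}

# Class average per question, for the "relative to the class" comparison.
class_avg = {
    q: sum(s[q] for s in cohort.values()) / len(cohort) for q in available
}

def feedback_lines(name):
    """Format one student's per-question feedback."""
    out = []
    for q, top in available.items():
        mine = cohort[name][q]
        out.append(f"{q}: {mine}/{top} (class average {class_avg[q]:.1f})")
    return out

print("\n".join(feedback_lines("Student A")))
```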
Lastly, an additional section of script was added which calls on the Microsoft Mail Object that resides somewhere inside the Office suite. This takes each of the feedback files and automatically creates an email for the student it belongs to. It completes the subject line and body text explaining that these are the results of their exam, adds the feedback file as an attachment, and sends it to the student's email address using Outlook. This is done for each student and, in testing, the system was able to send out 30 results emails in a fraction of a second.
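The Outlook automation itself is not shown in this post, but the assembly step – subject, body, attachment, recipient – is the same in any mail API. As a hedged sketch, here is the equivalent using Python's standard-library `email` module; the addresses and filenames are placeholders, and actually sending the message (via `smtplib`, or Outlook in the real system) is omitted:

```python
from email.message import EmailMessage

def build_results_email(student_name, address, feedback_bytes, filename):
    """Assemble one student's results email with the feedback file
    attached. All addresses and names here are illustrative."""
    msg = EmailMessage()
    msg["To"] = address
    msg["From"] = "exams@example.ac.uk"   # placeholder sender address
    msg["Subject"] = f"Exam results and feedback for {student_name}"
    msg.set_content(
        "Please find attached the results and individual feedback "
        "for your recent exam."
    )
    msg.add_attachment(
        feedback_bytes, maintype="text", subtype="plain", filename=filename
    )
    return msg

msg = build_results_email(
    "Student A", "a.student@example.ac.uk",
    b"Q1: 8/10\nQ2: 14/20\n", "Student A.txt",
)
print(msg["Subject"])
```

Looping this over the generated feedback files gives the one-click send described above; building the message objects is near-instant, which is consistent with dispatching dozens of emails in well under a second.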
The beauty of this system is that it is personalised to each student, incurs no additional cost to the institution, and provides feedback almost instantaneously at the click of a single button. My intention is to make a more generic version of the Excel file available for download over the next few weeks so that it can be used with your own courses.
The next post in this series will explore ways of creating and using podcasts for the purposes of student feedback.