12 Jun

Audio record assignment feedback


I dislike marking so much that sometimes I wonder if I can actually do this academic gig for the rest of my career. It’s hard to explain my dislike because I also like it at the same time… It’s great to see how students are doing, and I particularly like marking reflective assignments where my students discuss their learning, plus assignments where they’ve built something, like a website, or made a video.

But no matter how much I like marking particular assignments, the relentless grind and the tedium that sets in after the first handful really gets to me.

Part of the problem is that I give stacks of feedback and I can’t seem to stop myself from doing that, which means I am quite possibly the world’s slowest marker. I also find it really hard not to copy-edit and write detailed comments all over the assignment document itself.

But I have found a method for marking that is speedy for me and means the students get heaps of feedback: audio recording my comments.

A couple of years ago, I slipped two discs in my lower back right when all my marking for the semester came flooding in. By necessity, I had to come up with a way to mark while lying flat on my back, and necessity is, of course, the mother of invention! My solution was to put the criteria sheets in Dropbox and use Good Reader on my iPad to highlight criteria and write the mark on each sheet, then save them back to Dropbox. At the same time, I recorded my feedback on my phone using Voice Record Pro. After I made each recording, I modified the filename to match my normal naming convention and saved it to Dropbox directly from Voice Record Pro with a single tap on my phone. In this instance, I was marking videos on YouTube, so I just played those on my iPad mini while recording on my phone and annotating the criteria sheet on my iPad. And it worked well. The audio recording in particular worked great (highlighting the criteria sheets in Good Reader was a bit of a pain, to be honest, but it served the purpose).

I have used voice recorded feedback on and off since then, but I’m going to make a wholesale switch to recording all my feedback. The great thing about it is that students get more feedback but it takes me less time.

I highly recommend Voice Record Pro for iPhone. The quality of the recordings is really good and the ability to rename the files easily in-app and then tap to save them to Dropbox is super handy.

In case you’re interested in giving this a try, here is my workflow.

Workflow

  1. Open the assignment file and criteria sheet on your computer.
  2. In Voice Record Pro, tap the ‘REC’ button. Note this will not start the recording, but will take you to a settings screen.
  3. For the first recording, you will need to tap the ‘Advanced’ tab and choose ‘MP3’ as the format (the app records in MP4 format by default). I also recommend choosing ‘Medium’ for quality. These settings should stick for subsequent recordings.
  4. Hit the ‘REC’ button to start recording. I start with the student’s name and a statement like ‘This is feedback on Assignment 1 in IFN616. I’m going to record comments as I work through your assignment and finish with some summary comments.’
  5. Pause the recording and start working through the assignment, restarting the recording to make comments as you go.
  6. When the recording is complete, tap the stop button.
  7. Change the filename to match your preferred naming convention – I name my files* like this: 2015 IAB260 Davis Kate IAB260-2.
  8. Save the file directly to Dropbox from the app, just by tapping ‘Save to Dropbox’ (note the new filename appears at the top of the screen).
  9. I highlight the criteria the student has achieved and write a grade on their criteria sheet. I use the same file name for the criteria sheet except I add ‘CRA’ at the end, so the file name is 2015 IAB260 Davis Kate IAB260-2 CRA.
  10. And then I do it all over again for the next one!

At the very end, I grab all of the audio files and criteria sheets and whack them into the same folder. Because I’m pedantic about file naming, everything sorts nicely, which speeds up the process of returning the assignments to students.
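I don’t actually script any of this, but if you’re curious how the naming convention pays off, here’s a rough Python sketch that scans the folder and checks every student has both an audio recording and a ‘CRA’ criteria sheet before anything goes back. The folder path and file extensions are just placeholders, not my real setup.

    # Rough sketch only: check each marked assignment has both an audio file
    # and a 'CRA' criteria sheet, relying on the naming convention above.
    # The folder path and file extensions are placeholders.
    from collections import defaultdict
    from pathlib import Path

    FEEDBACK_DIR = Path("~/Dropbox/IAB260/Assignment 2").expanduser()  # placeholder path

    students = defaultdict(dict)
    for f in sorted(FEEDBACK_DIR.iterdir()):
        if f.suffix.lower() == ".mp3":              # e.g. 2015 IAB260 Davis Kate IAB260-2.mp3
            students[f.stem]["audio"] = f.name
        elif f.stem.endswith(" CRA"):               # e.g. 2015 IAB260 Davis Kate IAB260-2 CRA.pdf
            students[f.stem[:-len(" CRA")]]["criteria"] = f.name

    for student, files in sorted(students.items()):
        missing = {"audio", "criteria"} - files.keys()
        status = "missing " + ", ".join(sorted(missing)) if missing else "complete"
        print(f"{student}: {status}")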

Just a note on file size: If the files are too big, you can import them into iTunes to compress them. You just need to change your import settings first.
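If you’d rather batch this from the command line than fiddle with iTunes import settings, something like the sketch below would do the job. It uses the pydub library (which needs ffmpeg installed) rather than iTunes, and the folder, size threshold and bitrate are all placeholder choices.

    # Not the iTunes route: a hedged alternative using pydub (pip install pydub,
    # plus ffmpeg) to re-encode any oversized recordings at a lower bitrate.
    # Folder, size threshold and bitrate are placeholders.
    from pathlib import Path
    from pydub import AudioSegment

    FEEDBACK_DIR = Path("~/Dropbox/IAB260/Assignment 2").expanduser()  # placeholder path
    OUT_DIR = FEEDBACK_DIR / "compressed"
    OUT_DIR.mkdir(exist_ok=True)

    for mp3 in FEEDBACK_DIR.glob("*.mp3"):
        if mp3.stat().st_size > 10 * 1024 * 1024:  # only touch files over ~10 MB
            audio = AudioSegment.from_file(str(mp3))
            # Mono at 64 kbps is plenty for spoken feedback
            audio.set_channels(1).export(str(OUT_DIR / mp3.name), format="mp3", bitrate="64k")
            print(f"Re-encoded {mp3.name}")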

(* Note I’m really pedantic about file naming for assignments because good file naming means less work when it comes to course accreditation, since I can easily find the files I need. In particular, I like to have the year and unit code at the beginning, preferably in square brackets, but the app doesn’t support that, so I add them later.)

#blogjune 12/30

14 May

Assessing participation in an online learning community

A couple of days ago, I spoke at a PD evening for high school teachers on using technology, specifically about how I use social technologies for connected learning. There was some interest in how I assess students’ contributions to the learning community and I was asked if I would be willing to share the criteria I use for this.

I think it’s important to provide some context for what I do and why I do it this way, so I want to share a bit of background, too.

How I came to assess students’ contributions in online learning communities

In my first year of teaching, I had a real bee in my bonnet about online students’ engagement – with the learning materials, with me, and with their peers. So when I had the opportunity to design a new unit that year, I built the assessment around engagement, in addition to the learning outcomes. It helped that in this particular unit, students were learning about personal learning networks. This meant I had both pedagogical and content-driven motivation to get them to talk to each other, to me and to professionals in industry.

I had a hunch that I would need to assess student contributions in the online learning community in order to successfully foster engagement. And as it turned out, that year I learned that by assessing student contributions in an online learning community, I could draw students into participating in a meaningful way.

Some colleagues and I did a bit of a project about online peer engagement and interviewed students about their experience in the learning community in this unit. We also interviewed students from an accounting unit and an education unit about their online peer engagement. We looked at three different online peer engagement activities across the three units: in the accounting unit, we looked at the use of a discussion forum; in the education unit, we looked at Collaborate-based online tutorials; and in my unit, we looked at the whole approach to the unit, which was essentially what we might now call connected learning (but we didn’t have a name for it at that point – or at least, I didn’t have a name for it).

In a nutshell, I ran a WordPress Multisite installation where the students each had their own blog, and used BuddyPress to turn the site into a social network. This approach was inspired by Michael Stephens’ and Kyle Jones’ work with WordPress and BuddyPress in a course at San Jose State University. I also assessed students’ participation in the learning community. Their participation involved commenting on my and their peers’ blog posts, having discussions on the site using the social networking functionality, and engaging on Twitter through conversation and using the class hashtag to share interesting material related to the unit.

In the interviews, my students reported they started contributing to the learning community because they were required to for assessment, but they kept on contributing because they found it valuable. Assessment drew them in; value kept them there. And that’s what it’s all about, right? Creating experiences that are valuable to students, not just experiences they have to engage in to satisfy the course requirements.

But how do you assess contribution to the learning community?

The answer is: it depends on what you’re looking for.

I was looking (and continue to look) for meaningful engagement with the content, and for robust discussion and conversation. I wanted students to form a personal learning network, both within the class and beyond. I didn’t just want them to connect with each other, but also with industry. I wanted them to feel empowered to learn by playing with technology and critically reflecting on their practice. I wanted them to support each other, to demonstrate leadership, and to model playfulness and personal investment in ongoing professional development.

In that first year, I developed assessment criteria for participation in the learning community that were fairly basic. I also negotiated the assessment criteria with students (which I do in most of my units), so they had the opportunity to tell me what they thought participation should look like. The next year, after conversations with a colleague who does something similar, I added in criteria about demonstrating leadership. And the criteria have continued to evolve each year.

So what exactly do I assess?

When I grade students on participation, I’m looking at

  • analysis and critical discussion
  • leadership
  • extent of contribution.

I don’t know about you, but I really dislike developing criteria sheets (aka rubrics). I dislike it because I invariably come to use them and find flaws that make it really difficult to mark against them. I also find it tricky sometimes to find the right words to convey what I’m looking for. So I really labour over these things and at the end of each semester, I make notes about what worked and didn’t work, and I iterate them from semester to semester. I also trawl the web and academic databases to look for rubrics or criteria sheets that might serve as inspiration.

At my institution, we use criterion-referenced assessment. My personal approach is to write really detailed criteria sheets so that I am completely transparent about what I’m looking for. I spend time explaining the criteria to students (although I aim to write criteria that don’t need explanation), and then I give them the opportunity to think about the criteria and negotiate them with me. (As an aside, I find negotiating criteria a really powerful tool to help students take charge of their learning.)

Criteria sheet

Download my criteria sheet for assessing student contribution in online learning communities [Word]. It is by no means perfect, but you might find some of the words useful.

Over the years, I’ve been inspired by rubrics shared by other teachers – primary, secondary and tertiary. I can’t remember all of the rubrics I’ve drawn from, but this year I found two in particular very helpful, and I’ve borrowed words from both. They are both from iRubric:

The practicalities

While I track the volume of students’ posts in our various online spaces, I am more interested in what they contribute, rather than how much they contribute (although the latter is important too). This means I need to look at students’ contributions, not just statistics, to grade them. This is not easy and it’s not quick, but I believe it’s worth the effort. When I use a WordPress blog network where students have their own blogs, in combination with BuddyPress, it’s relatively easy for me to go back and see what students have contributed. This semester, my students’ blogs are all over the web, so it’s a bit trickier to track their contributions. I have a multi-pronged approach this semester, but the main thing I’ve had to think about is how to capture students’ comments on their peers’ blogs in a way I can work with to grade them. I’m doing this by aggregating the comment feeds from my students’ blogs into a single RSS feed, which I’m scraping into a Google Spreadsheet.

If This Then That recipe for scraping an RSS feed into a Google Spreadsheet.
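If IFTTT isn’t your thing, or you want a local copy of the data, the same idea can be sketched in a few lines of Python using the feedparser library: pull each comment feed and dump the entries into a CSV that opens as a spreadsheet. The feed URLs below are placeholders; WordPress blogs typically expose a comment feed at /comments/feed/.

    # A local alternative to the IFTTT recipe, using feedparser
    # (pip install feedparser): pull each student's comment feed and write
    # the entries to a CSV. The feed URLs are placeholders.
    import csv
    import feedparser

    COMMENT_FEEDS = [
        "https://studentblog1.example.com/comments/feed/",
        "https://studentblog2.example.com/comments/feed/",
    ]

    rows = []
    for url in COMMENT_FEEDS:
        feed = feedparser.parse(url)
        for entry in feed.entries:
            rows.append({
                "blog": feed.feed.get("title", url),
                "commenter": entry.get("author", ""),
                "date": entry.get("published", ""),
                "post": entry.get("title", ""),
                "link": entry.get("link", ""),
            })

    with open("comment-tracking.csv", "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=["blog", "commenter", "date", "post", "link"])
        writer.writeheader()
        writer.writerows(rows)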

A colleague of mine is manually tracking contributions this semester by tallying them in a spreadsheet. She is much more disciplined than me though! I wish I could commit to that level of tracking, but the reality is I’d fall behind and I’d invariably give up. So automating works for me!

I’d love to hear your ideas for tracking student engagement in learning communities, and any thoughts you have on assessing contributions effectively. Please share in the comments!