Assessment and Feedback

Day 4: Tools and technology for peer assessment

Today we will be looking at some of the technologies and tools that can help facilitate peer assessment. By their nature, peer assessment and feedback activities can be very time consuming and complicated to manage and track, especially if you have a large number of students, so using online tools allows you to streamline the process. Many of these technologies come with built-in structures that move students through a series of steps in which they review and provide feedback on their colleagues' work.


Within Moodle (Wattle), the LMS platform used here at ANU as well as at many other schools and universities, there are a number of tools that can be used in the peer assessment process. These include the Workshop activity, discussion forums, and the Database tool. Today we will have a look at some of these tools, as well as others that are available, to see how they can be used for peer assessment and feedback activities.

Activity

Imagine you are running a peer assessment activity and have no technology to use. What challenges can you see in running it? How would you solve these issues?

The Workshop tool in Moodle

The Workshop tool within Moodle has been specifically designed to cope with the ‘mechanics’ of the peer assessment process. It can be complicated to use and requires a lot of care and consideration when setting up the activity; it is not something that can be set up the day of, or the day before, the lecture. But when set up effectively, it can be beneficial to peer assessment, providing students with a rich experience in giving and receiving feedback and in learning from each other, leading to deeper learning.

In the following video, Professor Julian Cox from the University of NSW discusses the benefits of using the Moodle Workshop activity for peer assessment and review tasks (please watch the 3rd video, 1.25 minutes). https://teaching.unsw.edu.au/peer-review

The following article from Elon University, ‘Dr. Eric Bauer uses Moodle to simplify the peer review process’, provides a short case study for using the Workshop tool.

Below we will look at some of the other tools within Moodle that can also be used for peer assessment and review tasks.

Forums

The following video shows how you can use a Moodle Forum activity for peer assessment.

Database

The Database activity in Moodle enables students to add and share content with their classmates, such as web links, links to books and journal articles, and student-created content such as photos, posters, websites, and links to YouTube videos they have created. Students can vote, make comments, and review each other's work, and the database is searchable. Students can also be placed in groups to review each other's work within the group, or to review other groups' work.

The following site provides more detailed information on using the Moodle Database tool for peer review activities.

Wiki


Wikis facilitate group work and collaborative learning among students. Students are able to contribute to the wiki and then edit each other's contributions, and these edits can be tracked by teaching staff. For peer assessment activities, a wiki could be a useful space for students to submit work for other students to review and comment on, with students divided into groups to review and provide feedback on their colleagues' work.

Here is some more information on using the Moodle Wiki tool.

Other tools:

Outside Moodle there are many other technologies and tools that can be used for peer assessment and review tasks. Below are a few of these which you might like to investigate:

SPARKPlus: A peer assessment resource kit developed by UTS which allows students to rate their peers anonymously. The following article examines the use of SPARK in self and peer assessment: ‘Improving Self and Peer Assessment Processes with Technology’.

Voicethread: A cloud-based collaborative tool in which students can share multimedia and video content and comment on what others have uploaded. It can also be used to create interactive lectures that students can collaborate on, interact with, and comment on, and it supports content creation, sharing, annotating, and comment moderation.

The following article provides a case study of how Voicethread can be used: ‘VoiceThread: Enabling Peer feedback in first year computer engineering’.

ePortfolio systems: Systems such as Mahara (available at ANU) and Pebblepad allow students to share work with each other, or in groups, for review.

There are many others, including WebPA, iPeer, Peerwise, PeerGrade, and so on.

Activity

Please share in the forum your experience of any peer assessment or review tools you may have used, with students or as a student, and tell us how you found them. Were they useful? What worked well (or not) about the tool? What advice would you give to someone using it for the first time?

Other resources

University of Bath – Learning technology – what’s out there for peer assessment?

Links to more articles about the Moodle Workshop activity:

‘An even better peer feedback experience with the Moodle Workshop activity’

‘A good peer review experience with Moodle Workshop’

22 thoughts on "Day 4: Tools and technology for peer assessment"

  1. If I had no technology I would not be able to communicate with the students and so would not be able to teach, let alone run peer assessment. Hand signals, verbal and written languages are forms of technology. Computers, the Internet and mobile devices are just fancier forms of hand-waving. 😉

    One technology I introduced for large group teaching in ANU’s temporary flat floor classrooms is a whistle. The 300 students were doing group exercises and became so engaged they did not hear the instructor. So I purchased a whistle, which got their attention. I have considered having the teaching team in color coded vests, matching their role, with matching colored whistles. That may sound silly, but it really is hard to keep a large class moving on group tasks.

    Currently ANU TechLauncher uses a bespoke online feedback system. Each student uses their mobile device to score teams on multiple scales and provide text feedback. But this takes time and the learning suffers from the gap between stimulus and response.

    I have considered having the audience hold up cards with numbers on them, like a ballroom dancing competition. This is inspired by the innovation pitch events I attend, which run through a lot of presentations very quickly. Speakers usually have only sixty seconds to three minutes. With ANU TechLauncher the presentations have been shortened to 15 minutes, including five minutes for peer feedback. I think we could reduce this further, perhaps to five minutes, while increasing the quality of the learning experience.

    1. Thanks for your thoughts Tom. The idea of the whistle to manage students and keep them moving forward on tasks is a good one. I think your statement about not being able to teach without technology, and the idea of managing peer assessment tasks without technology, is interesting. Using technology to manage student assignments, including peer review ones, is a fairly recent practice, so I guess the challenge is: how were these activities run prior to technology's introduction? What methods/strategies were used by teachers in the ‘old days’, when pen and paper were largely the technology at hand?

  2. I have used the Moodle forum for peer assessment for the last two years. Each week I ask the students two or three questions. They have to first answer each question and then reply to a post from at least one other student about each question. The students grade each other 0, 1 or 2 on each post. They can't see who graded them.

    At the end of the week I have the Moodle grade book sort the average grade for each student in ascending order. I then look at the posts for each student and adjust the grade if necessary. Then I write a private note to each student with their grade and feedback. Usually I only send feedback to those who are below 1.

    This works well, as the students grade much as I would, so adjustments are rare. Also students very rarely complain about the grades, as they feel they were part of the process. It is quick to do. What works very well is that I can see who is having problems as they got a low grade.

    The first time I was not confident, so I did the peer assessment for formative purposes. That is, I didn't make it part of the final grade. It also helped with confidence to enroll in a course which used peer assessment, to see how it was from the student's point of view.

    Introducing the peer assessment early in the course with a low stakes task also helped. I have it starting from week 1, every week for 12 weeks, 1% per week, with the best 10 weeks counting to the final grade.

    Next year I am considering one-on-one peer feedback for the smaller assignments and perhaps peer assessment of those (2 x 10%) the following year.
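    As a minimal sketch of the grading arithmetic described in this comment (0–2 peer scores averaged each week, with the best 10 of 12 weeks counting at 1% per week), something like the following could be used. The function names and example scores are illustrative, not part of any Moodle tool:

    ```python
    # Sketch of the weekly peer-grading scheme described above: each
    # week's 0-2 peer scores are averaged, and only the best 10 of 12
    # weekly averages count toward the final grade (1% per week).

    def weekly_average(peer_scores):
        """Average the 0-2 peer scores for one student's weekly posts."""
        return sum(peer_scores) / len(peer_scores)

    def course_contribution(weekly_scores, best_n=10, weight_per_week=1.0):
        """Keep the best `best_n` weekly averages and convert each
        (out of a maximum of 2) into a percentage contribution."""
        best = sorted(weekly_scores, reverse=True)[:best_n]
        return sum((score / 2.0) * weight_per_week for score in best)

    # Example: 12 weeks of averaged peer grades for one student.
    weeks = [2.0, 1.5, 1.0, 2.0, 1.5, 2.0, 0.5, 1.0, 2.0, 1.5, 2.0, 1.0]
    print(course_contribution(weeks))  # best 10 weeks, out of a possible 10.0
    ```

    Sorting first and keeping only the top ten means a couple of weak (or missed) weeks do not hurt the final grade, which matches the low-stakes intent described above.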

  3. Tom, I have seen the coloured vests in use at UQ! (Engineering, of course!) A little like the ballroom competition, I have distributed red, yellow, and green cards to engineering students in large flat-floor settings, to visually signal how well they feel they understand a presentation. A prerequisite is broad acceptance that "seeing red" is just a cue, permitting a speaker's decision to change the presentation, not a reflection on the acceptance or value of the person. I feel that inviting, accepting, and adapting to feedback has to be modelled by a teacher before it will seem safe for students.

  4. I found today’s post inspiring but also a bit overwhelming – never knew there were so many possibilities out there! When I was doing my undergraduate degree, there was very little technology involved (showing my age here), and I can see that what is available to us now can enhance learning in myriad different ways that just weren’t possible back then. Also great to read about the experience that others have had (thanks Tom) and as a peer assessment novice, looking forward to reading more. When I do use PA for the first time, having seen what’s available, I imagine I will start with the type of activity that I want to use for PA and then look for the form of technology that allows me to best do that. I’ll start small and see how it goes until I develop confidence in my ability to master the technologies – otherwise I suspect the confidence of students in the process would be undermined.

    1. Hi Sally. Starting small and easing students, and yourself, into PA tasks is a great strategy. It can be easy for these activities to become large, unmanageable and very overwhelming for you and the students. Matching the technology to the task, and making sure both sides are comfortable using it, will build students' confidence in doing these types of activities.

  5. I have only used PA as a student and as part of a MOOC. There, it was integrated into the online system (in this case, Coursera) and it worked quite well. For the example I am thinking of, we had to rate other students’ work according to a rubric. As I’ve mentioned in previous days, the rubric was quite strict and clear, so it would have been hard for students to accidentally mark low. We also had the option to provide justification statements. I think we all had to mark at least 3 assignments, and I guess the final student mark was the average of the 3.

    The main downside I can really see for this was that there were a lot of students in the class and so, if you did want to refute a grade (e.g., due to unfair, incorrect, or absent justifications), then I imagine this would have been a rather drawn-out process of trying to contact the course coordinator. I’m also not really sure that the process was treated as a ‘learning experience’ for the students; rather, it felt more like an offloading of marking duties. So, it would have been better to perhaps have included some educational material around the benefits of PA, and some guidelines for how to do PA well. Especially given the fact that MOOCs don’t necessarily require any prerequisites, meaning the student skill-set might be quite variable.

  6. Hi Angela. I think your comments on the PA process from a student perspective are important, and clearly explaining the process to students is a key component for success; otherwise it can come across as students doing the lecturer's job, although in the case of a MOOC with thousands of participants this is often a necessary logistical strategy.

    PA can provide a great opportunity for students to see how others have tackled an assignment and give a different viewpoint or perspective. I guess one of the difficulties that can be encountered is that students don't always value their colleagues' input and want feedback from an expert.

  7. Other than pen and paper, I haven’t experienced technology in peer assessment. This has been fine for smaller classes, but I can see the benefits of all of the tools mentioned above in larger settings. Like Sally, I look forward to being able to pick a tool that fits with my chosen peer review/assessment method one day.

    1. Hi Bhavani and Sally. I look forward to hearing how your first experiences of using technology for PA go. Please come back and tell us in the future. 🙂

  8. Last year was the first time I attempted to use online technology for peer feedback. I wanted to use the Wattle wiki tool first, but it was a bit unwieldy in its set-up and I realized that it’s much better for collaborative work rather than for peer assessment. In the end, I used the forum tool, where everyone had to post their essay on Day 1. On Day 2 everybody had to claim another student’s essay for peer review (to make sure that everybody gets feedback). By Day 7 they had to provide written feedback. I think it worked quite well, and one of the reasons is that the students know that the instructor can see the comments, so there is quite a bit of accountability in comparison to a pen-and-paper peer review done in class, that is not seen by the instructor.

  9. There are some courses in CBE with several hundred, or even a thousand, students. I could not imagine how to carry out peer assessment in such a big class without technology. The formats are limited to pen and paper, or presentations in the classroom. It could be very difficult to distribute the scripts among so many people. For presentations, it's hard to choose the right group size. Maybe we could carry out the peer assessment in tutorials, with pre-trained tutors. But I am still wondering whether peer assessment is appropriate for such a large class.

    1. Hi Sunny! This might be something for others to consider in deciding when to use peer assessment – and when not! I suspect that there could be more significant logistical issues when using this method with large classes – especially the class sizes you are talking about in CBE. It's probably something that would need special training and support to do.

      I'd love to hear from some other people about their experiences using peer assessment with more than 100 students.

  10. I've used the wiki for peer and teacher review, but it is messy. Unlike forums, the comments appear immediately, so that's good if the activity is in real time. The forum takes 30 minutes for responses to appear. I agree a forum provides a participant record/accountability, which is quite handy if that is a component of the assessment/activity. Sometimes I wonder if good old email isn't the best for review work, because people get alerted at the time and the discussion can stay fresh. It's actually what I tend to do when asking for or giving feedback in my own collaborative work. When I've done wiki activities they tend to die straight after class because there are no alerts, so I've had to email students to tell them to look at my responses on the wiki.

    I'm going to look into the Workshop tool, but if it requires a lot of forward planning, it might not be the tool for me :)

    1. Hi Suzy, I appreciated hearing about your experiences. It's interesting because often complex activities like peer assessment work best with a technology that is familiar, easy to use, and supports the activity – so if email works, email is best! I also love the idea of the wiki but have not had the easiest experiences with it.

  11. I have been most actively involved in supporting the use of the Workshop tool and forums for peer review. I've also supported the Wiki tool, but it was (as someone has mentioned above) quite messy to use, and I prefer the Workshop or a Forum.

    I love the fact that forums allow students to carry on discussions that interest them, and so there is scope for the development of ideas past the bounds of the task – it's more organic. That said, it can sometimes be a little difficult to isolate discussions when marking, so I would say that's one complication when using it for assessment purposes.

    I find that the Workshop tool really needs to be explained clearly to both students and markers using it. It's more complex than (most) other Moodle tools, but if it's set up properly and students are given clear instructions, I think it can be really powerful. Students can give feedback and a mark for one or more of their peers, which exposes them to the ideas and writing styles of others, and gives them a deeper understanding of the marking and feedback process. When used as a feed-forward activity, it gives students a chance to actively respond to targeted feedback (and therefore a chance to improve their skills and/or knowledge and improve their grade).
    I would recommend that if you're a lecturer wanting to use Workshop for the first time you talk to your relevant Wattle administrator (Ed Support or Ed Designer) to get a handle on the phases of the tool and develop some clear instructions for students.

    1. Hi Rebekka. It’s great to hear how you have been using the Workshop tool and your experiences in that. Thank you for your advice and tips on using it.

  12. I always like to start with a no/minimal technology base when it comes to peer assessment. So I still do peer assessment for presentations where peers simply fill out an anonymous feedback sheet using pen and paper, or a report-based assessment where students simply fill out a checklist using pen and paper.

    I think good teaching and assessment doesn't usually need technology. In most cases, technology may speed things up or make things a little easier, but as has been said many times in these coffee courses, we shouldn't use technology for technology's sake – it must enhance and facilitate good practices.

  13. In the course I have taught with peer assessment, currently it does not use any computer-assisted technology. The assessment task is a group presentation, students watching are given rubrics with marking criteria, and space for comments, and they fill them in during and after the presentation. The teacher then collects these (anonymous) rubrics, and manually averages the scores and enters the final grade into Moodle. The presenters are able to get the comments and grades from the other students, via the lecturer (providing there are no offensive or rude comments). The biggest challenge with this method is collating and assigning grades to the students, considering there are approximately 15 peer feedback rubrics per presentation.

    I donā€™t think Iā€™ve used any peer feedback technology, either as a student or as a teacher, like the ones listed. I have heard about the workshop tool in Moodle before, but I have not yet had any occasion to use it. It sounds like the ideal tool for peer assessment on written work. I have used wikis for other projects before though, outside of a HigherEd context, and personally have found them to be complicated to use and difficult to navigate, although this may have been because of the people who set them up. However, making sure that any technologyā€”especially if it is likely to be new to students or teachersā€”is clear and easy to use is essential.
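    The manual collation step described in the comment above (averaging roughly 15 peer rubrics per presentation and entering a final grade) is simple enough to script. The criterion names and scores below are invented purely for illustration:

    ```python
    # Hypothetical sketch of collating anonymous peer rubric scores for a
    # group presentation: each reviewer scores the same criteria, and the
    # final grade is the mean of each criterion across reviewers, summed.
    from statistics import mean

    def collate_rubrics(rubrics):
        """Average each criterion across reviewers; return the per-criterion
        averages and their sum (the presentation's final grade)."""
        criteria = rubrics[0].keys()
        averages = {c: mean(r[c] for r in rubrics) for c in criteria}
        return averages, sum(averages.values())

    # Example: three reviewers' rubrics (a real class might have ~15).
    rubrics = [
        {"content": 8, "delivery": 7, "slides": 9},
        {"content": 7, "delivery": 8, "slides": 8},
        {"content": 9, "delivery": 6, "slides": 7},
    ]
    averages, final_grade = collate_rubrics(rubrics)
    print(averages, final_grade)
    ```

    Even a small script like this removes the error-prone manual averaging, though the anonymous paper rubrics would still need to be transcribed (or collected digitally) first.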

  14. Awesome Rebekka.
    It’s very helpful the way you have pen your experiences and your expertise on using workshop tool.
    Thanks for sharing your advice and tips .

  15. At the ANU I have seen academics use Wattle's wiki, forums and Workshop tool for peer assessment. As a student, I've used the wiki and discussion forum.

    Moodle/Wattle's Workshop tool takes a lot of time to set up and has a steep learning curve. But it's designed to handle and support peer assessment. It works well because it is organised into a workflow of phases, so the entire activity moves forward step by step. In every phase, there is a list of tasks to be completed, which is very helpful to the user. If you are using this tool for the first time, make sure you have ample time to test and explore the settings.

    The wiki and standard forum are simpler to set up but do not give the same structure as the Workshop tool. The wiki is a space where anyone can edit, so it is important that a template is set up from the beginning; otherwise people will stick everything anywhere. Make sure instructions or important information are somewhere that is not editable, or people might delete things. It is possible to recover items because the different versions are kept, but it can get messy. The forum is also simple but very linear, and in the case of Wattle, if a thread develops, it can be hard to search.

    If you are using any of the above tools, make sure you have time to test and to consult or work with someone who has used it.

  16. My super limited experience of peer assessment, other than the voting system in my last post, was a non-starter. For an undergraduate course in semester 1, 2017, students in tutorials were tasked with presenting their synopsis of a reading to the class, with another student tasked with assessing them against a rubric. The students doing the assessment resorted to ticking boxes in the rubric, lost the benefit of being able to concentrate on the presentations themselves, and did not provide any useful feedback (that I could tell). This approach was abandoned after week 3. I might perhaps introduce a different distinction: rather than talking about hi-fi/lo-fi assessment tools, I would distinguish between real-time assessment and after-the-fact assessment, e.g. evaluating someone as they present vs evaluating a (recorded) presentation separately. As anyone with committee experience would know, taking notes while actively participating in a meeting is almost impossible. So being able to assess and evaluate after the fact will make things easier for the assessor and will also provide space for more meaningful and thoughtful feedback.
