Assessment and Feedback

Day 3: Case studies and examples of authentic assessment


It’s always valuable to take a peek at what others are doing, whether to take inspiration or to affirm you are on the right track. Today we will look at examples of authentic assessments, provide additional resources for further investigation, and invite you to share some of your ideas and experiences. You may wish to choose one or two of the following to explore and discuss today and, if you like, come back later and peruse the rest.

Case studies and examples

Work Integrated Learning – Pharmacy Course 

Serrano Santos (2017) reviews an authentic assessment for pharmacy students at the Queensland University of Technology. Students were engaged in authentic learning experiences and assessments framed by particular models, such as the eight critical elements outlined by Ashford-Rowe, Herrington and Brown (2014), which we introduced in Day 2 of this coffee course.

Students completed work integrated learning (WIL) to give them an opportunity to apply their knowledge in a real-life practical setting and develop their analytical and problem-solving skills. Students took part in a small research project and submitted an abstract. They were also required to create a digital poster for a mock conference presentation.

“…these events provide a unique opportunity to disseminate knowledge involving visual and verbal communication as well as to demonstrate research skills and high order critical thinking” (Serrano Santos, 2017, p. 576).

Students were supported throughout the course to gain the skills and knowledge to perform tasks relevant to the Pharmacy field, and were provided with feedback from peers and industry professionals who had also been invited to the ‘conference’ – all making a solid recipe for authentic assessment.

Digital Narrative Production – Japanese Language Project (ANU)

Carol Hayes at ANU has written an article about her Digital Story Project, designed for students to “develop the ability to express themselves in Japanese by writing and performing creative/imaginative texts”. Students are scaffolded in stages to produce a digital narrative production. Here is the YouTube video of an appealing mashup created from the student work!

This type of authentic assessment is perfect for languages, providing an opportunity for students to develop their communication skills, culminating in a product.

Nursing students using augmented reality

It’s amazing what you can do with some great technology! Nursing students at the University of Canberra use a HoloLens in an augmented reality exercise in which they “examine” their “client”.

This authentic assessment allows transfer of learning and the application of skills and knowledge in an environment that simulates the actual professional or work environment, and helps students develop their observation and communication skills.

Hands-on, real-world approach – Clinical Skills assessments

A standardised authentic assessment technique known as the OSCE (Objective Structured Clinical Examination) has been adopted across medical schools to assess students’ ability to conduct clinical examinations, using work stations with authentic work tasks and simulated patients. The authentic tasks represent real-life clinical situations, assessing students’ ability to apply clinical knowledge and skills and perform under circumstances relevant to the profession.

Using Wikis and Blogs for students to collaborate on authentic tasks

Using online tools such as wikis and blogs, it is possible for students to collaborate in teams to undertake research and outline solutions – this is very useful for research or problem-solving projects. Pages can be allocated to students, groups or topics, and students contribute their research and thinking. Here is an article outlining how wikis can be used for authentic assessment (although please note that most of the linked examples it provides are no longer available).

Schools and universities will often have their students contribute to or create Wikipedia topics, collaborating to publish information. Here is a page which lists such examples. Here is an example of a university academic’s effort to have students participate in the Wikipedia space, used at the University of British Columbia in a gender studies topic. Wikipedia is an online platform where anyone can create an account and start a topic, or edit and contribute to an existing topic, and as such it can be a valuable educational resource.

Unfortunately, many educational institutions relied on a site called Wikispaces to create wikis for their students, but Wikispaces closed its platform a year or so ago and these wikis are no longer available (one of the risks of using external platforms). Moodle has an OU Wiki plug-in, which is a useful tool for student collaborative assignments within an institutional e-learning environment.

Blog sites like Blogger.com and WordPress are still available for free for individuals and groups, and it is possible to set up a blog in a very similar way to wikis, with pages for topics, groups or individuals to collaborate on sharing and publishing information. For an interesting article with examples of blogs in higher education, see this collection of archived examples of “collaborative writing” by the Chronicle of Higher Education.

ePortfolios for showcasing reflection and other skills 

Portfolios have a long history of allowing students to show their skills and knowledge in a different format from the traditional essay or exam. The advent of the ePortfolio has expanded this idea to that of collecting examples of skills, achievements and reflections throughout a student’s learning journey. It is particularly relevant to authentic assessment, as it can be an authentic showcase of the student’s transferable work-related skills, knowledge and attributes. There are a number of examples, as well as a discussion of ePortfolios, on the University of Waterloo website.

Discussion

Please share your ideas about any or all of the following:

  • Do any of these examples provide you with ideas or inspiration? Have you used a similar approach to assessment, perhaps involving Work Integrated Learning, online collaboration, or a presentation of some type? If so, how did it go?
  • Considering the diverse backgrounds of students, what did you implement to support students to develop the types of skills, understandings and attitudes needed to perform well? Do time pressures and an ever-increasing workload impact your capacity to factor in time and space to foster dispositions for learning in your course and assessment design? What are some of the challenges you have faced when designing and implementing authentic tasks and assessments?

References  

Ashford-Rowe, K., Herrington, J., & Brown, C. (2014). Establishing the critical elements that determine authentic assessment. Assessment & Evaluation in Higher Education, 39(2), 205-222. DOI: 10.1080/02602938.2013.819566

Serrano Santos, J. M. (2017). Design, implementation and evaluation of an authentic assessment experience in a pharmacy course: are students getting it? 3rd International Conference on Higher Education Advances, HEAd’17, 574-583. DOI: http://dx.doi.org/10.4995/HEAd17.2017.5294

 

31 thoughts on “Day 3: Case studies and examples of authentic assessment”

  1. The use of the Microsoft HoloLens to teach nursing at University of Canberra looks interesting. These units use Augmented Reality (AR) where the synthetic image is overlaid on the real world. I have tried a unit briefly, and found it much more comfortable than VR units which block out the world. This might be used by a group to work on a shared virtual object.

    Next week I am being thrown into Work Integrated Learning. One of the ANU TechLauncher tutors is off sick, and I am standing in. Four teams of students will have their projects “audited”. Their client, the other teams, and the tutor examine each team’s work. We look in their online repository, read reports of individual contributions logged in the system, listen to a presentation, and query the team members. We then each rate the team’s efforts, and provide suggestions. This all goes into a bespoke system, and a mark for each team comes out the other end. The team members then rate each other’s efforts, dividing the mark between them.

    One issue we always have with team work is the differences between domestic, and international, students. It is a little curious lumping students together as “international”.

    With ANU TechLauncher there is time in a year-long project for students to do something realistic. However, the resources needed to find enough real projects, and manage them, are very high. Some universities try to cut costs with virtual projects and clients, but these lack the messy realism, and the white knuckle ride, of a real project. 😉

    1. Hi Tom,
      The WIL TechLauncher project you are being thrown into sounds exciting. I like the inclusion of peer assessment for team members, I wonder if that helps mitigate some of the issues that occur in group work? Is there much provided to students in the way of expectations for group work, guidelines and/or opportunities to develop skills for working collaboratively?
      Thanks for your comment on the issue of resourcing real projects, this is indeed a major challenge!

      1. Amanda, at times ANU TechLauncher is exciting, but it is mostly a lot of hard work, and often frustrating. The peer assessment is a source of friction for team members. We had to ban teams giving everyone an equal score, as that way they avoid confronting the issue, and don’t learn from it. There is very detailed written documentation provided to students on group work and guidelines, and additional training offered (search for: “TechLauncher Course Outline”).

        One resource is Stephen Dann’s Lego Serious Play exercise. The programs run by the Canberra Innovation Network (CBRIN) for entrepreneurs are also useful. However, the challenge is, as always, getting students to actually USE THE RESOURCES.

        It is frustrating to prepare excellent material which hardly anyone reads, or to arrange for a world-class talent (such as Stephen) to put on a workshop, but then have hardly any students turn up. You can make it compulsory, so they turn up, but are they actively engaged? My approach is to use small doses of assessment on each activity, but that can undermine the WIL ethos.

        1. Hi Tom,

          No doubt it is a huge amount of work and effort. Kudos to everyone there! I have had the pleasure of doing Stephen’s Serious Play workshops at EdTech Posium and TELFest (Hi Stephen!). What a wonderful resource for students, together with CBRIN and the other training and resources you have described. It must be very frustrating when students don’t realise the value of these opportunities. It is always interesting to consider what effect making something compulsory has. It does indicate to students that it is important, but as you say whether or not they value and engage with it is another thing. I appreciate your ‘small doses’ approach. All the best with the rest of the project.

          1. Hey people! *waves*

            It’s interesting in that Lego Serious Play, being an industry-based protocol, has had so many of the industry reps who have encountered it in one of the teaching exercises get very excited, while the students are often much less excited. Like, I just did a workshop series for CECS/CPAS Professional Practice 2, and the students were “It’s a classroom exercise” up until the industry rep was very excitedly explaining how this is the sort of thing they pay thousands of dollars to access, and here it is, in the room, for the students, for free.

            Some days I am amused by the students who get very frustrated with the LSP sessions because they’re authentic exercises; that set of students are very much of the belief that there is “The Real World” and there is “Education”, without any overlap in the Venn circles. That student who is all “But if you’d told me it was a real world thing, I would have taken it seriously” is a constant source of amusement for me in dealing with MBA student types.

  2. All of these examples sound really interesting and though I haven’t personally used any of these approaches, one of my friends was part of a sustainable development course where they were required to work in a group to create a wiki. They all selected the topic of the wiki and then had to decide who would work on each part (defining the problem, listing the issues and consequences, determining solutions, etc.). I thought that it was a great idea as it allowed external students (of which my friend was one) to collaborate with the in-class students on a more equal footing. The students all provided feedback to other members of their group and then they were able to adjust their part of the wiki with their peers’ input in mind. This seemed like a great way to actively deal with feedback, as I often find that feedback can be rather static when just given at the conclusion of a project. This way the students are able to think more critically about how they approach the feedback, whether they accept it or not, and how they can implement it, and can see how it improves their writing.

    The only problem with these types of assessments is that they do often seem quite time-consuming (for both teachers and students) and are a lot more interactive than traditional learning tasks, which could hinder some of the more introverted students initially. One way to counteract this could be to split the task into a few sections to allow the students to feel more familiar and comfortable with this type of learning experience; however, this could also just add more time onto the project. I guess it’s all about acknowledging the difficulties and finding a good balance.

    1. Hi Sarah, thanks for this great example of the use of an online tool, the Wiki, to enable off-campus students to have a meaningful collaboration experience with face to face students. The amount of work it takes to create these more meaningful and engaging activities and also for students to participate in them is certainly an issue. I guess it is all about working within whatever limitations we are faced with, but it takes a lot of creativity and ingenuity sometimes – this is where discussion and collaboration between academics might be helpful.

      Also, the interactive and collaborative nature of group work can be off-putting initially to introverted students, as you say. Your suggestion of scaffolding to allow students to gradually increase their skills and confidence is one way of dealing with this.

  3. I agree with Tom, it is really frustrating when students don’t take advantage of resources which are usually the result of long and hard work. When asked, many students claim that, being busy and time pressured, they choose learning resources only if they deem them useful and relevant. The question arises as to what makes a resource relevant. How do students decide/choose?

    1. Hi Krisztina & Tom, these are great questions! I wonder how students evaluate what resources are useful and relevant in their decision making? Do you find that there is usually a difference between what you as the teacher deem to be relevant and what the students deem to be relevant? My instinct is that students might be selecting resources that are oriented towards assessment. As Tom indicates, this can make things a bit “inauthentic” as students are focused towards the marks?

      I did work on a course ages ago at another university (a third-year social science course) where the students were responsible for finding and discussing relevant readings for each week. I believe it rotated around so that each week there was a group responsible for locating a relevant resource, coming up with discussion questions, etc. This gave the students a lot of agency in terms of what they studied, but if I recall many students struggled with such a self-directed approach.

      I have led some course design/mapping exercises with academics before using a model designed by UOW, where all resources/readings have to be explicitly attached to either an activity or an assessment, so that the material is always directly connected to something the students need to do. This was a real challenge for those teaching teams because they had a lot of content they needed the students to get.

  4. In my anatomy teaching I’m using a ‘book’ on Wattle, which allows me to use Chapters (pages) to keep relevant material together. E.g. a Chapter on the spinal cord would include small videos of important concepts, my lecture notes, the practical class notes, and a link to an online lesson and/or quiz, and any additional information or resources that are relevant to that chapter. I find the students appreciate this organisation, as it guides them through the material. However, whether they use the resources or not is the issue. I would have a number of pre- and/or post-laboratory quizzes, and a chapter quiz to test whether they satisfy the learning outcomes (all formative), and when I look at the analytics, I would find that only 50-60% of students open the activities, and even fewer of them complete them. So I started to open and close the online quizzes at certain times to try to make the students engage with them at the appropriate times (complete before the practical class, to come prepared), and yet some students were clearly not using these, and came to the prac class unprepared and, in my opinion, wasted their time, as they could not possibly engage with the practical material or the demonstrators effectively.
    While many students appreciate our help, others look at the ‘extra’ resources and material as a burden. I do explain to them that compiling these materials helps them to gain the basic required knowledge without the need to explore and find their own resources, yet some still choose not to follow this path. This is particularly interesting because, as Katie mentions above, some students struggle with self-directed learning. Of course, it comes down to individual learning styles and approaches, whether they like to find their own resources or want to rely on academics to give them everything prepared on a plate. So really, in an ideal world we would provide individualised education, but we still need to reach common, expected outcomes, which of course means setting assessments which can reliably test the students’ abilities and skills, rather than whether they read a particular paper or completed a particular activity or course. My professors had a saying: “You do not need to study, you just have to know”, so maybe I should not be concerned about how my students get there, as long as they can prove they are there at the end. Sorry, instead of giving examples, I became a bit philosophical, but maybe we do need to consider our educational approaches a little from this angle, especially in view of the rapidly changing student attitudes, university environment, and demands of the workplace, to decide which of the wide variety of educational and assessment tools are most useful in our discipline and what are the best ways to use them in our classroom.

    1. Hi Krisztina, a bit of philosophising is always welcome! I think you’ve hit on some of the wider issues that connect authentic assessment to trends in higher education more broadly, which we’ll be discussing in more detail in the last post (Day 5). I struggle with this same issue, I think – balancing my own knowledge, expertise and hard work in designing a course in a way I think will be best for students, while also spending a lot of time (in my head) reminding myself that students choose how and when they engage, and that’s okay too! I’m still working on this cognitive dissonance.

      1. Although I should add that our team tried to design the coffee courses to allow for people to engage as much or as little as they want and have all options be okay! Our own philosophy is to reward contributions of any and all types and try to make the barrier to participating as low as possible (hence the certificate to acknowledge substantial contributions). But we have the benefit of working free of degree requirements!

  5. These case studies are interesting, but I wonder if there is a tendency to confuse technology/delivery/method with authenticity. A wiki and collaborative platforms are not authentic simply because they allow people to contribute to a group process or product. You can write on a piece of paper together too, but neither method assures any link to a criterion situation. Similarly, an eportfolio is not authentic in and of itself. Actually portfolio assessment has been around for a very long time, especially in K-12. It just means that the student gathers together a series of different tasks over time which show their learning, and often students are allowed a kind of constrained autonomy in compiling them. These methods don’t necessarily mean the learning is linked to the real world. A portfolio, and an eportfolio can include reflections (which we might also call essays!), quizzes, tests, role-play reports or videos and many other things. The discussions of authentic assessment seem to be pitted against things like multiple-choice tests, ‘merely’ or ‘simply’ rote learning or ‘regurgitating facts’. This is misleading. Actually there is nothing simple about remembering facts and there is also nothing simple about writing an essay on a well-conceived topic or doing a well-designed multiple choice test. I think we need to think a bit more critically about how we talk about these things. If we are scaffolding students towards a broad and deep knowledge of a subject, we may want to include some content-heavy methods, such as multiple choice tests as well as real-world experiences (application-heavy methods) such as the great ideas in the materials and discussions for this coffee course. I think authenticity in assessment is important where it’s constructive and appropriate, but I’m wary of the discourse around it which is sometimes a bit misleading.

    1. Hi Susy, thanks for that great point about what I might call “fit-for-purpose” authentic assessment. As with technologies, these approaches can often only be successful if they are integrated carefully with a range of other strategies. Similar to your points, I think applying more “traditional” assessments such as multiple choice is valuable, especially in assessing content knowledge, with authentic assessments added afterwards for assessing evaluation, application, critical thinking, and teamwork skills, for example.

    2. Hi Susy and Katie,
      I agree we need a balanced and considered approach. How the balance is measured depends on the context and field of study, for sure. Your points about tests and rote learning got me thinking a bit more. My thoughts are that if we rely heavily on multiple choice tests or high-stakes exams, we miss out on opportunities for students to demonstrate higher order thinking and creativity, the sorts of things we would see towards the apex of Bloom’s Taxonomy. The extent to which memorising, defining, duplicating or explaining facts is important, and whether this represents the desired qualities of a professional in the field, depends on the field of study.
      My thinking about what would make the use of technology appropriate in an authentic assessment is whether it is something students would be likely to encounter in the ‘real world’ or workplace (this type of technology or tool, or something similar), and whether it effectively supports the elements of authentic assessment such as collaboration. I think the idea with ePortfolio is similar, and also that an ePortfolio itself is something which could be used beyond university or might be similar to what is used by professionals in the field. It definitely extends the idea of traditional portfolios, which still have their place; however, in many instances an electronic portfolio is beneficial when working in digital global environments.
      I’d be interested to hear suggestions people may have about what works well for them and what doesn’t. Thanks for all of the great comments so far!

  6. On the subject of Lego Serious Play, and authentic exercises, one of my favorite parts of this process is when the students in my own subject no-show the LSP session, then complain that they didn’t get the benefit of the workshop. Honestly, there is nothing more authentic in the world than opportunity cost, but that lesson doesn’t translate well to the SELTS

    Same deal with emarketing – asking students to run social media accounts, to build products, to set themselves a goal for a semester (e.g. what they want to achieve with the social media account), and to reflect on the achievement (did the KPI get met/not met/exceeded?). All up, a very intensive mode where I was needing to observe 30-40 Instagram accounts, a dozen blogs etc… and among those willing to fill out the SELT, the students who hated it outnumbered the ones who liked it, so I got the metaphorical rolled newspaper to the nose for my assessment expectations and tasks.

    Authentic is effort-intensive, and I will be directed by my head of school/ADE/Dean to prioritise inauthentic, low-time-cost, bulk-processable assessment, because time spent marking is time not spent publishing in A* journals. If anyone can lowball low-budget authenticity, let me know.

    1. Stephen I really appreciate your reflections on the impacts your experiences have had. Hard to appease students who aren’t able to make the most of what you offer. I’m curious if the SELT results are consistent over time with different cohorts? There’s usually resistance to something new and different, especially when it makes students work hard or think differently. I hope you have the opportunity to continue developing these types of tasks again, perhaps with some adjustments. It would be interesting to see if SELT results shift at all. The pressures, cost and workload you mention are very real issues. I hope these Coffee Courses and other types of opportunities for colleagues to network, develop and share ideas will lead towards solutions and greater valuing of (and resourcing for) teaching. Best of luck!

      1. My SELTS experience is a fairly consistent pattern – the first year of the innovation is uncertainty, and a fair bit of “We do not like this new thing, no indeed, the new is scary”. A second year of the same format does well, and gets a much better score, and the third offering is consistently lower than the first year. It’s like “New thing, do not like”, “Acceptable now we expect it”, and “Boring, this was done last year”.

        The irony doesn’t escape me when I teach innovation in marketing through authentic assessment and get a “But we’ve done something like this before”

        1. Hi Stephen, I’m having some difficulty finding the citation for it right now but I do recall reading a paper that indicated that your experience is a common one – that student evaluations often drop when a new, innovative approach is introduced. In particular, the paper found that students who were told “this is a new, innovative approach!” reported dissatisfaction with this, and when the same approach was used in a future year and students were not told it was anything different, they reported higher satisfaction. I will keep hunting to try and find the citation!

          Interesting to hear that the third time it is offered, the feedback lowers again! I wonder what implications this might have – interesting to think about how universities might approach this. Innovate every (other?) year?

  7. Lots of experiences and reflections on student motivations that I completely share. More than frustration, I am genuinely confused as to why students voluntarily engage in higher ed when they firmly believe that they already know more than their educators. As the proportion of students sharing this mentality increases year upon year, I find it harder to constructively align content and activities with learning outcomes.

    I include this cohort of students when considering diversity in my classrooms; different approaches are certainly required for students who are intrinsically and externally motivated. Increasingly, I am finding myself needing to explain my pedagogy, or why a class is designed the way it is, simply to get students to engage with the materials, activities, and outcomes. I don’t mind doing this at the beginning of the semester, but it starts to eat into a significant amount of time when students require an explanation every single week. Even then, I am met with condescension and “this is a waste of my time”, etc., only for them to complain later on that they don’t get the basics.

    Wouldn’t it be nice if students understood that by choosing to enter higher ed, they are signing a social contract to come with an open mind? That, therefore, every single interaction shouldn’t be a fight to simply get your attention, and further, that there is no place for destructive attitudes?

  8. This comment is a bit wide of the initial question but it seems to be where the conversation thread has gone. I agree it’s incredibly frustrating when students don’t take advantage of the opportunities and resources that we supply, particularly when we go out of our way to be innovative and creative in our approaches. To combat this I make as much as possible in my course compulsory or assessable – tutes are compulsory, you need to submit a reflection on the week’s readings at the beginning of the tutorial thus guaranteeing students have actually done the readings, etc. I do this because it finally occurred to me that it’s actually unfair to ask students to constantly have to use their will power to do things that aren’t officially required of them. If we create a system whereby some things are assessed and others aren’t, and students are trying to balance a bunch of other courses, work commitments, budding romances, uni societies etc., then we can’t blame them when they prioritize doing what is assessed and leave everything else – we are literally sending them a message that says this is not as important as the other tasks because it is not contributing to your final mark. Someone suggested that when students focus on marks the task becomes less authentic – but are not most employees hoping to get promoted, get bonuses, get tenure-track jobs etc.? Having your eye on the prize while doing a task is perhaps the most authentic of all experiences!

  9. I bet the students get a lot out of the Japanese Digital Story project! I’d love to watch more of them! Is there a place where they are all stored? I really like the idea of creating something useful that can be built on each year. I was always very dissatisfied with exams because we weren’t allowed to see our papers afterwards to find out what we had done well on and what we hadn’t; we could only guess. Essays at least would be returned to us with some feedback, but even then they are for our eyes only and never see the light of day again. Wouldn’t it be great if all that time and effort was put into something that would be useful for all participants and that they could come back to over the years? When I was studying Primary teaching at UC, one of our assignments was to create a large poster explaining a mathematics concept which was randomly assigned to us. We researched our topics, designed our posters in PowerPoint (I had hated PP, but this assignment made me realise it has many handy graphical features!), printed and laminated the poster, wrote a rationale and presented it to our own classes. It was great to see what my classmates had done but I wished I could have seen the posters of every student in the course! It would have been fantastic if the convenor had arranged a gallery of all the posters, and even better if all the PPs were uploaded into a repository by topic so year after year teachers could return to the poster library and use them with their classes! This could also be done with all sorts of finished products!

    1. I remember that poster assignment Rowena! Sadly my poster was never very useful during my years as classroom teacher (it was a bit of an obscure topic which never came up for any of the year levels I taught) but I wonder if others did make use of theirs at all. I think your idea of a gallery or repository taps into the importance of building in different opportunities for feedback and reflection as well as giving the work a life beyond the assignment due date.

    2. Hi again, Rowena! Again, your comment stood out to me 🙂
      I agree, the Digital Story Project looks like great fun for students – and for the teacher as well, I imagine! And this also resonated with me: “Wouldn’t it be great if all that time and effort was put into something that would be useful for all participants and they could come back to over the years?”. When I worked as a TESOL teacher, I strived to make this happen when I could. Here is an assignment where I felt this worked particularly well:

      TASK: Write a short biography.
      TOPIC: A person of your choosing. e.g. a family member, a celebrity, yourself
      CONTEXT: Up to you. You could pretend to write this for a local magazine celebrating an important birthday/anniversary. Or for a website or blog (e.g. GoodReads.com or Biography.com). Or perhaps for your university website or LinkedIn profile? Whichever one you pick, make sure to specify this in your post.

      Of course, I had some examples (which we analysed in small groups) and an assessment grid for peer feedback ready. Anyway, what students really liked about the assignment was that they had a few options to choose from. A number of them went on to use the text they wrote on their LinkedIn/ResearchGate profiles – which I thought was great!

      1. Hi Melde, thanks for sharing this idea. I think providing some examples can be really helpful to students to help guide their thinking. Did you find any students who struggled with choosing in that context? I have found, when giving students a range of options, that they can feel overwhelmed or unsure how they will be graded when all the assignments are different.

        1. They seemed to have an easy time deciding between the three categories (which were, roughly speaking, professional, family, or celebrity biography). I had them brainstorm in class, and most students left with a rough idea of who they wanted to write about.
          I don’t think that grading was something that affected their choice or caused anxiety, I think because they were adult learners with jobs or full-time study. The reason I had different options is that my students had different motivations: some people wanted to improve their English for work or study, while others had an interest in the language or culture – and I wanted to cater to that. It’s also important to note that our school did not issue grades, but used the Common European Framework of Reference for Languages (CEFR) in all its assessment. So qualitative feedback rather than percentages.

  10. Hello!!

    The case studies are interesting and some of the examples proposed are brilliant.
    Personally, within my discipline, I reckon WIL, and maybe blogs and the ePortfolio, would be the most suitable examples for archaeology. Although we try to introduce AR and similar techniques into our archaeological research, I don’t see how it can be integrated in the classroom (yet).

    I have experience with blogs and portfolios. The experience with a blog/forum in Wattle didn’t work too well, as you always get the same students contributing while others never comment. However, I think it is a good option for students who for any medical reason are not able to speak in public or struggle to present their research in front of an audience.

    I’ve been doing portfolios for 2 years. The way I design them is:
    – We have tutorials/laboratories where different topics or techniques are covered
    – Students get handout material with data about the topic and tasks that they complete during the laboratory/tutorial
    – At the end of the portfolio, there is a take-home question that they submit as part of their assessment. By the end of the 6 weeks (the portfolios seem to work better if they are submitted in two batches), they submit their portfolio in Wattle and they receive feedback.
    Portfolios seem to work pretty well and they seem useful for students to reflect on the techniques and topics that were covered during the tutorial. Some of them struggle to address the questions proposed, but I reckon this is more related to them not asking for advice, or to me not being clear enough in the description of the tasks. I tried to solve this issue by explaining the task during the tutorial and by giving them constructive feedback. I have to admit that their performance is improving during the semester.

    In a course I’m convening this coming semester, I will implement WIL in a similar way to the one proposed in the first case study. My assessment has been designed in two steps: first, students have to give an oral presentation on a topic chosen from a list of proposed topics or one of their choice; second, the topic will be presented as a poster in a final “fake conference day” to the rest of the students. I’m looking forward to seeing how this goes!

    1. Hi Sofia,
      In regards to your comment about students not always engaging in the blog/forum, I agree that it is still worthwhile to provide this type of tool. As you mentioned, it can provide great practice and help develop the confidence and communication skills of learners with diverse needs, learning styles and backgrounds.
      All the best with your WIL assessment, it sounds great!

  11. These are all very interesting examples of authentic assessment, which are often familiar in the world of medicine – from OSCE examinations to practicing trauma and resuscitation scenarios in complex and lifelike simulations. Part of the complexity of these sorts of assessments is that they require students to be able to apply and integrate knowledge, often in a rapid timeframe. Whilst this might be the target goal for a course, or even a degree, I think part of the difficulty of providing authentic assessment is that it may not easily be suited to all levels of knowledge and understanding. For example, much of what is taught in the earlier stages of medicine, or in general anatomy, requires students to have an understanding of general principles, but may also require memorisation of the names of muscles, characteristics of certain bacteria, or other important pieces of information. So, whilst the aim of the course might be to help students achieve a good practical working knowledge, at the start there are some basics required to reach this level. I think this is why medicine is able to initially start off with more traditional styles of assessment, such as multiple choice and short answer exams, and move eventually to examinations that are all authentic assessment style scenarios and oral assessments. And one of the greatest benefits that medicine may have in planning assessments compared to other courses is knowing that it can plan assessment for the same set of students over four years, rather than just a six-month course.

    We’ve previously trialled mock examinations early on for first year medical students, and if you run them too early in the year when students feel relatively “unprepared”, they can have poor uptake and just create much angst and concern amongst students. As we are often trying to stretch their knowledge and see if they can apply the basics of what they have learnt, some students appear not to want to try for fear of their perceived “failure”.

  12. These are good points Lucy. You are correct that at the start of the medical course we tend to go more ‘traditional’, using traditional assessments to test students’ understanding of basic concepts and ability to recall facts. Although we try to put the questions in the context of clinical practice, as you say, students still need the basic knowledge to be able to use it later, in the more authentic situations of later years. BUT we have a problem: the majority of students look at these basic studies as unnecessary, not understanding their relevance, and therefore we often receive low ratings in student feedback. Clinical skills teaching is always appreciated by students, because they can see the direct link between that and clinical practice. In the meantime, it seems that when mock examinations are offered, students don’t take those up for fear of failure. Which makes me wonder how students perceive the role of assessments. Having mock exams would help them to evaluate their own learning and to adjust their study approaches if they feel something has not worked as well as they hoped/thought. Similarly, why would you not attempt quizzes or formative tests (online or otherwise) to give feedback to yourself? I wonder if students look at exams as feedback to their teachers only. I was always amazed how many things clarified in my mind during an oral exam (summative) when I was at university. The stress of the exam, the need to prove myself and to show what I knew led me to the final points of learning, when all the puzzle pieces dropped into place at the same time. It was a fantastic feeling (despite the fact that I was a nervous examinee). Again, I went into philosophy here, but maybe we should explore how students see assessments – not what types they are, but why they are there, or why they are necessary. Maybe once we have this discussion, students would better understand what we try to do (being very optimistic here).

    1. Krisztina this is such a great point around how students understand the role of assessment (versus how the teacher might perceive it)! I may be being optimistic also, but I think hopefully as teachers we can work on being more clear with the students around expectations and goals for different assessments which might help? I wonder if it’s a time management issue for most students, i.e. only prioritising assessments that are formally graded / worth more marks? Has anyone else had similar issues?
