Assessment and Feedback

Day 2: Principles of authentic assessment

Image of street sign which reads 'AuthentiCity' by Gerd Altmann from Pixabay

Let’s get real

As we employ techniques, tools and educational technologies to get students active and engaged we must also consider how our assessment of their learning may impact their levels of motivation and engagement. How can we spark their curiosity and thirst to know and learn? Authentic assessment could be a good way to give students opportunities to apply what they have learnt through connecting with others and the ‘real world’ by providing a product or service to the wider community. This may help students see the greater purpose to their study and gain increased motivation and satisfaction.

What do you think about this, and do you think there are other benefits to authentic assessment?

Elements and principles

As proposed on Day 1, authentic assessment tasks should reflect the challenges of a real-life work context. It’s worthwhile reiterating that the design and nature of an authentic assessment will vary depending on various factors, including the field of study. So how do you know if it’s authentic?

Let’s take a look at some of the defining elements as set out by Ashford-Rowe et al. (2014):

  1. The assessment should be challenging (beyond reproducing or regurgitating rote facts and knowledge)
  2. The assessment should culminate with a performance or product
  3. The design of the assessment should ensure transfer of learning and application of skill and knowledge (in the workplace or across different content areas)
  4. Metacognition is demonstrated, by means of critical reflection, self-assessment or evaluation 
  5. The assessment should be a product or performance that could be recognised as authentic by a client or stakeholder
  6. The assessment environment is true to the actual professional or work environment including the tools, and language used
  7. The assessment is formally designed to provide opportunities for discussion and feedback 
  8. The assessment places value in collaboration including communication and team work (often critical in the work place) 

Going beyond

Image of a person stepping outside by fr_golay from Pixabay

According to Collins (2013), authentic assessment requires authentic learning experiences, so we can’t look at assessment design in isolation:

“The learning needs of today’s students no longer fit the traditional model. Rather than simply learning facts and basic skills, they need to acquire more complex skills in conceptualisation and problem solving. They need affective and metacognitive skills, and the capacity to work collaboratively and to work across disciplines. They need the dispositions required to pursue such learning. They also need learning experiences of the kind of tasks that they may expect to meet in adult life. Such learning requires authentic assessment, designed to demonstrate their grasp of the skills and competencies needed to address real-life problems, and formative assessment, or assessment for learning, designed to provide learners with feedback on their progress to inform their development.”  

Claxton (2018) advocates for supporting students to develop attitudes and habits such as curiosity, resilience, adventurousness, resourcefulness and independence, which influence not only learning and grades but also long-term success in broader life. Although Claxton’s article is written with younger students in mind, these attitudes and habits are applicable to higher education and have implications for our course and assessment design.

Our assessments form part of a larger picture: they sit within our course design, which in turn sits within the context of your particular field, the vision of the college, school or centre, and the broader policies informing them. Collins (2013) discusses the need to consider not only the content of the course we are teaching but also how learning and thinking can be applied and transferred. An underlying principle of authentic assessment, then, is to go beyond the core content of our course and foster dispositions for learning and skills for life, such as problem solving, collaboration, communication, reflection and evaluation, in order to provide authentic learning experiences and authentic assessment.

Discussion

Reflect and share your thoughts on any of the following:

  • Are there any elements of authentic assessment or dispositions for learning discussed in today’s post that you feel do or don’t suit your particular discipline, teaching philosophy or pedagogical approach? (Click here to read about different pedagogical approaches; although it’s not framed within a higher education context, it might be a helpful refresher to get you thinking about your approach) 
  • Tell us about an “authentic” activity in your profession or discipline e.g. law students conduct a mock courtroom role play. Is the activity a general learning experience or an assessment item? What sort of skills or dispositions are necessary for students to succeed in this “authentic” activity?

 

References 

Ashford-Rowe, K., Herrington, J., & Brown, C. (2014). “Establishing the critical elements that determine authentic assessment”, Assessment & Evaluation in Higher Education, 39(2), 205–222. DOI: 10.1080/02602938.2013.819566

Claxton, G. (2018). “Deep rivers of learning”, Phi Delta Kappan, 99(6).

Collins, R. (2013). “Authentic assessment: assessment for learning”, Curriculum and Leadership Journal, 11(7), May 2013. http://www.curriculum.edu.au/leader/authentic_assessment_assessment_for_learning,36251.html?issueID=12745 Accessed 15/03/2019.

UNESCO International Institute for Educational Planning (2018). “Effective and appropriate pedagogy”, Brief 3, 29 March 2018. https://learningportal.iiep.unesco.org/en/issue-briefs/improve-learning/teachers-and-pedagogy/effective-and-appropriate-pedagogy Accessed 31/03/2019.

28 thoughts on “Day 2: Principles of authentic assessment”

  1. I have an “authentic” assessment which requires students to self-examine their own skill set, and assess how they would be able to present themselves to the market as service providers. At the same time, I offer an ‘inauthentic’ 3000-word essay on the theory of service marketing. Students are free to pick, and the inauthentic goes post-Pareto with 90% uptake.

    For the authentic task, students need critical self appraisal, a level of self-reflection, application of theory, and application of practice (And largely they leave it the hell alone and go do the one that they can look up references on Google Scholar.). It’s also set to be a feeder assessment that can transfer over to the Entrepreneurship subjects or Marketing Strategy, as you design yourself as product/provider of products, and it’s a self-starter / self-startup design exercise. So, y’know, all authenticity all the assessment task.

    1. Hi Stephen, that’s an interesting response from the students. Do you think it is feasible to deny them the choice and provide them with some elements of reflective/essay-type writing within the main practical task? To me, the practical task sounds so much more interesting and useful! Nothing wrong with ensuring they understand the theoretical aspect, but can this be woven into a practical task?

      1. I mean, we can deny students anything we fancy. (We control the horizontal and control the vertical). In this case, the choice always existed because the subject played a role as an honours filter subject, and we ran an academic apprenticeship assessment stream, and an authentic stream, and most people went for the harder of the two gauntlets in the lit.review/academic apprentice stream.

  2. I wrote late yesterday about my “authentic task” in Phys2020. For 20% of the course mark, students are told to choose and “do” a mini project: (1) build a Stirling engine, (2) analyse a drinking bird toy or (3) a Moka Pot, or (4) pose/answer a question that involves a web-based simulation of Lennard-Jones particles (effectively a simulation of liquid, solid and gas, along with presets such as a water balloon, ratchet & pawl, amongst many others).

    The students are given websites, a collection of components to assemble, access to 3-D printers and MakerSpace, Arduino microprocessors, sensors & amplifiers. Also important is “the community” – students will find it beneficial to share thoughts/problems amongst themselves, but also with MakerSpace staff, CSAs and the lecturer.

    The skills required are of their choice. The ability to plan, execute, and assess is central to all. (I advise them to spend 20-30 hours on it, after they’ve decided what to do). Some make use of coding skills (Python is popular), learn/implement Arduino, or learn/implement facilities in MakerSpace. They all need to provide a short report (written or video). In my first year of doing this, one student’s video report was a skit/play written around her MakerSpace construction! So I want students to dig deep and showcase their own skills, and be able to learn new skills to get the job done – the job they had lots of input in choosing. (As this is thermal physics, there are common themes to these mini-projects that reflect the course content).

    The biggest problem is assessment. The possibility of low marks is detrimental to students trying new things or being bold. (I tell them to put in 20-30 hours, and the report/video can conclude that their project didn’t turn out as they thought). I provided a “rubric” which allowed for 20% of the mark to be given for creativity (as some pointed out, this is subjective…). It’s due (70 students) in week 10.

    1. That sounds like a great assessment task, reflecting much of the complexity of real life. The issue of the high stakes mark for such an assessment can be problematic – inevitably it will bring negative student feedback as they find multiple reasons why it is unfair. Breaking it into components with different values (as you seem to have done) can help mitigate this. Maybe even putting most of the marks on a reflective writing piece about what they achieved/didn’t achieve through the experience of the tasks.

      1. Yes, indeed last year, there were significant irritations about the assessment. This year I took advice from John Debs, who administers the MakerSpace, and made the rubric that I will use as a checklist for marks. This is the breakdown of marks – perhaps the marks are too easy to get (whiplash from last year!)
        1. Does the report/video introduce the task and show your conclusions? (0-2)
        2. Does the report explain your method, using graphs, illustrations, physics-based explanations? (0-4)
        3. Does the report demonstrate that you understand the underlying physics? (0-5)
        4. Is the report clear, organised, easy to follow and within length limits? (0-5)
        5. Does the report demonstrate your innovative, inspired or imaginative questions, workings and/or solutions? (0-2)
        6. Is the video/report itself uniquely creative? (0-2)

        I think (hope) this will alleviate the disruptive terror of assessment. Some of my colleagues may think that this rubric is too lenient (?). I told students that the last 2 rubric points must be worked on from day 1 of your project – these are not points that you first address when you start writing your report/video. (Also the video and report can contain “video clips” much like a “scrapbook” of important project activity.)

        I think the assessment is really important and I don’t know if this the correct approach….

        1. Hi Edie, I am very interested in what other teaching academics participating here have to say by way of feedback about your rubric. I think it is great, as it is written in plain English and there cannot be any real problem in interpreting what is required. Even though it is written simply, the tasks themselves are quite complex and appropriate for a university-level assessment, in my estimation. But I’m not a teaching academic, so I would be keen to hear what others think.

    2. Ages back I ran a computer simulation exercise: 20% game result, 20% documenting what happened and why. It clustered to the middle of the distribution like you’d not believe, because the students who were doing badly in-game suddenly had a story from the autopsy/post-event review that could show where they tanked it and what they could have done differently. Students who did well in the simulation often struggled to explain how they attained success, because post-event success rarely gets the same post-event debrief. “You did good, do it again” versus “Now can anyone tell me what went wrong?”

      Class said they’d never had a learning experience so confronting in nature, and loved it, but it put the bell curve through an industrial mincer. That said, getting people to unpack success as an authentic experience is quietly surreal – we rarely examine what goes right anywhere near as well as we analyze failure, so are we teaching an ‘inauthentic’ and novel experience when we do something better than industry?

      1. Stephen, fascinating points! I don’t have an answer, but they are most interesting and important to ponder.

  3. The mention in today’s post of the traditional model being about “learning facts and basic skills” sounds odd to me. Traditional education happens in the workplace, where a master shows an apprentice what to do, and the apprentice has to demonstrate increasing levels of knowledge and skill. The apprentice will traditionally have to undertake one last test, in which they demonstrate mastery. The apprentice cabinetmaker would, for example, make a small intricate cabinet, the apprentice teacher would teach a class with their instructor observing, a trainee pilot would fly solo: if they don’t crash, they pass.

    The university and school approach of written tests is not traditional, in my view.

    In my disciplines of computing and education, this apprenticeship model is still practiced in a modified form. I was trained in computing in the Australian Public Service, starting as a “programmer’s assistant”, working under expert supervision, and occasionally going off to formal classes. One extreme example was after learning about a particular model of computer, the instructor disassembled it into a pile of components on the floor, and we had to put it back together again and get it to work. In a modified form, this is how ANU’s TechLauncher program, and internships, work. The APS has formal internships and cadetships today.

    Similarly with education, as well as theoretical exercises involving written assignments, to show I had mastered distance education techniques I have had to design and deliver a small course module (with my peers as students), design a learning App, and deliver a presentation, then be questioned by my instructors and peers.

    Such activities are both a learning experience, and an assessment item. These test specific areas of technical knowledge, plus the ability to communicate, work in teams, and not panic. As I tell the innovation students I mentor: any pitch you give where you do not run from the stage screaming, is a good pitch. 😉

    1. Tom, Good point on what “traditional assessment” actually is. I think in referring to traditional university assessment in that way we were thinking of the traditional scholarly models rather than the more ancient apprenticeship models. Computing really does lend itself to an apprenticeship model, as your interesting examples so clearly show.

  4. There are many benefits to authentic assessment. For me, personally, I think it’s a great opportunity to put your studies into a broader context. We’ve all heard the saying “but when am I ever going to need to know how to (insert exam question here) in the real world…?” Through authentic assessment, we have the opportunity to show an example of how they can apply those skills.

    I agree with all but 2 of the elements from Ashford-Rowe et al. (2014). I think that point 6 can be tough, as language and jargon vary across different jobs and even within a research area. I work in planetary sciences, which is a highly multi-disciplinary field requiring a combination of physics, chemistry, biology, and geology. Through my research, I have found a drastic difference in the language and terms used (often with two different terms used to define the same phenomenon). Therefore, I’m unsure that it’s a reasonable goal to stay “true” to the language used in this time of increased emphasis on interdisciplinary studies. As for point 8, while I do agree that communication and teamwork are invaluable skills to have in the workplace, it’s usually individuals bringing their own skill to the table, and the group splits the task based on expertise. Within a university context, I often find that group work involves all students working on the whole project, which I don’t believe promotes the type of authentic experience or skills we are aiming for. If done well, I think group work can benefit students; however, perhaps we need to gather more information from people in the professional world, communicate between different disciplines for cross-disciplinary projects, and generally put more time and thought into formulating these activities.

    I haven’t come across many “authentic” activities in my current field, but within my undergraduate arts degree we often had projects spanning the semester, involving a practical aspect (worth 50% of our grade) as well as a companion “journal” (worth the other 50%) detailing our thought processes, decisions we made, etc. For example, in a third-year music subject, I had to compose a piece of music and get it performed, while journaling my experience. This required me to combine my creative skills with more practical “realistic” aspects of gathering and collaborating with a group of performers, conducting practices, and organizing an end performance, before reflecting on what I did and how I could improve it next time. While at the time this reflection/journaling seemed tedious, I think it is an incredibly important transferable skill, as progressing in any activity or job requires a certain amount of reflection on how to improve and make the process more efficient in the future.

    From what I’ve observed, it seems to be more common to see authentic assessment tasks incorporated within creative arts and social science subjects. It’s tough in science as we need to ensure the students are learning and practicing all the concepts/laws/theories being taught, so not all focus can be turned to a semester-long project. However, I think that this just means that there is room for improvement in the structuring of science assessments. Edie’s Phys2020 authentic task is a great example of this – a smaller scale project that allows the students to discover what it means to do science in the real world, while only taking up a portion of the time, and not having it make or break their grades. If we can incorporate just one of these smaller scale authentic tasks within each subject, I think we can better prepare the students for life after uni.

    1. Hi Sarah,
      That’s interesting to hear about the challenges when using specific language in multidisciplinary fields. I’m wondering about students who may go on to work professionally in multidisciplinary fields – surely they must appreciate exposure to the different language used, and gaining some understanding that these differences exist?
      I agree group work can be both beneficial and problematic. As you mention if we can find out what type of communication and collaboration occurs in professional settings (are there clues in the professional standards?) and support students to gain and demonstrate these skills, they will no doubt benefit from this beyond their university years. Whilst being able to work independently is highly desirable, most would agree that employers are also interested in seeking people who communicate effectively and can work well in teams. Despite the challenges I agree with you that it is definitely worthwhile to put time and thought into carefully planning and designing any group tasks.
      It sounds like networking with other academics in Science to incorporate smaller scale authentic assessment tasks will be a great way to enhance the learning experience for your students. I’m interested to know how you or others might go about this, whether it’s something you’d see happening as part of course design and if you already have access and availability to meet up with collaborators to support implementing this?

  5. It has been really interesting to read the experiences from different fields. They certainly pointed out some discipline-specific issues, but also raised another important point: students’ perception of the assessment. Students are more comfortable with examinations they already know and can prepare for by simply reading lecture notes/books… So I think the way to make the examination authentic and have students come on board is to prepare them for the assessment over the semester/course. I find that transferable skills, such as teamwork, observation, description and reflection, can be introduced in small exercises (in lectures, small-group tutorials or practical sessions) to build confidence, before finally using the authentic assessment where students can demonstrate their ability to use these skills. A few years back, I was convening a neuroscience course where we had a tutorial session for students to dissect a scientific paper by analysing the presented results, reflecting on them and making a judgement as to the outcomes. Then we had a brainstorm session, where I gave the group a research question and hypothesis, and they worked in small groups to come up with research designs to prove or disprove the hypothesis. After that, the course assessment was presented: working in small groups, students needed to research a neurological disease (the group’s choice from a given list) and present their findings in a 20-30 min seminar. Importantly, everyone had to take on an aspect of the research (basic science, clinical features, treatment…). This allowed students to choose and focus on an area of their interest depending on their future professional plans (academia, research, medicine, etc.). Then finally, the group needed to produce a grant proposal based on the research into the disease and an identified gap in knowledge. 
Admittedly, I thought that this last part would be very hard and students would falter, but to my great delight they seemed to really enjoy the exercise, came up with some fantastic ideas, and were clearly able to integrate knowledge not only from the course but also from other fields they had studied in their program. This assessment allowed students to gain a number of different skills, as well as knowledge, and in the meantime they had quite a bit of fun. I thought this prepared them for the new exam format, where they were given problems to solve and gave their responses in short-answer question format. Interestingly, students found the exam hard, and coming out of the exam room many thought they had surely failed. I have to say, the overall results were really good, and demonstrated that students indeed understood important concepts and were able to apply them.

    1. Hi Krisztina,
      Thanks for sharing your experience and insight. I noted your comment about students’ perception of assessment and also their misconceived feelings of failure; in Day 3 of this Coffee Course we look at a case study by Santos and Manuela which uncovered an interesting misalignment between student and assessor views when reflecting on elements of the assessment’s authenticity: “The results revealed that students found it hard to value their performance at the higher level that stakeholders do” (2017, p574). They propose the challenge may lie with students being used to more ‘traditional’ forms of assessment. As you mentioned, many students are more familiar and comfortable with an examination they know how to prepare for. Your strategy of preparing students for the assessment over the semester/course and scaffolding their development of the skills required has obviously paid off in the overall results!

  6. Sorry I joined in the discussion a bit late [grant writing deadlines :(] but I also wanted to ask what the group’s opinion is on what is authentic. I’m teaching in the medical school and I find that in professional programs, such as ours, or engineering, it is relatively easy to come up with authentic assessment, by just simply looking at what the governing bodies of these professions (AMC, IEEE) stipulate as graduate attributes, and think about the everyday activities and demands the profession will require from our graduates.
    However, I often ask myself what the graduate attributes are, or should be, for an undergraduate science student. Many are not sure where they want to go after graduating, not sure where they will end up working, or even what their interests are. So, my own response was that teaching them to identify problems, question results, set hypotheses and know how to test them, and finally to solve problems, would be the desirable outcome. What I find as a limitation is that somehow many students don’t have an inquiring mind, and I blame the system, which trains them to study for examinations and to go for safe solutions. We need to open our students’ minds and give them the basic skills that will allow them to attack any problem and come up with solutions.
    I’m one of the convenors of an art-anatomy course which is offered to both art and science students, and it has always stunned me how differently these 2 groups of students dealt with the demands of the course. While art students were ready and willing to explore new ideas, science students (at least in the first few days of the intensive 3-week course) concentrated on trying to find out what they needed to know or do to get a good mark. It takes a lot of work from us, the academics who deliver this course, to encourage them to relax and try even if the product is not perfect. We assess and value the trying part! It is good to see that later they start to relax and go with the flow, but not all are fully comfortable with the idea of possible failure (not producing a perfect drawing or sculpture), and many struggle with the process of self-reflection and change. (BTW, we do not have a written test; the final assessment item is an exhibition piece, which is on display in the art school at the end of the course). I think we would help our undergraduate and undecided students by helping them gain back the inquiring, experimenting and open mind that they all had when they were still just toddlers.

    1. Hi Krisztina,

      You make a really good point! In a previous coffee course, we briefly touched on learning how to learn. It does seem that many students are no longer equipped with those basic skills, or perhaps feel that they don’t have the luxury to experiment and fail.

      I’ve had a Masters thesis student who, when their results didn’t support their hypothesis, honestly thought that they had nothing to show for their months of work and that there was no point submitting. I had to sit them down and explain the immense value of an experiment not working: the question of why their results didn’t support an intuitive hypothesis opens up even more scope for research!

      Are we doing our students a disservice by not challenging them enough and letting them fail more often? Although a culture of hand-holding and spoon-feeding has developed in higher ed (well, at least in this end of campus), are we truly equipping our students to be productive members in the real world? While all of the professional and work environments I am familiar with prefer success, they *need* people who can identify, and then have the skills, tools, and language to rectify, setbacks and failure.

    2. Hi Krisztina,

      Lucy here!
      I loved the course but also initially struggled with the relatively free “experimentation” side of it, and later struggled to get students to relax when I helped with tutoring.

      I think it’s so ingrained in many science students that there is often a “right” result or method – you harshly lose marks if your lab report doesn’t meet stringent requirements, or if your lab values don’t fall closely within a strict range. For all those hours we spend in labs, there often isn’t much true “experimenting” going on!

      Despite my initial struggle to accept the idea of the course, it was one of my favourites of my whole degree, and it made me wish I’d not used all my electives for pure biology subjects…
      There were a number of other courses in my undergrad degree which also challenged the general biology student perception of just finding, or memorising, the right answer – “Creating Knowledge” (an interdisciplinary VC course), the art and anatomy course, and then doing two different research projects in third year. I think science students possibly aren’t encouraged enough to branch outside the box and try new things…

      1. Lucy, I fully agree with you. We should encourage science students to ‘branch out’, so that they can gain more valuable skills, beyond the training of writing the perfect lab report.
        My concern is that, by setting strict guidelines and expecting students to follow them, we reduce the opportunity for using creativity in learning.

  7. Ironically, I require students to design an assessment as assessment! I teach something called ‘assessing language’ and for their final assessment, students need to design a new assessment or validate an existing one. This might sound a bit trivial, but think of the millions of test-takers that do IELTS each year. These big tests are an industry which employ applied linguists. So the ‘real world’ for an assessment designer is designing assessment. We consider things such as authenticity in tests, for example how authentically IELTS tasks represent the criterion situation (i.e. what you are all doing in your courses!). Some students actually choose to focus on authenticity in academic tests. They need to back up their design choices with research and literature, and as in the real-world, they can do this in groups (although I find it fairer to assess them independently so I require separate reports). In general, they love the fact that they are doing something which has an industry. Another ‘authentic’ assessment in the course is that they have to critically review an existing assessment. In our field, reviews of assessment designs are regularly published, so it’s from the academic domain, not the professional one. However, I am considering implementing a ‘traditional’ assessment in the form of a test because the real world task doesn’t have quite the content coverage that I’d like. There are lots of aspects of the course that are not captured in the doing of real-world tasks and I’d like an efficient way of checking that they are across other areas of content. There is a strong argument to be made for a program of assessment that includes professional simulation as well as other means of checking content is covered. As others have noted, we never know where our students will end up!

    1. Hi Susy, I really like the sound of your course! It is definitely not trivial at all! One of the best courses I ever did was ‘Language Testing and Assessment’ taught by my favourite lecturer of all time, Dr Jeremy Jones, for my Masters in TESOL at UC. Jeremy had an open door policy so was always up for a chat, he was incredibly passionate about his field, had a great sense of humour and delivered very practical and interesting lectures. Our main assessment tasks in that course were: 1. to evaluate English language tests, 2. to design our own valid, reliable and authentic assessment tasks for feedback, and 3. to trial them with target students, collect feedback and make amendments. It was a great learning experience which enabled me to design and select appropriate assessment tasks throughout my teaching career. Luckily by that stage, I already knew that I wanted to work at the Adult Migrant English Program, so I chose to evaluate and design assessment for the competency-based Certificates in Spoken and Written English and was really motivated to apply everything I was learning. Had I not known where I wanted to work, it may have been useful to have a more general assessment to ensure I had absorbed all the key messages, but for me the main tasks were just what I needed.

      1. Having a TESOL background, this thread really sparked my interest. I too have often wondered how multiple-choice language tests can be made more authentic (and valid!). I would be curious to learn more about your course, Susy! And the ‘Language Testing and Assessment’ class you mentioned, Rowena, sounds incredibly interesting as well.

        When I started out at my previous place of employment, they had five assessment categories: knowledge, speaking, listening, reading, and writing. But after I had been there for about a year, they dropped the ‘knowledge’ category; instead, knowledge had to be assessed via the four remaining categories. Another change was that they stopped using percentages in report cards, and instead wanted teachers to give qualitative feedback. Both changes made my teaching practice so much more enjoyable! No more awkward ‘fill the gap’ grammar exercises, and no more students stressing because they scored 2% lower than they did the year before 🙂

  8. So many interesting examples that everyone is doing around the campus – it’s really inspiring! I developed a semester-long piece of assessment where I get the students to run their own grand challenge. Grand challenges have become increasingly popular globally as a way to address intractable, complex problems. They thus offer an opportunity for students to extend their knowledge into practical, real-world examples, in a way that isn’t job specific but that teaches both technical and soft skills that will last them a lifetime.
    For the first six weeks of semester, students work in small groups to conceptualise a complex environmental problem that needs addressing by society. They analyse their problem, identify the causes and possible leverage points, and then create evaluation criteria for judging solutions. This culminates in pitching their problem as a video presentation. Each group’s video is shown in their tutorial and the whole class votes on the problem they would like to work on for the second half of the semester. After the mid-semester break, students form new groups and work together to come up with innovative solutions to the chosen environmental problem. This culminates in each group pitching their grand challenge solution to the class.

    Conducting a grand challenge is very different to anything students have done before, so we offer them just enough scaffolding at each element to succeed at the task. To promote constructive alignment between the course content and the assessment, I set up an hour every week to work on specific skills relevant to running a grand challenge, including problem analysis, design thinking, monitoring and evaluation, and theories of change. I also ran a panel with people who had run or competed in a grand challenge, including guests from the ANU Chancellery, the Climate Change Institute and the Department of Foreign Affairs and Trade’s InnovationXchange. The students then take these discrete elements of knowledge, analysis and critical thinking skills and put them together holistically to manage the different aspects of the challenge.

    Running such a novel piece of assessment, and one that cumulatively contributed up to 75% of students’ course mark, meant it was important that students could engage in real-time teaching evaluation. Students indicated they were very excited about engaging in this assessment; however, I felt that any apprehension they may have felt would be more honestly expressed anonymously. As such, every single week students submitted short feedback notes at the end of each class. This allowed for incredibly rich data collection, both more specific than broad course-wide feedback often is, and far timelier. I could gauge week by week when students felt overwhelmed or when they felt on track with the assessment.

  9. Thanks for your comment, Rowena. How interesting that you worked on the CSWE! My students mainly enjoy the experience of designing something too, though it is a lot of work (which they don’t love). Yes – I have a mix of students who know exactly what they need for their career or even current workplace, and they usually want to keep their focus strongly on that area, but I have plenty who just want to end up in education somewhere, or who actually don’t quite know how they ended up in my course. In any case, across my courses (SLA, assessment, sociolinguistics courses) I now mainly use assessments that can be personalised to students’ contexts of interest (and are authentic in different ways), but I have come round to thinking that they really could do with at least one more general assessment, such as a test, which ensures they are across more areas than their own well-developed projects.

  10. I chose: Tell us about an “authentic” activity in your profession or discipline e.g. law students conduct a mock courtroom role play. Is the activity a general learning experience or an assessment item? What sort of skills or dispositions are necessary for students to succeed in this “authentic” activity?

    As I explained in my previous post, I conduct a couple of ‘authentic’ assessments for my analysis of vertebrate remains course. Students have to identify bones in a practical test; they also have to create a database from the analysis of real archaeological bones, using the criteria that would be used in any professional analysis; and finally, postgrads have to write a scientific report based on their results. Although these are three separate assessments, I reckon the sequence may be considered a learning experience, as each assessment needs to be completed after the previous one, and the skills learnt in one are applied in the next. Students need to be able to apply the skills and theory they have learnt during the course to successfully complete their assessments.

    This coming semester, I will also convene a course in Forensic Archaeology, where one of the assessments (which comprises 50% of the final mark) is a mock crime scene investigation in which students, in groups, analyse the different pieces of evidence. Each piece of evidence is analysed by a few members of the group (each group has between 10 and 12 students), who then share their observations with the rest of the group. At the end, each group member has to submit an individual final report in which they assess the crime scene and determine the cause of death based on the evidence. Students need to be able to communicate with each other and share their knowledge, as well as construct a coherent narrative from the scientific evidence.

    A few of my colleagues in archaeology also conduct ‘authentic’ assessments, such as mock excavations, or reports based on museum objects that students relate to anthropological and ethnographic data. As I mentioned in my previous post, archaeology has the potential to include skill-based learning and teaching in many courses. However, I think a better effort should be made by members of the discipline to integrate our different approaches, and also to collaborate with other disciplines, such as Digital Humanities (with which I’m currently collaborating), as well as computer science, geochemistry and other applied sciences.

  11. It was also interesting hearing the thoughts by Claxton (2018) on how we should aim to develop and foster skills for life such as problem solving, collaboration, communication, reflection, and evaluation through both authentic learning and assessment. This reminded me of the assumptions made in theories of adult learning (andragogy). As in, that our learning and assessment should guide and foster students towards adult learning.

    1. Oops, didn’t copy the whole comment across and didn’t check it before I hit submit!

      Here is the whole comment:
      I feel that medicine is extremely well suited to both authentic learning, and authentic assessment. As I mentioned in my last comment, both the learning and the assessment within medicine becomes more “authentic” as you get closer to graduation. However, the initial half of the medical degree occurs mostly in a more traditional classroom environment, followed by two years primarily on placement in the healthcare setting. This makes for quite a big and challenging transition for students.

      Within the area of medicine I teach, primarily in the pre-clinical years, both learning and assessment are generally less authentic. However, we try to bring “authentic” components to our teaching: when discussing certain anatomy, we get students to discuss how it relates to clinical assessment, how the anatomy corresponds to imaging, how particular pathology might alter anatomy, and how that might appear on imaging. We have also previously tried running mock exam sessions where students are asked to perform an examination and then answer related anatomical questions under time constraints. When we ran that session, I felt it was helpful as it allowed students to consider how they might more practically apply their knowledge and engage in metacognition and self-reflection. It was also helpful for us as tutors, as it gave us insight into where students may have gaps in their knowledge, and reminded us to help draw links between anatomy in the lab and clinical applications. So whilst that mock exam session was not necessarily very “authentic” in terms of the workplace, we were aiming for authentic mock-assessment in terms of their exams.

      It was also interesting hearing the thoughts by Claxton (2018) on how we should aim to develop and foster skills for life such as problem solving, collaboration, communication, reflection, and evaluation through both authentic learning and assessment. This reminded me of the assumptions made in theories of adult learning (andragogy). As in, that our learning and assessment should guide and foster students towards adult learning.
