Quiz Design

Day 1: The science of learning and the value of quizzes

Post written by guest author Alexandra Webb, ANU Medical School.



Urban legend has it that the word ‘quiz’ was invented in 1791 by Richard Daly, a Dublin theatre owner, in order to settle a bet. Daly bet that he could add a new word to the English language within 48 hours. To win the wager, Daly sent out all of his stage-hands during the night to chalk the letters “QUIZ” on doors and walls all over Dublin. The next day the word, whose meaning no one knew, became the talk of the town. Daly won the bet!

Today, we find quizzes everywhere – on TV, at the pub, in newspapers and magazines, and in online games and apps. But how are quizzes relevant nowadays in university education? And what is the best way of delivering quizzes? The aim of this course is to examine the value of quizzes in education and explore important aspects of quiz design. By the end of this coffee course we hope that you will be equipped with the necessary tools to support the creation of quizzes in your own educational setting.

Practice tests, such as quizzes, taken during study increase the likelihood that information will be recalled at a later time; they are not simply a measure of prior learning. This is encouraging justification for educators to incorporate quizzes into their educational practice.

The benefits of practice tests have been shown not to depend on the type of quiz implemented (McDaniel et al., 2007; McDermott et al., 2014). This gives teachers a great deal of flexibility to develop quizzes that best suit their discipline or educational environment. On Day 2 of this Coffee Course, you will be introduced to a spectrum of question types so that you can create quizzes that best suit your discipline and educational delivery methods and provide variety to your students.

How do we optimise the use of quizzes for learning in our educational practice?

  1. Provide feedback
    The benefits of practice tests are even greater when the tests are followed by feedback, especially for items that are incorrectly recalled. There is also evidence that for more complex information, feedback aids students who get the answer correct, especially if they select a correct answer with low confidence (Butler et al., 2008). We will look at feedback further on Day 3 of this course.
  2. Spaced retrieval
    Spacing learning over time is another of the most effective methods of learning. Therefore, we can maximise the effectiveness of quizzes by providing students with quizzes spaced over time. The greater the amount of spacing between quizzes (retrieval events), the greater the potential benefit to retention.
  3. Low stakes
    It is recommended that quizzes are used for low-stakes testing and do not significantly influence course grades (Roediger et al., 2011).



What experiences have you had with quizzes in teaching, either as a teacher or as a student?  What role did you find the quizzes played in your learning or the learning of your students?

How can quizzes be incorporated into your course(s)?

There is an extensive variety of ways in which quizzes can be incorporated into your course(s). In today’s university settings, many of these methods are delivered using educational technologies. Whilst there are many effective non-technological solutions to delivering practice tests, it is worth noting that online quizzes have a number of benefits including:

  • Potential for students to do the quizzes and receive immediate feedback anywhere, anytime
  • Automated marking protects academic time
  • Access to analytics and reports on how students are performing which can be used to address learning gaps and inform course improvement
  1. Student response systems to quiz students during a class
    One benefit of student response systems is participation by all students. The immediacy of the results also enables educators to address any misconceptions on the spot. Student response systems can also be interspersed within some educational settings such as lectures to enhance student attention and focus on the task at hand (Szpunar et al., 2013). You can read more about this in Technologies for active learning and Interactive activities that can be used in lectures.
  2. Using online quizzes to test students before or after a class
    Pre-class quizzes can be used prior to an active/interactive class to test student remembering and understanding of key concepts and rudimentary knowledge covered in pre-learning resources such as videos, articles, online modules etc. Post-class quizzes could be used after an active/interactive class to provide students with an opportunity to exercise higher cognitive processes such as analysis and evaluation. These could also be achieved without technology, e.g. students write down answers on paper and peer- or self-mark; think-pair-share task; 1-minute paper; scratch cards.
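To make the analytics benefit mentioned above concrete, the sketch below computes a facility index (the proportion of students answering each question correctly) from raw quiz responses and flags questions that may indicate a learning gap. The data format and function name here are hypothetical, invented purely for illustration.

```python
# Hypothetical sketch: computing a facility index (proportion of students
# answering each question correctly) from raw quiz responses, to spot
# topics that may need revisiting in class.

def facility_index(responses):
    """responses: list of attempts, each a dict of question id -> bool (correct?)."""
    totals, correct = {}, {}
    for attempt in responses:
        for qid, is_correct in attempt.items():
            totals[qid] = totals.get(qid, 0) + 1
            correct[qid] = correct.get(qid, 0) + (1 if is_correct else 0)
    return {qid: correct[qid] / totals[qid] for qid in totals}

attempts = [
    {"q1": True, "q2": False},
    {"q1": True, "q2": False},
    {"q1": False, "q2": True},
    {"q1": True, "q2": False},
]
scores = facility_index(attempts)
# q1 was answered correctly by 3/4 students; q2 by only 1/4.
flagged = [qid for qid, p in scores.items() if p < 0.5]
print(flagged)  # -> ['q2']
```

Real learning management systems export far richer data, but even this simple proportion is often enough to decide where to spend extra class time.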

Students can also incorporate testing into their self-regulated learning. One common strategy reported by students is the use of flashcards to test basic facts and concepts. At the ANU Medical School, a student-led initiative has been the generation of online flashcards using Anki. Fourth-year ANU medical student William Maish was awarded a 2019 Vice-Chancellor’s Citation for Outstanding Contribution to Student Learning for creating thousands of Anki cards for medical students at the ANU and beyond to use to aid their learning.
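Spaced retrieval is exactly what tools like Anki automate with a scheduling algorithm. The sketch below is a heavily simplified illustration of the SM-2 family of schedulers that Anki’s scheduler descends from (not Anki’s actual code): each successful recall stretches the review interval by an ‘ease’ multiplier, while a lapse resets it.

```python
# Simplified SM-2-style spaced-repetition scheduling (illustrative only;
# Anki's real scheduler is considerably more sophisticated).

def next_interval(interval_days, ease, recalled):
    """Return (new_interval_days, new_ease) after one review.

    interval_days: current gap between reviews
    ease: multiplier applied on each successful recall (e.g. 2.5)
    recalled: whether the learner answered correctly
    """
    if not recalled:
        # Lapse: restart at a 1-day interval and reduce the ease slightly,
        # with a floor so intervals keep growing on later successes.
        return 1, max(1.3, ease - 0.2)
    return round(interval_days * ease), ease

# A card recalled successfully three times in a row: intervals stretch out.
interval, ease = 1, 2.5
schedule = []
for _ in range(3):
    interval, ease = next_interval(interval, ease, recalled=True)
    schedule.append(interval)
print(schedule)  # -> [2, 5, 12]
```

The growing gaps are the point: each review lands just as the memory is starting to fade, which is where retrieval practice pays off most.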

A Few Final Words

The long-term memory benefits of practice testing are hopefully reason enough to advocate the incorporation of quizzes into your course(s). But in case you are not yet convinced, here are some additional benefits of quizzes:

  1. Facilitates the identification of topics that students are struggling with so that the educator can intervene e.g. revisit or spend more time on the topic
  2. Facilitates student reflection on topics that they are having difficulty understanding
  3. Enhances organisation of information by the learner
  4. Facilitates retrieval of material that was not tested
  5. Improves the transfer of knowledge to new contexts

For more information on the additional benefits of retrieval practice see Roediger et al., 2011 Ten Benefits of Testing and Their Applications to Educational Practice.



What (new) ideas do you have to incorporate quizzes into your teaching and learning environment?  What technological solutions have you used, or do you plan to use, to aid your delivery of quizzes?

Further readings


  • Dunlosky, J., Rawson, K. A., Marsh, E. J., Nathan, M. J., & Willingham, D. T. (2013). “Improving students’ learning with effective learning techniques: Promising directions from cognitive and educational psychology.” Psychological Science in the Public Interest, 14(1), 4-58.
  • Butler, A. C., Karpicke, J. D., & Roediger, H. L. III. (2008). “Correcting a metacognitive error: Feedback increases retention of low-confidence correct responses.” Journal of Experimental Psychology: Learning, Memory, and Cognition, 34(4), 918-928.
  • Roediger, H. L., Agarwal, P. K., McDaniel, M. A., & McDermott, K. B. (2011). “Test-enhanced learning in the classroom: Long-term improvements from quizzing.” Journal of Experimental Psychology: Applied, 17(4), 382-395.
  • Szpunar, K. K., Moulton, S. T., & Schacter, D. L. (2013). “Mind wandering and education: From the classroom to online learning.” Frontiers in Psychology, 4, 495.
  • D’Antoni, A. V., Mtui, E. P., Loukas, M., Tubbs, R. S., Zipp, G. P., & Dunlosky, J. (2019). “An evidence-based approach to learning clinical anatomy: A guide for medical students, educators, and administrators.” Clinical Anatomy, 32, 156-163.
  • McDaniel, M. A., Roediger, H. L. III, & McDermott, K. B. (2007). “Generalizing test-enhanced learning from the laboratory to the classroom.” Psychonomic Bulletin & Review, 14(2), 200-206.
  • McDermott, K. B., Agarwal, P. K., D’Antonio, L., Roediger, H. L. III, & McDaniel, M. A. (2014). “Both multiple-choice and short-answer quizzes enhance later exam performance in middle and high school classes.” Journal of Experimental Psychology: Applied, 20(1), 3-21.

27 thoughts on “Day 1: The science of learning and the value of quizzes”

  1. Thanks Alex, for making that all so clear and concise. I have read before about the memory-making value of practising recall, and it certainly ties in with my experience and observations. However, I have often wondered whether there can be a negative impact around the memories created by the distractor answers in a typical multiple choice question. Initially I may need to remember just one fact, e.g. the colour of an orange is orange. If I’m given a question like “What colour is an orange: [red, blue, orange, purple]”, I now have three new associations and three new facts to remember, i.e. that each of those is wrong. I suspect this is part of the value of feedback for low-confidence answers. Do you know if there has been any work along these lines?

    1. Great question Jenny – I myself struggle with multiple choice questions and am particularly curious about other types of quiz questions that can easily test student knowledge without providing so many distractors. I also get too distracted by them!

    2. Great question Jenny. I cannot recall anything from the literature about this but will investigate further and get back to you if I find anything.

  2. In first-year undergrad maths courses, we would have weekly online quizzes to test our knowledge of the concepts learned during that week. This was essentially like homework, however, we were able to receive immediate feedback and could know instantly what we did wrong while the questions were still fresh in our minds. This enabled you to learn from your mistakes and work out where you went wrong while still focused on the topic, rather than receiving feedback from tutors weeks later after the class has already moved onto a different concept. This is particularly important when the lectures are building on previous knowledge, so if you did not properly grasp the foundation, unfortunately, you wouldn’t realize what was wrong until much later.

    I think it’s also a great idea how the quizzes not only benefit the students with nearly instantaneous feedback, but the lecturers can also use this as a tool to check-in with their students. In the future, I definitely think it will be a valuable tool to use, especially if there is some analytics for the lecturer after the students take the online quiz as it would enable us to have a better idea of what to address in the following weeks and what concepts the students are struggling with most. This would allow us to reorganize the lectures on the go so more time could be budgeted in for going over certain concepts again, particularly if they are to be built upon in the future lectures.

    1. Hi Sarah, great point about analytics – this is something we’ll look at in upcoming posts of this course, as a way to “feed-forward” to the teaching staff about where best to target. 🙂

    2. Sarah, I don’t use any complex analytics, I just have the Learning Management System display the quiz results for all students, sorted by mark. I can then triage the students into: A. those who did not do the quiz, or got a very low mark, and so need intervention (perhaps by referral to a specialist counsellor), B. those who need some help I can provide, and C. those who need no help. If there are concepts students are generally having particular difficulty with, I will suggest extra learning materials in my next post to the class (I don’t use “lectures”).

    3. Thank you Sarah for sharing your experience with weekly online quizzes. I am interested to know whether these were optional or compulsory to do each week? I really value the adaptability that quiz analytics provide for your teaching – so that you can modify your teaching to address the needs of each class/cohort.

      1. Hi Alexandra, the quizzes were not “compulsory” per se, however in order to give us some incentive they would each be worth ~1% of our overall grade and as we would have around 10 quizzes for the subject it added up to 10% of our grade. So while skipping some of the quizzes wouldn’t significantly affect your grade, I found them super helpful in giving my grade a bit of a boost and the instant feedback was great. 🙂

  3. I have used low stakes quizzes with automated marking, spaced throughout a course, for a few years. I use this as a way to encourage students to actually study the material, and as a quick check as to how they are doing. Last semester I set the deadline for each quiz just before the face to face workshop where students were to discuss the material.

    One approach I introduced with quizzes a few years ago was to have them not count towards high marks: only up to 70% (the top of a “Credit” at ANU). The idea is that the quiz is to help the student with the basics, and check they are competent, the advanced stuff will be assessed elsewhere. This is also to deter students from gaming the quiz, and to avoid creating worries for colleagues who don’t believe you can test advanced concepts with a quiz. I think quizzes could be used for advanced concepts, but I don’t want to have to argue about this at examiners’ meetings.

    1. Thank you Tom for sharing your experience of how you deliver quizzes in your educational settings.

  4. I’ve found it a challenge to mesh between the style of assessment I use (long-form essay) and quizzes in a way that keeps the mindset on the “interpret, apply, use” framework that fits my style of marketing teaching rather than the “recognise and recall” that fits most of the marketing quiz makers. Particularly since introducing written elements into a quiz takes out the benefits of automation, and adds a layer of lag between submission and result. Still working on how to use that aspect.

    Also, how to best fit a post-event/post-class quiz into the student experience to see if learning outcomes were hit, versus “Here is a term, do you recognise the definition?”. Open to advice here.

  5. Hi Stephen. I agree this can be a challenge. In my field of medical education it is a common topic of discussion – harnessing the benefits MCQ whilst ensuring that you are testing students at higher cognitive levels. There are a number of medical education papers that cover this – I have included a few here that are open access and have practical examples:
    Assessment of Higher Ordered Thinking in Medical Education: Multiple Choice Questions and Modified Essay Questions –
    How to Write a High Quality Multiple Choice Question (MCQ): A Guide for Clinicians. –

    Another option to encourage students to work at these higher cognitive levels is to give them the task of writing MCQs with feedback. They can be disseminated to the class for peer learning and I have heard of some teachers selecting the high quality questions for formative and summative assessments as an incentive.
    Medical students create multiple-choice questions for learning in pathology education: a pilot study –

    Another option that is especially relevant to preparing students for the long form essay is to provide short answer questions. Whilst this type of question is also commonly associated with the examination of lower cognitive levels, the questions can be phrased to test higher cognitive levels. This can be used to provide students with an opportunity to practice thinking and writing at these higher cognitive levels to prepare them for writing an essay in the examination. The short answer questions could even be a ‘chunked’ version of an essay, structured and linked in such a way that they build the components of an essay. For this type of question I typically use an online tool that enables students to enter their answer and check it against a model answer (and feedback where relevant). And I can view the student responses collectively to provide additional feedback if needed.

  6. I found the “tour” very confusing. I felt like a rookie who had failed to do some essential homework and was now up on the stage under a spotlight being interrogated.

    I have used single choice (radio button) for introductory quiz questions to put the students at ease, before moving on to the multi-selector questions, as they are a bit harder, then matching questions, which are even harder.

    I seemed to be missing something, as the notes jumped straight into “Label the anatomy of this multiple choice question”, without first explaining the terms. I assume a “distractor” is an incorrect answer designed to look plausible, but what is a “key”?

    Then there was mention of two “presentations”, but I was not clear what these were: “Let’s Re-Choice” and “Mind the Gap”? In what way are these presentations?

    What is a “single choice set”?

    What is an “MC Summary”? The explanation did not make much sense to me: “Choose the correct statement to build up a collection of true statements.”

    What was the list of items under this, with numeric codes?:

    Distractor 1.3
    Correct statement 1
    Distractor 1.2
    Distractor 1.1

    In “Mind the Gap”, I did not understand where the words for the “Typing Cloze (Fill in the Blanks)” were supposed to come from. It says “Type the words provided (right) into the gaps.”, but I could not see a list of words to the right.

    It may be that my web browser is not displaying the complete web page, or perhaps I missed a previous lesson.

    1. Hi Tom,
      I’m sorry to hear that you had trouble in Day 2. It is definitely not our intention to make anyone feel under the spotlight. A great beauty of non-assessed instantly-marked quizzes and questions (like the ones in this course) is that students can try things out in private without fear of embarrassment. In fact, a key message in this course is that we can offer our students practice tests to have a go, learn from their mistakes and feel more prepared for their high-stakes assessments.

      We are pioneering the use of interactive H5P elements for Coffee Courses in this course so we realise it is tricky for those who are not familiar with the terms H5P use. The names, ‘Presentation’, ‘Single Choice Set’, and ‘Summary’ are all names that H5P use for their activity types so that is why we have used those terms. The ‘Presentation’ is H5P’s version of a PowerPoint presentation but it can also include interactive elements to engage the user. It is also possible to make ones with audio as well.

      I made a conscious decision to use the labelling question instead of defining the terms because I thought everyone would be able to reason it out and learn the words that way but I’ve taken your comments on board and will add some wording for those who would like more explanation.

      The purpose of the ‘Let’s Re-Choice’ presentation was just to quickly go through the more common multiple choice (MC) question types because I didn’t want to hold people up by having to give too much thought to answering the example questions. I also wanted to reinforce the use of the terms ‘key’ and ‘distractor’ but I have taken on board that you found them confusing so I will update them too.

  7. “It is recommended that quizzes are used for low stakes testing and do not significantly influence course grades.”
    Regarding formative quizzes v. low stakes summative:
    We run two large online courses based on units of study material with MCQs to check immediate understanding. In one of them, the grades for the MCQs make up a very small contribution to the final mark – so small as to be relatively negligible in the calculation in fact. In this course, practically all the students complete them. In the other, the MCQs are optional formative assessments, and only just over half the students actually make use of them. Maybe it would be a good ‘nudge’ to add them to the grading of the other course too…

    1. Hi Sonia, welcome! Great suggestion there about a “nudge”! On the Day 2 post for the course, Sarah made a similar comment around very small marks being a reward/incentive to complete formative quizzes. I wonder what the ideal grading options are?

      Alex – do you have any suggestions around how much grading is helpful for formative assessments? I recall in the medical school context there are quite a few formative assessments?

    2. Hi Sonia
      Anecdotally I have heard from many colleagues (after some trial and error) that low stakes summative is optimal for maximising participation – which fits with your experience. I am not familiar with what the literature says on this though – I must find out! In Day 4 of this coffee course low stakes summative quizzes is one of the suggestions to encourage students to use quizzes (see Incentives). Thank you for sharing your experience.

  8. We have 3 spaced out summative quizzes, worth 10% each, for our students. The quizzes are a mix of auto-markable/one-correct-response questions, short answer questions, and one extended answer (approx 4 sentences) question. As the class size grew, we moved these from paper-based in-class to online. This has made things more equitable for students, as we have the tests open for a 24-hour period, with students having only 30 minutes once they start their attempt.

    However, it has increased my workload. To address potential collusion, since the quizzes are no longer invigilated, we decided to create a randomised database. This means more questions to create and mark. While I had hoped auto-markable questions would reduce my workload, I found I still had to manually check them. MCQs with multiple responses would mark students who selected more than the correct responses as still getting full marks. This meant students could potentially game the system by just ticking everything. Similarly, the system would mark answers as being incorrect if they didn’t exactly match the suggested response. So, I still had to manually check to see whether a student had misspelt something, or added additional information. Finally, while the quizzes were in-class and invigilated, the questions tended to be lower-order recall type questions. However, we quickly learnt after the first online quiz that students were simply copy-pasting responses from their readings or Google, without necessarily understanding and applying the concepts. For the following quizzes, we reformatted the questions to require higher-order thinking. For instance, instead of asking students to match concepts with definitions, we asked them to match concepts with other similar concepts, or which concept would apply in this scenario. I found coming up with higher-order questions that still fit the quiz format (and ideally were also auto-markable) quite challenging, and more time-consuming than the more “traditional” quiz questions.

    This year, we also introduced formative weekly quizzes as part of Wattle’s Lesson function. Our “nudge” was linking next week’s materials to the completion of this week’s quiz. Although students complained about this, they also stated that they found the weekly quizzes beneficial in focusing their learning and receiving constant feedback. In response to student feedback, we decided to remove the “nudge” and “unlock” course materials from week 9. Surprisingly, over 30% of students continued to engage with the formative quizzes. Moving forward, we need to re-assess what “nudge” to use. I am personally leaning towards making the formative quizzes low-stakes (1% each), and replacing one of the summative quizzes with the best 10/12 weekly scores. Or, perhaps making the first 2 weeks formative (as enrolment numbers are still fluctuating), and then count all of weeks 3-12. The former is more work for me, but usually better for students. However, always open to suggestions.

    1. Hi Bhavani,
      You’ve obviously put lots of thought into the best way to deploy quizzes in your course!
      Multi-select questions in Wattle (Moodle) Quiz are tricky. Understandably, people assume the system will automatically penalise the student according to what they did and didn’t select correctly, but this is not the case. The best way to discourage students from selecting all options is to weight the correct answers so they sum to 100% and give each incorrect answer a negative weight of up to -100% (a student can’t score less than 0 on any quiz question).
      As you found out, short answer questions can also be problematic if students copy and paste into the answer box or accidentally enter punctuation or spaces that don’t match the expected answer. Manually marking these can be a pain so it definitely pays to make required answers very specific. You might find an ‘Embedded Answers (Cloze)’ (Gap-fill) question easier to get the exact (preferably one word) answer you’re looking for.
      From what I gather at ANU, even a tiny percentage awarded for each quiz is enough of a carrot to get students to do them. It is also quite common and straightforward to drop the lowest 2 or 3 quiz scores. You just need to go into Gradebook set up, put all quizzes into a category and edit the category settings to drop the lowest scores. Please let me know if you need help with this.
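Rowena’s weighting advice can be illustrated with a small model (a simplification sketched for illustration, not Moodle’s actual code): each option carries a signed weight, the weights of the selected options are summed, and the question score is clamped between zero and full marks, so ticking everything earns nothing.

```python
# Simplified model of Moodle-style multi-select scoring: option weights
# (negative for distractors) are summed over the selected options and the
# result is clamped to the range [0, full marks].

def multiselect_score(weights, selected, max_marks=1.0):
    """weights: fraction of credit per option id (negative for distractors).
    selected: set of option ids the student ticked."""
    fraction = sum(weights[opt] for opt in selected)
    return max(0.0, min(1.0, fraction)) * max_marks

# Two correct options worth 50% each; two distractors at -100% each.
weights = {"A": 0.5, "B": 0.5, "C": -1.0, "D": -1.0}

print(multiselect_score(weights, {"A", "B"}))            # 1.0: full marks
print(multiselect_score(weights, {"A"}))                 # 0.5: half marks
print(multiselect_score(weights, {"A", "B", "C", "D"}))  # 0.0: ticking all gains nothing
```

With the penalties set this way, a tick-everything strategy cancels itself out, which is exactly the deterrent described above.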

      1. Thanks Rowena! You have definitely helped fill gaps in my own knowledge. There are all these fantastic features that we never get around to learning, and thus, using.

  9. Hi everyone!
    I really enjoyed the quiz-making tools I used at my previous job at a language college overseas. They had designed their own HTML-templates that allowed teachers to create ‘hot potato’-style online exercises. I could tweak the templates to suit the exercise and integrate text, video, etc. They allowed different types of questions: fill in the gap, multiple-choice, drop-down menus, flash cards, drag-and drop, etc. Very versatile (at least, once you had acquired some soft coding skills)! But the problem is that they are not transportable; they only work with that school’s LMS. Which is why I enjoyed learning about free, open-access tools like H5P and Anki. Looking forward to the next module!

  10. I think I will add a motivational question to my week 1 quiz from above — to motivate the students to take the quizzes and space their learning.

  11. Hello all,
    I use low stakes weekly Wattle quizzes with automated marking as part of a flipped mode to teaching and learning. Students complete the preparation material and do the quiz before the tutorial. It provides them with initial feedback and offers tutors the opportunity to know where to pitch the session and how to focus the review before applying the knowledge in the activities. The quiz questions try to hit a range of de Bono’s thinking levels, ranging from simple understanding, recognition and recall to interpretation, application, etc.

  12. Hi Emmaline. That sounds fantastic. What % of students do you find routinely complete the quizzes each week? What contribution do the quizzes make to their end of course mark? Thanks for sharing how you use quizzes in your educational setting. Best wishes Alex

  13. Hi Alexandra. The quizzes make up usually 5% but sometimes up to 10% of the total course weighting. At least 95% of the students complete the quizzes. I run them to be due either at the beginning of the week (for preparation) and this semester also trialled running them after the tutorials (review).

  14. As a language student, I have taken many courses that have used quizzes. In my own teaching I have used quizzes previously during online delivery to increase participation and gauge existing knowledge during icebreaker activities. I would be interested in exploring options for using pre-class quizzes to check comprehension of set readings in humanities courses. This might also be an effective way to make sure students do complete the readings.

  15. Thank you Alison for your comments related to your own teaching and experience as a student. Pre-class quizzes would be a great complement to set readings and a great way to prime students prior to in-class discussion and ensure that the key content/concepts/issues are highlighted.
