From trivia games to final exams, quizzing tools have a variety of uses for learning as well as assessment. Exams and quizzes have a particularly plentiful range of possibilities in a multimodal or hybrid course, where they can be administered synchronously or asynchronously. Research suggests that how a tool is presented influences how students respond to it. In comparing two student discussion boards, one an ungraded discussion and one a graded replacement for a final exam, Cheng et al. (2013) found that students displayed more knowledge on the graded board but more evidence of learning on the ungraded board. The students who participated in the study were more likely to grapple with new ideas when the stakes were low, but more eager to showcase topics they were confident about when their responses would have a greater impact on their grades. When considering quizzing tools, then, we recommend letting your course goals guide your usage.
Below, you will find tips you can use to achieve a range of goals in a multimodal course, from measuring knowledge to fostering learning and even facilitating play.
Types of Quizzes
There are myriad ways you can use exams and quizzes in your multimodal course. The list below is not comprehensive, but it is designed to give you a sense of the range of options and use cases that exist.
Summative Assessments: Perhaps the most obvious option, quizzing tools are frequently used for final exams and other summative assessments. Selected-response items (e.g., multiple-choice questions) are an efficient mechanism for assessing memorization of large amounts of information, while constructed-response items (e.g., short-answer and essay questions) more effectively measure higher-level thinking (Rudolph et al., 2019).
Asynchronous Formative Assessments: Quizzing tools can also be used for formative assessments that help students learn and prepare for summative assessments (Davis, 2011; Rausch & McKenna, 2020). If you plan to use quizzing for a summative assessment, consider incorporating similar question types in earlier formative assessments, as familiarity with question format positively impacts performance on exams (Rudolph et al., 2019).
Synchronous Formative Assessments: To increase engagement in synchronous sessions, you can incorporate time for students to take practice quizzes and discuss their problem-solving approaches in small groups. Some studies suggest that this practice of collaborative quizzing can improve final exam performance (Rezaei, 2015), while others point to increased student motivation and perception of learning (Burgess & Medina-Smuck, 2018; Clinton & Kohlmeyer, 2005).
Preparation for Synchronous Activities: If you are planning a synchronous activity, you can use quizzing tools to assess prior knowledge, review concepts, and/or generate questions for live discussions or guest speakers in advance. Research suggests that this kind of preparation enhances the synchronous experience (Herreid & Schiller, 2013; Ji et al., 2021).
Follow-Ups to Asynchronous Content: You can also use graded or ungraded quizzes to follow up on asynchronous material (readings, lecture videos, text on lecture pages, etc.). Research suggests that question type can shape not only the learning an exam measures but also the learning itself (Jensen et al., 2014; Wright et al., 2018). Careful question creation can help students identify important themes and retain information more readily.
Quiz Considerations
When you are designing exams and quizzes for your multimodal course, you might wish to keep the following in mind:
The context of an assessment (including, but not limited to, its setting, modality, and perceived stakes) can impact learning outcomes. When you are constructing quiz or exam questions, be thoughtful about the kind of learning the assessment needs to measure and frame the exam accordingly.
Asynchronous assessments do not lend themselves to real-time assistance or adjustments. In a synchronous classroom (a lecture hall, say), exam takers can ask clarifying questions and instructors can make mid-exam adjustments. In an asynchronous environment, students do not have access to their instructor while they are taking the exam. For asynchronous assessments, then, it is especially important to write questions that contain all of the information students need to arrive at the correct answers. You might also consider giving everyone credit for questions with confusing stems or answer choices.
Asynchronous assessments can raise questions about plagiarism or dishonesty. When designing asynchronous assessments, then, consider how you can build academic integrity into the fabric of your course (Everspring, 2022). If you want to change your exam each term but also want to limit your revisions, consider having students answer the same questions about a different reading or apply the same analysis to a different data set.
In a multimodal classroom, clarity, consistency, and timely feedback are key. Swan (2003) suggests that learning outcomes are positively impacted by quick feedback and by clarity and consistency in course design, and negatively impacted by difficult or confusing interactions with technology. In a multimodal course, where students are expected to engage in multiple types of learning, it is especially important to establish consistent uses and cadences for quizzing and to provide clear instructions and timely feedback; doing so helps offset the cognitive load that is, to some degree, inherent to hybrid learning.
References
Burgess, A., & Medina-Smuck, M. (2018). Collaborative testing using quizzes as a method to improve undergraduate nursing student engagement and interaction. Nursing Education Perspectives, 39(3).
Clinton, B. D., & Kohlmeyer, J. M. (2005). The effects of group quizzes on performance and motivation to learn: Two experiments in cooperative learning. Journal of Accounting Education, 23(2), 96–116.
Cheng, A.-C., Jordan, M. E., & Schallert, D. L. (2013). Reconsidering assessment in online/hybrid courses: Knowing versus learning. Computers & Education, 68, 51–59. https://doi.org/10.1016/j.compedu.2013.04.022
Davis, K. A. (2011). Using low-stakes quizzing for student self-evaluation of readiness for exams. 2011 Frontiers in Education Conference (FIE), F3D-1. https://doi.org/10.1109/FIE.2011.6142954
Everspring. (2022, September 29). Academic integrity in assessment. Envision by Everspring. https://www.envision.everspringpartners.com/plan/academic-integrity-in-assessments
Herreid, C. F., & Schiller, N. (2013). Case studies and the flipped classroom. Journal of College Science Teaching, 42(5), 62–66.
Jensen, J. L., McDaniel, M. A., Woodard, S. M., & Kummer, T. A. (2014). Teaching to the test…or testing to teach: Exams requiring higher order thinking skills encourage greater conceptual understanding. Educational Psychology Review, 26(2), 307–329. https://doi.org/10.1007/s10648-013-9248-9
Ji, H., Jain, P., & Axinn, C. (2021). Student perceptions of guest speakers in strategic communication courses. Journal of Public Relations Education, 7(1), 40–79.
Rausch, T., & McKenna, K. (2020). Low stakes quizzing: A tool for practice not assessment. American Association for Adult and Continuing Education.
Rezaei, A. R. (2015). Frequent collaborative quiz taking and conceptual learning. Active Learning in Higher Education, 16(3), 187–196.
Rudolph, M. J., Daugherty, K. K., Ray, M. E., Shuford, V. P., Lebovitz, L., & DiVall, M. V. (2019). Best practices related to examination item construction and post-hoc review. American Journal of Pharmaceutical Education, 83(7), 1492–1503. https://doi.org/10.5688/ajpe7204
Swan, K. (2003). Learning effectiveness: What the research tells us. In J. Bourne & J. C. Moore (Eds.), Elements of quality online education: Practice and direction (pp. 13–45). Sloan Center for Online Education.
Wright, C. D., Huang, A., Cooper, K., & Brownell, S. (2018). Exploring differences in decisions about exams among instructors of the same introductory biology course. International Journal for the Scholarship of Teaching and Learning, 12(2). https://doi.org/10.20429/ijsotl.2018.120214