in collaboration with Dr Jasmine Meysman, Faculty of Applied Engineering, University of Antwerp

Introduction

In recent years, the focus on blended learning and distance learning has increased significantly, which has also created a need for alternative forms of assessment. Open-book exams, in which students are allowed to use their own learning materials or other resources during an exam, are an interesting option to consider in this regard.
In this teaching tip, we will first discuss when an open-book exam makes sense. We will then formulate practical design and assessment tips while also discussing the impact of artificial intelligence tools, which are becoming more widely available.

When does it make sense?  

In open-book exams, the emphasis is on demonstrating understanding, critically processing, interpreting and assessing information, and formulating reasoned judgments (KU Leuven, 2022). Open-book exams require students to apply knowledge and skills, whether in real-life scenarios or in realistic cases. They allow conceptual understanding and higher-order skills – such as applying, analysing, synthesising and assessing – to be tested through more complex tasks (Deneen, 2020). It is therefore important, when opting for an open-book exam, that such final competences are central to your programme component (Parker et al., 2021; ECHO Teaching Tip: ‘Meet wat u moet meten’, 2013).

Performing complex tasks that require interpreting and synthesising information from different sources within a (strict) time limit is a realistic professional challenge. In this sense, open-book exams are well suited to assess students' ability to function in a simulated professional context (Deneen, 2020). 

Open-book exams can also provide a solution for remote assessment (Deneen, 2020). When students are not (or need not be) restricted in the use of resources and their own learning materials, strict monitoring is less relevant. This also means that students do not necessarily have to take their exams on campus; this can (also) be done from home. The exam is then called a 'take-home exam'. Such take-home exams differ from ordinary group work or assignments in that they must be completed within a strict and short time frame. This can range from several hours to several days (Deneen, 2020; Bengtsson, 2019).

Testing via an open-book exam can give students a false sense of security, causing them to spend less time studying for the programme component and consequently retain less of the subject matter. Inadequate preparation puts them at risk of spending more time searching for answers during the exam than formulating them (Green et al., 2016). You can partly counter this by focusing heavily on teaching higher-order skills during your lectures (Green et al., 2016).

Design tips

When designing the exam questions, keep in mind that this is an open-book exam: avoid questions where the answer can be found directly in the subject matter. Develop exam questions where students can demonstrate that they have acquired critical understanding of the subject matter, can apply this understanding and can reflect on the learning content (Heriot-Watt University, 2023).

Open-book exams can, in principle, consist of both open-ended (essay) questions and closed (multiple-choice, multiple-answer) questions. The questions can be based on existing cases or realistic scenarios, and asking students to interpret data is another option (Bredon, 2003). In any case, be sure to provide sufficient background information, or – in the case of an open-web exam, for instance – have students search for additional information independently as part of the exam. If you choose the latter, adjust the time limit accordingly.

Communicate your expectations with regard to the exam clearly to your students: go over sample questions during your lectures, explain how students will be assessed, and provide clear instructions for the exam itself (Heriot-Watt University, 2023).

Assessment tips

Although an open-book exam does not necessarily have to consist of extensive open-ended questions, it usually does, because it mainly tests higher-order skills (see above). As a result, assessment is often quite time-consuming and rarely black and white (Deneen, 2020; Bengtsson, 2019).

Two useful assessment tools for open-book exams are checklists and rubrics. Checklists list all the topics, operations, concepts or other elements that a student must have mentioned or used in the answer to a question. When a student has correctly included a requested element in the answer, it is ticked off the list and the student scores a partial point on this question (Srinivasalu, 2016). Rubrics are two-dimensional assessment guides that use sets of expectations to assess the student. Rubrics allow for the nuanced evaluation of students on different competences according to different quality grades (Srinivasalu, 2016; ECHO Teaching Tip ‘Rubrieken als begeleidings- en beoordelingsinstrument’, 2017).  

When assessing open-book exams, a combination of a checklist and a rubric is often used. Make sure that both tools are of high quality, logically constructed, and aligned with the final competences of the programme component. The checklist is used to assess the content-related aspects of the answers, while the rubric is used to evaluate the predefined higher-order skills. You could communicate a general rubric to students in advance.
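
To make the combination concrete, below is a minimal scoring sketch in Python. It is purely illustrative: the checklist elements, rubric criteria, weights and quality levels are invented assumptions, and the half/half split between checklist content and rubric-assessed skills is just one possible design choice, not a recommendation from the cited literature.

    # Minimal sketch: combining a checklist and a rubric into one mark out of 20.
    # Every element, criterion, weight and level below is an invented example,
    # not a value prescribed by the cited literature.

    CHECKLIST = [
        "names the relevant theory",
        "applies the theory to the case",
        "supports claims with data",
        "states limitations",
    ]
    RUBRIC_LEVELS = 3  # quality grades per rubric criterion: 0 (poor) to 3 (excellent)

    def score_answer(ticked, rubric):
        """Half of the mark for checklist content, half for rubric-assessed skills."""
        checklist_part = 10 * sum(e in ticked for e in CHECKLIST) / len(CHECKLIST)
        total_weight = sum(weight for weight, _ in rubric.values())
        rubric_part = 10 * sum(
            weight * level / RUBRIC_LEVELS for weight, level in rubric.values()
        ) / total_weight
        return round(checklist_part + rubric_part, 1)

    # Example: 3 of 4 checklist elements present; analysis weighted most heavily.
    mark = score_answer(
        ticked={"names the relevant theory", "applies the theory to the case",
                "states limitations"},
        rubric={"analysis": (2.0, 2), "argumentation": (1.0, 3), "structure": (1.0, 1)},
    )
    print(mark)  # 7.5 (checklist) + 6.7 (rubric) = 14.2 out of 20

Making the weights explicit in this way also makes it easier to share a general rubric with students in advance, as suggested above.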

One of the biggest disadvantages of an open-book exam is the increased risk of fraud: students could commit plagiarism (ECHO Teaching Tip: ‘Plagiaat’, 2013), work together – in the case of a take-home exam – or engage third parties, who might even specialise in taking exams for a particular subject. Some tips to combat fraud (Bengtsson, 2019):

  • Ask students to refer directly to the teaching material in their answers. Penalise missing references and/or use plagiarism detection software.
  • Print your exam with a watermark. For an online exam, use a secure browser that prevents copying, saving or printing of exam questions. Check for possible cohort fraud based on statistical indicators (a minimal sketch follows after this list).
  • Provide a tight time limit to submit the exam, so that students cannot afford to spend time on anything other than completing the exam.
  • Plan additional oral commentary sessions during which you can gauge whether the students have actually mastered the material.
  • Design questions so that they require a thorough knowledge of the course material. This increases the cost of outsourcing the exam, which will discourage students from doing so.
  • Allow students to use certain programs and online resources, but block or restrict general internet access (e.g. via specialised software such as Avidanet).
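
A naive illustration of the statistical indicators mentioned in the list above is to compare all submitted answers pairwise and flag unusually similar pairs. The sketch below uses only Python’s standard library; the similarity threshold and the sample answers are invented assumptions, and a flagged pair is merely a reason for closer inspection (for instance in an oral follow-up), never proof of fraud.

    # Naive cohort-fraud indicator: flag pairs of unusually similar answers.
    # The threshold and the sample texts are invented assumptions.
    from difflib import SequenceMatcher
    from itertools import combinations

    def suspicious_pairs(answers, threshold=0.8):
        """Yield (student_a, student_b, similarity) for highly similar answer texts."""
        for (a, text_a), (b, text_b) in combinations(answers.items(), 2):
            similarity = SequenceMatcher(None, text_a, text_b).ratio()
            if similarity >= threshold:
                yield a, b, round(similarity, 2)

    answers = {  # hypothetical submissions
        "student_1": "Open-book exams mainly test higher-order skills such as analysis.",
        "student_2": "Open-book exams mainly test higher-order skills, such as analysis!",
        "student_3": "I would apply the framework from chapter 3 to this specific case.",
    }
    for pair in suspicious_pairs(answers):
        print(pair)  # e.g. ('student_1', 'student_2', 0.98)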

Implications of AI tools

As an experiment, we asked ChatGPT about the pros and cons of using AI tools for open-book exams. Artificial intelligence tools can facilitate open-book exams (and, by extension, student learning) (ChatGPT, 2023):

  • By using AI tools, students can generate initial ideas for answers faster than by manually browsing books and resources. Students can then use the time gained to arrive at well-structured answers to complex questions that require in-depth, personal analysis and reasoning.
  • By allowing students to use AI tools during their open-book exams, you encourage them to critically navigate a multitude of information and select relevant (partial) answers from it. It is obviously important, then, to teach the skills required to critically assess and use information from sources such as ChatGPT in advance (in the relevant programme component and/or throughout the study programme). Students should be trained to distinguish reliable and relevant information from less useful sources.
  • Being allowed to use AI tools in an open-book exam makes students aware of the potential of such technology and how it can be used as a tool for solving complex problems, not only during their studies, but also throughout their later careers.

However, the wide availability of AI tools can also pose a challenge in open-book exams (ChatGPT, 2023). There is a risk that students copy-paste answers directly from an AI tool without really understanding the subject matter, which obviously undermines the purpose of the exam. For assessors, it is not always clear whether an answer was crafted by the student on the basis of deeper insight or simply generated by an AI tool. This can lead to subjective assessments and incorrect results.

In short, it is important to strike a balance between allowing AI tools to be used in open-book exams on the one hand and ensuring deep student understanding and fair assessment on the other (ChatGPT, 2023). This can be done by making your exam questions sufficiently complex and specific, and by going over guidelines with the students beforehand, clarifying whether and how they can/may use AI tools in your open-book exam. Be sure to take into account the policies and general framework regarding AI and scientific integrity as applicable within your institution. (For UAntwerp staff: Beleidslijnen UAntwerpen met betrekking tot AI en wetenschappelijke integriteit, login required)

Want to know more?

How to get started: specific tips on how to draw up open-book exams

C. Deneen, “Assessment considerations in moving from closed-book to open-book exams,” The University of Melbourne, 2020, [Online].

Heriot-Watt University, “Assessments: Creating a take-home exam,” [Online].

J. B. Williams, “Creating authentic assessments: A method for the authoring of open book open web examinations,” in Proceedings of the 21st ASCILITE Conference, 2004, pp. 934–937.

Beleidslijnen UAntwerpen met betrekking tot AI en wetenschappelijke integriteit (accessible only to UAntwerp staff, login required)

ECHO Teaching Tips (in Dutch)

Meet wat u moet meten

Plagiaat

Rubrieken als begeleidings- en beoordelingsinstrument

Relevant literature

L. Bengtsson, “Take-home exams in higher education: A systematic review,” Educ. Sci., vol. 9, no. 4, 2019, doi: 10.3390/educsci9040267.

B. Bloom, M. Engelhart, E. Furst, W. Hill, and D. Krathwohl, “Taxonomy of educational objectives: The classification of educational goals,” in Handbook I: Cognitive domain, New York, Toronto: Longmans, Green, 1956.

S. Bloxham and P. Boyd, “Developing Effective Assessment in Higher Education: A Practical Guide,” 2007, [Online].

R. Brightwell, J.-H. Daniel, and A. Stewart, “Evaluation: is an open book examination easier?,” Biosci. Educ., vol. 3, no. 1, pp. 1–10, 2004, doi: 10.3108/beej.2004.03000004.

G. Bredon, “Take-Home Tests in Economics,” Econ. Anal. Policy, vol. 33, no. 1, pp. 52–60, 2003, doi: 10.1016/S0313-5926(03)50004-2.

A. Gharib, W. Phillips, and N. Mathew, “Cheat Sheet or Open-Book? A Comparison of the Effects of Exam Types on Performance, Retention, and Anxiety,” J. Psychol. Res., vol. 2, no. 8, pp. 469–478, 2012, doi: 10.17265/2159-5542/2012.08.004.

S. G. Green, C. J. Ferrante, and K. A. Heppard, “Using Open-Book Exams to Enhance Student Learning, Performance, and Motivation,” J. Eff. Teach., vol. 16, no. 1, pp. 19–35, 2016.

M. K. Ioannidou, “Testing and life-long learning: open-book and closed-book examination in a university course,” Stud. Educ. Eval., vol. 23, no. 2, pp. 131–139, 1997.

KU Leuven, “Onderwijslexicon Open boek examen / gesloten boek examen,” 2022.

M. D. Meeks, F. Williams, T. L. Knotts, and K. D. James, “Deep vs. surface learning: An empirical test of generational differences,” Int. J. Educ. Res., vol. 1, no. 8, pp. 1–16, 2013, [Online].

L. Myyry and T. Joutsenvirta, “Open-book, open-web online examinations: Developing examination practices to support university students’ learning and self-efficacy,” Act. Learn. High. Educ., vol. 16, no. 2, pp. 119–132, 2015, doi: 10.1177/1469787415574053.

A. M. Parker, E. Watson, N. Dyck, and J. P. Carey, “Traditional versus Open-Book Exams in Remote Course Delivery: A narrative review of the literature,” in Proceedings 2021 Canadian Engineering Education Association (CEEA-ACEG21) Conference, 2021, pp. 1–7.

G. N. Srinivasalu, “Open book evaluation system to improve the cognitive and analytical skills of the students of B.Ed. in social science,” J. Educ. Res. Ext., vol. 53, no. 4, pp. 46–51, 2016.

L. Suskie, “Assessing student learning: A common sense guide”, San Francisco, CA: Jossey-Bass, 2018.

J. B. Williams, “The place of the closed book, invigilated final examination in a knowledge economy,” Educ. Media Int., vol. 43, no. 2, pp. 107–119, 2006, doi: 10.2139/ssrn.1606343.


Read this tip in Dutch