Session 30: What happens to academic integrity when we allow students to use GenAI for their assessment?

Authors

  • Olatunde Durowoju, Liverpool John Moores University, Faculty of Business & Law

Abstract

The reluctance of academics to embrace students' use of Generative AI (GAI) in Higher Education (HE) stems primarily from concerns about preserving academic integrity. The advanced capabilities of GAI, particularly its capacity to generate content on request, challenge the conventional preparatory processes (research and writing) integral to students' assessments. Within academic integrity, the cornerstone principle of 'ownership of work' requires that students personally conduct the research, clearly acknowledge contributions made by others, and present their findings to an acceptable standard. This poses a challenge for institutions that rely on the traditional expectation that students independently produce every aspect of their work in order to assure the quality of their graduates. However, the essential question remains: does the integration of GAI for assessment purposes entirely compromise the quality that stems from students' independent effort? To address this question and scrutinise the definition of 'ownership of work' within the framework of academic integrity, we conducted action research. Students were tasked with completing research and writing assignments using prompt engineering with ChatGPT, followed by a debriefing session to explore their experiences and perceptions. This study seeks to shed light on the nuanced relationship between GAI, academic integrity, and the quality of work produced by students in Higher Education.

Published

2024-07-18

Section

Presentations