

Don’t straitjacket Generative AI as only helping cheat at essays

Educators should not rush to judgement on what ChatGPT and its kind can do



What ChatGPT and Generative AI need is more objective scrutiny from the education sector. It's not all about students cheating on essays.

Generative AI - the technology that uses machine learning algorithms to create original content in the form of text, images and video - has been a topic of much discussion in the education sector.

Many praise its potential to personalize learning and reduce costs, while others voice concern about the biases it may perpetuate, the academic dishonesty it may enable, and the disruption it may cause in the classroom. Is the education sector reading too much into generative AI and its likely disruptions at this early stage?

Operationally, this is not a new risk to academic integrity but a technological evolution of an existing one. In the past, students could hire a ghostwriter to compose their assignments. With ChatGPT, they can instruct an AI instead of a person, at no cost and with near-immediate turnaround, depending on the output required.

Extract the best use cases from Generative AI

Cases where students submit a well-written report but struggle to answer even basic questions in a viva have increased. Academics have developed authentic assessments to mitigate the risk of ‘essay mills’ and should further develop and use this knowledge within assessment design to meet the challenges posed by AI.

Instead of viewing ChatGPT as a cheating tool, educators could embrace it as a teaching tool. It can provide basic answers quickly, freeing up class time for discussion, exploration, and critical pedagogy, which develops the higher-order thinking skills that will keep students ahead of AI technology.


Certain AI-generated content may include factual errors - or ‘hallucinations’ - and improper referencing. While these issues may be recognizable to educators, they could be missed by detection software. AI technology may be capable of replicating some aspects of the human experience within a module, but it cannot replace the dynamics of group work, or the peer pressure and recognition that act as powerful psychological forces motivating students to engage and contribute.

Managing or restricting the use of source material in assessments can also help reduce the potential for plagiarism by specifying where information can be drawn from.

Reflecting biases

There are valid concerns that these systems are trained on data that may reflect biases within society, which could result in those becoming embedded in the AI itself. Experts fear that Generative AI could become a substitute for teachers, leading to a reduction in the quality of education and the loss of jobs in the profession.

On the other hand, there are many potential benefits: Generative AI could help personalize learning by creating individualized lesson plans tailored to students’ needs and interests. This could lead to improved learning outcomes and a more engaging experience for students.

Further, Generative AI could help reduce the cost of education by creating content more efficiently and more cheaply than human writers or designers. This could help make education more accessible to students who may not have the resources.


It is important to approach Generative AI with a critical eye, while also exploring its potential to improve education outcomes. Academic dishonesty and AI hallucination are two issues that must be addressed to ensure AI is used responsibly. Educators must ensure they are not reading too much into AI’s potential and continue to prioritize critical thinking, analysis, and creativity in teaching methods.

Dr. Noor Ulain Rizvi
The writer is with Curtin University Dubai.