Panelists say the ability to question and judge content is now education’s top priority
AI can be a powerful tool in education, but overreliance on it may undermine critical thinking, panelists cautioned during a session titled "Mind over machines: Teaching students to outthink AI" at Gulf News Edufair.
While AI makes information more accessible than ever, educators agreed that universities must help students develop the skills to judge, analyse, and reflect on content, ensuring they can question both human- and machine-generated information.
The ability to discern accuracy, bias, and intent, educators said, must now be seen as a core literacy. Developing such judgment calls for deliberate teaching strategies, reimagined assessments, and well-designed toolkits that train students to think beyond what AI produces.
Building critical thinking in the AI era requires more than instruction – it demands structure and practice.
“We can’t simply preach. We need to go beyond preaching in universities,” said Dr S. Sudhindra, Pro Vice Chancellor at Manipal Academy of Higher Education (MAHE Dubai).
“What is required is a toolkit – tools that make reflecting and questioning a natural part of the learning process. Otherwise, they may over-rely on AI instead of adopting a better way of thinking.”
At Middlesex University Dubai, such toolkits extend beyond course content.
“When you talk about the toolkit, it cannot be limited to content alone. It has to cover how you structure that content, including assessments,” said Jaspreet Singh Sethi, Senior Lecturer in Computer Engineering and Informatics at Middlesex University Dubai.
“All of these need to change because AI can already recall facts better than us. We need to scaffold teaching so that students can leverage AI to cover the basics, and then reflect on what AI generates and add a human perspective.”
Dr Nidhi Sehgal, Head of School – Business & Humanities at Curtin University Dubai, said, “AI can certainly help students produce answers – they are already doing that – but the real challenge is to make it unavoidable for them to interrogate those answers. This means a major transformation in the way we teach, the way students learn, and even in our resources, toolkits, and assessments.”
All three educators agreed that reflection must be built into daily classroom practice. “As individuals using AI, there is often uncritical trust in the information it produces. We need to teach students how to question that,” said Sethi.
Reflective thinking, he added, can be encouraged through short, practical activities, such as peer reviews, draft evaluations, or guided discussions that compel students to pause and rethink their assumptions.
Dr Sudhindra explained how simple tools can foster deeper reflection. “Processes such as journaling or engaging in critical discussions with peers bring important questions to the table,” he said.
“We have to make students do this deliberately, through tools like journals or peer reviews. These practices force them to step back, reflect, and ask the right kind of questions.”
For Sehgal, reflection also changes how learning is measured. “When I look at student work, I’m less interested in what they wrote and more in how their learning and thinking changed during the process,” she said.
One classroom technique she uses is the “3-2-1” activity – three insights, two doubts, and one example – after an AI-assisted task, prompting students to move beyond reproduction to reasoning.
Educators cautioned that AI can appear objective while actually reinforcing users’ biases.
“AI has a tendency to provide more of whatever conforms to your own point of view because it is being trained on your inputs,” said Dr Sudhindra.
“That is why students need to learn how to seek information from a neutral standpoint, rather than assuming the answers.”
Sethi added that understanding bias and hallucination is foundational to AI literacy.
“The first question is whether students even understand the difference between the two,” he said. Middlesex University Dubai has introduced an in-house AI literacy course to address this. “The course ensures they gain foundational knowledge of how AI works, what data it draws on, and how it generates content. Only then can they identify bias and question the accuracy of information.”
Sehgal highlighted that this awareness must shape student behaviour. “AI is like the traditional first-bencher. It will always produce an answer, even if it’s wrong,” she said. “That’s why I ask my students to treat every AI-generated output as a hypothesis, not a claim.”
Panelists highlighted that broad, interdisciplinary exposure is key to judgment and balance. “Interdisciplinary learning depends on the curriculum – not every degree programme allows it. That’s where human-to-human engagement becomes important,” said Sethi.
“Guest speakers, open forums, and competitions where interdisciplinary teams work together can help students broaden their perspectives.”
Sehgal said that this understanding must be woven across disciplines. “Training students to outthink AI often happens in one-off workshops, but that’s not enough. It needs to be integrated into the curriculum, across disciplines,” she said. “This is not just about technical knowledge. It involves epistemology – how we know – and psychology – how we think.”
Ultimately, the educators agreed, teaching itself must evolve. “The entire education system needs to reshape itself to align with this new reality,” said Sehgal.
“We need to raise students as critical thinkers. They should not be naive believers that AI is always right, nor cynical disbelievers. They must learn to strike the right balance.”