From heart attacks to skin infections: how AI led to wrong diagnoses

Artificial Intelligence (AI) tools like ChatGPT are increasingly being used by people seeking quick medical advice, but Dubai doctors warn that self-diagnosing through AI can do more harm than good. Several physicians have shared alarming cases where patients relied on AI-generated responses, leading to delayed or wrong treatments.
Dr Alok Chaubey, Medical Director and Specialist General Surgeon at Prime Medical Center, Oasis Mall branch, said ChatGPT can be an educational tool for trained professionals, but a risky one in the hands of laypersons.
“There’s a difference between gaining medical education and becoming a professional. It takes years of study, experience and clinical judgment to diagnose correctly. ChatGPT can be a good tool in the hands of medical professionals, but a sword in the wrong hands can cause serious harm.”
Dr Chaubey cited an example from his own experience: a patient experiencing tingling, numbness, and leg pain was advised by ChatGPT to take gabapentin, a controlled drug.
“The actual problem was just a vitamin B12 and D deficiency,” he said.
Dr Azeem Irshad, Specialist in Internal Medicine at Aster Clinic, Al Nahda, noted that AI platforms have opened “remarkable opportunities” for health education and patient engagement but warned they can never replicate the clinical reasoning of a trained physician.
“I had a patient who self-managed ‘chest tightness’ based on online AI suggestions with antacids (used to relieve symptoms of heartburn and indigestion) and later presented with an evolving myocardial infarction (commonly known as a heart attack). Had the patient not sought timely medical attention, the outcome could have been life-threatening. Fortunately, early consultation and intervention prevented a heart attack and potentially saved the patient’s life,” Dr Irshad said.
“I’ve seen individuals delay seeking care for persistent fever after reading that it might be viral, only to be diagnosed later with typhoid or an autoimmune disease.”
He added that AI suggestions often generalise symptoms, missing nuances like coexisting conditions or subtle physical signs that doctors pick up during examination.
“Also, some patients who initially relied on AI tools for prolonged fatigue or abdominal discomfort, attributing it to lifestyle or stress, were later diagnosed with conditions such as hypothyroidism or inflammatory bowel disease, respectively,” Dr Irshad said, noting that the safest approach is “AI-assisted, doctor-led care”, where technology supports but never substitutes for professional judgment.
Dr Nishit Bodiwala, Specialist Dermatologist at Prime Medical Center, said dermatology has seen an influx of patients whose skin conditions worsened after self-treatment suggested by AI.
“A 38-year-old man had severe itching in his groin and inner thighs. ChatGPT told him it was contact dermatitis and suggested cortisone cream. It was a fungal infection,” he said.
“In another case, a 44-year-old woman took oral steroids for flu-related hives on AI advice; the steroids worsened her infective urticaria. She actually needed antibiotics and flu treatment.”
Dr Bodiwala stressed that accurate diagnosis in skin diseases relies on physical examination and the doctor’s experience.
“The role of ChatGPT in dermatology is supplementary, but it cannot replace a clinician’s expertise, physical examination, or detailed patient history, all of which are essential for accurate diagnosis and effective treatment.”
Doctors agree that tools like ChatGPT can help patients understand symptoms, prepare better for consultations, and promote awareness, but using AI as a substitute for medical care can be dangerous.
As Dr Irshad summed it up: “AI can inform, but it’s the physician who interprets and heals.”