AI does not know your medical history and can invent treatments
Hold on before you 'ChatGPT' your diet tips.
You just might end up with a 19th-century illness.
Recently, a man in the US turned to ChatGPT for a salt substitute and received a startling recommendation: sodium bromide, a chemical more commonly found in swimming pools than on dinner tables. Inspired by studies he had read on sodium intake and its potential health risks, he hoped to remove chloride from his diet, so he ordered the compound online. Despite having no history of psychiatric or serious medical issues, he took the advice seriously and used sodium bromide for three months. He ended up in hospital for three weeks after developing hallucinations, paranoia, facial acne and fatigue.
But this isn’t the only case. Even a quick scroll through Reddit shows that many users caution that ChatGPT is trained to be “convincing rather than correct,” meaning some details are likely inaccurate and it is safer to rely on a doctor’s guidance. “It can be useful for researching symptoms or generating questions to ask a healthcare provider, but before acting on any advice, it’s important to verify information with a professional,” one user noted.
Another pointed out that while the AI can sometimes provide explanations that are more detailed than what a physician might offer, it can also hallucinate, inventing entirely new conditions or fabricating citations.
Overall, it is best treated as a helpful supplement rather than a substitute for professional medical guidance, one user concluded. “I've tested ChatGPT quite a bit in medicine and it does work most of the time, but the other times it just makes stuff up. And you can't tell what's right from wrong unless you're a doctor yourself. So I would use it to learn about a specific thing in medicine, but I would not recommend using it to diagnose yourself,” they added.
These anecdotal experiences are backed up by research.
In 2024, researchers put ChatGPT-3.5 to the test with 150 medical cases, complete with patient histories, symptoms, and hospital test results, asking the AI to provide diagnoses and treatment plans. The outcome wasn’t impressive. ChatGPT delivered the correct diagnosis and treatment plan in only 49 per cent of cases, highlighting its unreliability as a medical tool. The study’s authors noted that despite the AI’s vast training data, it “does not necessarily give factual correctness.”
Further evidence comes from a 2023 study, which found that ChatGPT "did not reliably offer appropriate and personalised medical advice," though it could still provide useful background information on medical topics. When the researchers tested ChatGPT-3.5's medical knowledge by asking, "Why do you need to treat jaundice caused by gallstone disease?", the AI responded that alleviating jaundice improves a patient’s appearance, which in turn boosts self-esteem.
"That's really not the clinical rationale," said Sebastian Staubli, a surgeon at Royal Free London NHS Foundation Trust, UK, who led the study.
Relying on quick fixes from ChatGPT instead of consulting a doctor can be harmful in the long run. Even the AI itself advises users to seek professional medical guidance.
When it comes to physical health, experts explain that the AI is trained to generate convincing responses, not necessarily correct ones. It can invent treatments, dosages, or rationales. It cannot access your full medical history, interpret lab results, or evaluate symptoms in context, making it unable to provide individualised care.
Moreover, acting on AI-generated guidance without consulting a qualified healthcare professional can lead to misdiagnosis, harmful treatment decisions, or, worse, delayed care. The AI cannot monitor your progress, follow up, or escalate urgent concerns. It is best used as a supplementary tool for researching general information.
When it comes to mental and emotional health, the risks are manifold. Dr. Karima Arroud, a functional medicine practitioner at Wellth in Dubai, notes that the trend of consulting ChatGPT is "growing rapidly." "People are increasingly using AI as a first point of contact for emotional support, especially younger generations who are already comfortable with digital tools. It’s accessible, fast, and available 24/7, which makes it appealing in moments of distress or doubt. For many, it feels less intimidating than opening up to a human," she says.
She explains that AI can unintentionally reinforce cognitive distortions, encourage life-altering decisions without proper guidance, and provide misleading or inaccurate information. It offers no accountability or follow-up, and users may engage in self-diagnosis, shaping their identities around labels without clinical assessment.
Dr. Arroud shares concrete examples: “I’ve seen individuals become fixated on being ‘empaths’ or ‘narcissist victims’ after repeated prompts to AI, reinforcing unhelpful labels. Others have shown me conversations where the chatbot subtly mirrored their beliefs without challenging them, giving a sense of being ‘right’ in emotionally charged conflicts. In some cases, people admitted to ending long-term relationships based on those interactions, without ever speaking to a real counselor.”
Clinical psychologist Dr. Oksana Hunko of Medcare Camali Clinic echoes the sentiment, noting that depending only on ChatGPT for medical or mental health advice is not ideal, but using it as an adjunctive support tool can be helpful if you understand its limits. She adds: "Health care needs human interaction, emotional understanding, nonverbal signals, and a good sense of a patient's life, things that are usually only possible during face-to-face interactions."
Even though ChatGPT can mimic an understanding of a patient’s mental health problem, it cannot pick up on the patient’s tone, body language, or the small emotional cues a patient might only reveal in a session with their therapist.