A 14-year-old asked ChatGPT about bodily discomfort, and the AI chatbot convinced the boy he had a gastrointestinal infection. He was rushed to the ER at Apollo Hospital, Mumbai, where his mother told the staff that ChatGPT had diagnosed her son's condition.
Dr. Rituparna Ghosh, a clinical psychologist at the hospital, was brought in for further evaluation. The result? She diagnosed the boy as suffering from an anxiety attack. Reportedly, the episode came after he had been relentlessly bullied by seniors at school every day. The bullying led to anxiety attacks, which can mimic physical discomfort in the form of cramps, stomach pain, and the like.
During a stressful event, the body's blood flow is directed away from the digestive system, so the resulting symptoms can easily be misread, especially by an AI. The doctor picked up on the boy's behavior: he would not make eye contact while talking and was quivering. As reported in the Indian Express, on further probing he revealed his ordeal at school.
See Also: ChatGPT Fails At Diagnosing Child Medical Cases. It’s Wrong 83 Percent Of The Time.
See Also: Study Finds Heavy ChatGPT Usage Affects Brain Function Adversely; Internet Says ‘We Are Cooked’
More of these AI misdiagnosis gems can be found online, even as some users claim otherwise. Quoting a study, one netizen recalled how "ChatGPT misdiagnosed 83% of children's health conditions." Another user, citing a different study, remarked, "Reminder: do not trust ChatGPT with your mental health; multiple people have been involuntarily committed due to psychosis triggered by chatting with GPT." A third user, citing the same, stated, "Many ChatGPT users are developing all-consuming obsessions with the chatbot, spiraling into severe mental health crises characterized by paranoia, delusions, and breaks with reality." Yet another user quipped, "I told ChatGPT that I've had chest pain since yesterday, and this bitch is saying I'm about to die."
"The number one issue with ChatGPT is that it can never admit when it doesn't know something. It must always guess, even if it's a completely baseless guess. If you call out an incorrect guess, and tell it to stop guessing, it will simply proceed with another incorrect guess," one user pointed out.
ChatGPT Misdiagnosed 83% of Children’s Health Conditions • Children’s Health Defense https://t.co/KPwm9oXbqA
— rayw45 (@rayw45) January 10, 2024
I told ChatGPT that I’ve had chest pain since yesterday, and this bitch is saying I’m about to die.
— Nimra Bajwa (@missingpagessss) July 3, 2025
The consequences can be dire. As we heard from spouses,… pic.twitter.com/foTJE6tOCW
— DIRECT PERCEPTION (@Just_Perceive) June 30, 2025
The number one issue with ChatGPT is that it can never admit when it doesn’t know something. It must always guess, even if it’s a completely baseless guess. If you call out an incorrect guess, and tell it to stop guessing, it will simply proceed with another incorrect guess.
— M. Nolan Gray (@mnolangray) December 31, 2024
Reminder: do not trust ChatGPT with your mental health; multiple people have been involuntarily committed due to psychosis triggered by chatting with GPT https://t.co/nFJUnl6RM0 #ArtificialIntelligence #artificial_intelligence #Innovation #Technology #Tech #TechNews pic.twitter.com/AtR4WMjKjw
— Tim Hughes 提姆·休斯 (@Timothy_Hughes) June 30, 2025
my gp used chatgpt to misdiagnose me with anxiety xx
— NOT spongebob of shedtwt (@qiueky) March 22, 2025
Great job by James Paul and the team at Mashable India for sharing this story.