14-Year-Old Boy Misdiagnosed By ChatGPT With Gastric Infection; Turns Out He Had An Anxiety Attack Caused By Bullying

A 14-year-old asked ChatGPT about his bodily discomfort, and the AI chatbot convinced him he had a gastric infection. The boy was rushed to the ER at Apollo Hospital, Mumbai, where his mother told doctors that ChatGPT had diagnosed her son’s condition as such.

Dr. Rituprana Ghosh, a clinical psychologist at the hospital, was brought in for further evaluation. The result? She diagnosed the boy as suffering from an anxiety attack. Reportedly, the boy had the episode after being relentlessly bullied by seniors at school every day. This led to anxiety attacks, which can mimic physical discomfort in the form of cramps, stomach pain and the like.

During a stressful event, the body directs blood flow away from the digestive system, and the resulting symptoms can be misdiagnosed, especially by AI. The doctor read the signs in the boy’s behavior: he avoided eye contact while talking and was trembling. As reported in The Indian Express, upon further probing he revealed his ordeal back at school.

See Also: ChatGPT Fails At Diagnosing Child Medical Cases. It’s Wrong 83 Percent Of The Time.

See Also: Study Finds Heavy ChatGPT Usage Affects Brain Function Adversely; Internet Says ‘We Are Cooked’

More of these AI misdiagnosis gems can be found online, even as some users claim otherwise. Quoting a study, one netizen reminded everyone that “ChatGPT misdiagnosed 83% of children’s health conditions.” Another user, quoting yet another study, remarked, “Reminder: do not trust ChatGPT with your mental health; multiple people have been involuntarily committed due to psychosis triggered by chatting with GPT.” A third user, citing the same, stated, “Many ChatGPT users are developing all-consuming obsessions with the chatbot, spiraling into severe mental health crises characterized by paranoia, delusions, and breaks with reality.” Yet another user quipped, “I told ChatGPT that I’ve had chest pain since yesterday, and this bitch is saying I’m about to die.”

“The number one issue with ChatGPT is that it can never admit when it doesn’t know something. It must always guess, even if it’s a completely baseless guess. If you call out an incorrect guess, and tell it to stop guessing, it will simply proceed with another incorrect guess,” a user reminded.

Cover: Illustrative / Pexels

See Also: ‘Path To Medical Superintelligence’ Microsoft Claims To Build Superior AI Tool That Outperforms Doctors In Diagnosis

See Also: Mother Of Two Thanks ChatGPT For Saving Her Life By Detecting Cancer When Doctors Assumed It Was Arthritis

Great job James Paul & the team @ Mashable India Tech for sharing this story.


Felicia Ray Owens — https://feliciaray.com
Happy wife of Ret. Army Vet, proud mom, guiding others to balance in life, relationships & purpose.
