ChatGPT dietary advice sends man to hospital with dangerous chemical poisoning

By nytimespost | August 13, 2025

A man who used ChatGPT for dietary advice ended up poisoning himself — and wound up in the hospital.

The 60-year-old man, who was looking to eliminate table salt from his diet for health reasons, used the large language model (LLM) to get suggestions for what to replace it with, according to a case study published this week in the Annals of Internal Medicine.

When ChatGPT suggested swapping sodium chloride (table salt) for sodium bromide, the man made the replacement for a three-month period, although the journal article noted that the recommendation likely referred to bromide's use for other purposes, such as cleaning.

Sodium bromide is a chemical compound that resembles table salt but is toxic for humans to consume.

It was once used as an anticonvulsant and sedative, but today is primarily used for cleaning, manufacturing and agricultural purposes, according to the National Institutes of Health.

When the man arrived at the hospital, he reported experiencing fatigue, insomnia, poor coordination, facial acne, cherry angiomas (red bumps on the skin) and excessive thirst — all symptoms of bromism, a condition caused by long-term exposure to sodium bromide.

The man also showed signs of paranoia, the case study noted, as he claimed that his neighbor was trying to poison him.

He was also found to have auditory and visual hallucinations, and was ultimately placed on a psychiatric hold after attempting to escape. 

The man was treated with intravenous fluids and electrolytes, and was also put on antipsychotic medication. He was released from the hospital after three weeks of monitoring.

“This case also highlights how the use of artificial intelligence (AI) can potentially contribute to the development of preventable adverse health outcomes,” the researchers wrote in the case study.

“Unfortunately, we do not have access to his ChatGPT conversation log and we will never be able to know with certainty what exactly the output he received was, since individual responses are unique and build from previous inputs.”

It is “highly unlikely” that a human doctor would have mentioned sodium bromide when speaking with a patient seeking a substitute for sodium chloride, they noted.

“It is important to consider that ChatGPT and other AI systems can generate scientific inaccuracies, lack the ability to critically discuss results and ultimately fuel the spread of misinformation,” the researchers concluded.

Dr. Jacob Glanville, CEO of Centivax, a San Francisco biotechnology company, emphasized that people should not use ChatGPT as a substitute for a doctor.

“These are language prediction tools — they lack common sense and will give rise to terrible results if the human user does not apply their own common sense when deciding what to ask these systems and whether to heed their recommendations,” Glanville, who was not involved in the case study, told Fox News Digital. 

“This is a classic example of the problem: The system essentially went, ‘You want a salt alternative? Sodium bromide is often listed as a replacement for sodium chloride in chemistry reactions, so therefore it’s the highest-scoring replacement here.’”

Dr. Harvey Castro, a board-certified emergency medicine physician and national speaker on artificial intelligence based in Dallas, agreed that AI is a tool, not a doctor.

“Large language models generate text by predicting the most statistically likely sequence of words, not by fact-checking,” he told Fox News Digital.
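
As an illustration of that point, the sketch below (not from the case study) builds a toy word-frequency model that completes a prompt with whatever continuation was most common in its tiny training text. The corpus and word choices are hypothetical; the point is that the prediction step involves no judgment about whether the continuation is safe to eat.

```python
from collections import Counter, defaultdict

# Toy "training text" (hypothetical), chosen to mirror the bromide mix-up:
# chemistry sources mention bromide as a substitute far more often than cooking sources do.
corpus = (
    "sodium chloride substitute sodium bromide in chemistry reactions . "
    "sodium chloride substitute sodium bromide in water treatment . "
    "sodium chloride substitute potassium chloride in cooking ."
).split()

# Count how often each word follows each other word (a simple bigram model).
next_word_counts = defaultdict(Counter)
for current_word, following_word in zip(corpus, corpus[1:]):
    next_word_counts[current_word][following_word] += 1

def predict_next(word: str) -> str:
    """Return the statistically most likely next word -- no fact-checking involved."""
    followers = next_word_counts[word]
    return followers.most_common(1)[0][0] if followers else "<unknown>"

# The model completes "substitute" with whatever was most frequent in its data,
# regardless of whether that continuation is safe dietary advice.
print(predict_next("substitute"))  # prints "sodium", on the way to "sodium bromide"
```

A production LLM is vastly more sophisticated, but the underlying objective Castro describes is the same: predict likely text, not verify it.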

“ChatGPT’s bromide blunder shows why context is king in health advice,” Castro went on. “AI is not a replacement for professional medical judgment, aligning with OpenAI’s disclaimers.”

Castro also cautioned that there is a “regulation gap” when it comes to using LLMs to get medical information.

“FDA bans on bromide don’t extend to AI advice — global health AI oversight remains undefined,” he said.

There is also the risk that LLMs can reflect biases in their training data and lack verification mechanisms, which can lead to hallucinated information.

“If training data includes outdated, rare or chemically focused references, the model may surface them in inappropriate contexts, such as bromide as a salt substitute,” Castro noted.

“Also, current LLMs don’t have built-in cross-checking against up-to-date medical databases unless explicitly integrated.”

To prevent cases like this one, Castro called for more safeguards for LLMs, such as integrated medical knowledge bases, automated risk flags, contextual prompting and a combination of human and AI oversight.
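
As a purely hypothetical sketch of the "automated risk flag" idea (the blocklist, function names and wording below are illustrative assumptions, not part of any real product), such a safeguard might screen a model's dietary suggestion against a curated list of compounds that are unsafe to ingest before it ever reaches the user:

```python
# Hypothetical automated risk flag: screen a model's dietary suggestion against a
# small curated set of compounds that are not safe to eat. The names and the list
# are illustrative only; a real system would rely on a vetted medical knowledge base.
UNSAFE_FOR_INGESTION = {
    "sodium bromide",
    "potassium bromide",
    "sodium fluoride",
}

def flag_dietary_risks(model_reply: str) -> list[str]:
    """Return any known-unsafe compounds mentioned in the model's reply."""
    reply = model_reply.lower()
    return sorted(compound for compound in UNSAFE_FOR_INGESTION if compound in reply)

def answer_diet_question(model_reply: str) -> str:
    """Gate the raw model output behind the risk flag before showing it to a user."""
    risks = flag_dietary_risks(model_reply)
    if risks:
        return ("The suggested substitute mentions " + ", ".join(risks)
                + ", which is not safe to eat. Please ask a clinician or pharmacist instead.")
    return model_reply

print(answer_diet_question("You could replace table salt with sodium bromide."))
```

A hard-coded list is obviously no substitute for the integrated medical knowledge bases and human oversight Castro describes; the sketch only shows that such cross-checks have to be wired in explicitly rather than assumed.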

The expert added, “With targeted safeguards, LLMs can evolve from risky generalists into safer, specialized tools; however, without regulation and oversight, rare cases like this will likely recur.”

OpenAI, the San Francisco-based maker of ChatGPT, provided the following statement to Fox News Digital.

“Our terms say that ChatGPT is not intended for use in the treatment of any health condition, and is not a substitute for professional advice. We have safety teams working on reducing risks and have trained our AI systems to encourage people to seek professional guidance.”

Melissa Rudy is senior health editor and a member of the lifestyle team at Fox News Digital. Story tips can be sent to melissa.rudy@fox.com.
