AI use has grown so widespread that some people now trust ChatGPT with their medical treatment. Here is how dangerous that can be.

The use of Artificial Intelligence (AI) is increasing rapidly. People turn to tools like Google and chatbots such as ChatGPT for information big and small. But this habit can sometimes be dangerous. A case has come to light in America in which a man became so ill after following AI's advice that he had to be admitted to the ICU.

AI suggested a dangerous option

The man was very health-conscious and often read about the harms of table salt. One day he asked ChatGPT what could be used in place of salt. The AI suggested several options, one of which was sodium bromide. The chatbot said it could serve as an alternative to chloride, but did not mention that it can be dangerous for humans.

Trusting this advice, the man consumed sodium bromide for about three months without consulting a doctor. Everything was fine at first, but his health gradually deteriorated. He grew frequently confused, began having strange thoughts and became suspicious of the people around him. It got to the point where he believed his neighbor was poisoning him.

A rare case of bromide poisoning

Sodium bromide was once used to treat insomnia and anxiety, but it was phased out because of its severe side effects. Today it is found mostly in veterinary drugs and industrial products, so cases of human poisoning are extremely rare.

When his condition worsened, the man was taken to the hospital. On examination, doctors found he was suffering from bromide toxicity. He was immediately given intravenous fluids and antipsychotic medication. His condition gradually improved, and after a week he was speaking normally again. After three weeks of treatment, he was discharged from the hospital.

Doctors' warning

The doctors later reported that when they asked ChatGPT the same question, it again suggested bromide as an option without clarifying that it is unsafe for humans.

Experts say this incident shows that information from AI is not always complete or safe, especially where health and medicines are concerned. AI can list symptoms, but it will not necessarily explain all the possible causes and risks. Weight loss, for example, can be a symptom of cancer, but it also occurs in many other diseases. In health matters, therefore, one should always consult a doctor. Information from the Internet and AI can serve as an initial guide, but never as the basis for treatment.
