Artificial Intelligence: Poisoned on AI's recommendation

A 60-year-old man who followed health advice from ChatGPT developed paranoia and hallucinations. He had replaced his table salt with a bromide salt.
Just ask ChatGPT: it's tempting to turn to a chatbot for quick advice in the kitchen, perhaps because you're out of baking powder for a cake. But a 60-year-old man from the USA recently learned that its confidently worded answers aren't always reliable, and can sometimes even be dangerous. His case was recently described by physicians led by Audrey Eichenberger of the University of Washington in the journal Annals of Internal Medicine: Clinical Cases. A dietary tip from ChatGPT led to a kind of poisoning in the man that is rarely seen these days.
Süddeutsche Zeitung