Warning issued for people using AI chatbots for medical advice: Major study found information given by ChatGPT, Gemini and Grok is often inaccurate
AI chatbots consistently give "highly problematic" medical advice, with few caveats or disclaimers, that could pose a substantial risk to users, experts have warned.
AI chatbots give misleading health advice
- AI chatbots give misleading medical advice 50% of the time, study finds (The Japan Times)
- AI chatbots give misleading medical advice 50% of the time, study finds (The Straits Times)
- AI Chatbots Give Misleading Medical Advice 50% of the Time, Study Finds (Bloomberg)
- AI chatbots often 'hallucinate' and give inaccurate medical information, study finds (LBC)