- AI chatbots give misleading medical advice 50% of the time, study finds The Japan Times —
- AI chatbots give misleading medical advice 50% of the time, study finds The Straits Times —
- Approximately half of AI's medical responses are "problematic," study finds CBS News —
- AI chatbots often ‘hallucinate’ and give inaccurate medical information – study LBC —
- Warning issued over using AI chatbots for medical advice: major study finds information given by ChatGPT, Gemini and Grok is often inaccurate Daily Mail —
- AI chatbots often ‘hallucinate’ and give inaccurate medical information – study Belfast Telegraph —
- AI chatbots often ‘hallucinate’ and give inaccurate medical information – study The Standard —
- AI Chatbots Give Misleading Medical Advice 50% of the Time, Study Finds Bloomberg —
- We are AI experts. Here are the dangers of using chatbots for health and medical information The Independent —
AI medical advice often inaccurate
The study analyzed responses to 50 common medical questions and found that roughly half of the information provided was "problematic" or incomplete.
Although the chatbots often delivered their answers with high confidence, none produced a fully accurate reference list to support their claims.
Experts are now urging the public to treat AI-generated health advice with extreme caution and to consult qualified professionals for medical concerns.
The findings highlight the urgent need for better regulation and transparency as generative AI becomes increasingly integrated into daily life.
Nicholas B Tiller
Researcher