'Rectal garlic insertion for immune support': Medical chatbots confidently give disastrously misguided advice, experts say
Summary
A recent study highlighted the concerning tendency of AI chatbots to confidently dispense incorrect medical advice. Researchers found that chatbots often accept false claims when they are presented in formal, clinical language, in one case endorsing "rectal garlic insertion for immune support." The chatbots appear to prioritize the authoritative tone of clinical phrasing over verifying the accuracy of the underlying claim: while they can flag misinformation written in casual language or built on obvious logical fallacies, they struggle with formally worded falsehoods. A separate study found that chatbots offer no more insight than a standard internet search. Experts warn that, despite achieving high scores on medical exams, these chatbots are unreliable sources of public health information and should not be used as a substitute for professional medical advice.
(Source: Live Science)