ChatGPT may be able to provide accurate answers to common questions about diabetes care, but some of its responses lack nuance because it is trained on a general database rather than a medical one, researchers wrote in Diabetes Care. ChatGPT also lacks the most up-to-date information and is prone to "hallucination," in which it presents inaccurate information in a persuasive way, the researchers added.
Full Story: Medscape (free registration) (4/3)