+1 vote
in General Factchecking by Apprentice (1.4k points)
Some are saying that ChatGPT has better bedside manner than most doctors and could almost pass a medical licensing exam. Is this true?

12 Answers

+1 vote
by Apprentice (1.1k points)
According to members of the American Medical Association, using ChatGPT now for diagnosis of a complex medical condition holds “high potential for harm,” because most generative models are trained on popular materials that may contain misinformation or purposeful disinformation instead of rigorously peer-reviewed scientific literature. “I would love to be able to say: We programmed ChatGPT with 10 million deidentified patient charts, and patients like the one in front of you had the following treatments from other clinicians in the past,” Dr. Halamka said. “That would be lovely. We're not there yet.”

According to this information, the AI we have right now isn't up to par with what would be needed to accurately make diagnoses for patients.

Source: https://www.ama-assn.org/practice-management/digital/why-generative-ai-chatgpt-cannot-replace-physicians
+1 vote
by Novice (740 points)

While the claim itself seems hard to prove true — ChatGPT cannot exist outside of technology, so it cannot physically aid patients — the question of whether it could pass a medical licensing exam is a bit different. ChatGPT draws on its entire training database to improve itself, and therefore can display very good bedside manner and considerable medical knowledge. An article by CNN explains that ChatGPT cannot replace doctors, and it even says so itself. If you ask ChatGPT whether it could take the role of a doctor, it responds, "While I am a language model that has been trained on a vast amount of information, I am not a licensed medical professional and I am not capable of providing medical diagnoses, treatments, or advice." So I think it is fair to say the claim that ChatGPT can replace doctors is false, but the question of whether it has good bedside manner or some database of medical knowledge could be true.

https://www.cnn.com/2023/04/28/health/chatgpt-patient-advice-study-wellness/index.html

Exaggerated/Misleading
