this post was submitted on 10 Sep 2023
87 points (79.6% liked)
Technology
This really needs to be pinned to the top of every single discussion around ChatGPT:
It does not give answers because it knows. It gives answers because they look right.
Remember back in school when you didn't study for a test and went through picking answers that "looked right" because you vaguely remember hearing the words in Answer B during class at some point?
It will never have wisdom and intuition from experience, and that's critically important for doctors.
“Looks right” in a human context means the answer that matches a person’s actual experience and intuition. “Looks right” in an LLM context means the sequence of words has appeared together often in the training data (as I understand it, anyway - I am not an expert).
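A minimal sketch of what "words seen together often" means, using a toy bigram model. All of the words and counts here are invented for illustration; real LLMs use neural networks over vastly larger contexts, but the core idea of sampling the next word in proportion to how often it followed the previous one is the same:

```python
import random

# Toy "training data": for each word, how often each word followed it.
# These counts are made up purely for illustration.
bigram_counts = {
    "the": {"patient": 5, "dose": 3, "doctor": 2},
    "patient": {"has": 6, "reports": 4},
    "has": {"fever": 7, "pain": 3},
}

def next_word(word, rng=random.Random(0)):
    """Pick the next word in proportion to how often it followed
    `word` in the training data -- i.e. whichever 'looks right'."""
    counts = bigram_counts[word]
    words = list(counts)
    weights = [counts[w] for w in words]
    return rng.choices(words, weights=weights, k=1)[0]

# Generate a short continuation starting from "the".
w = "the"
out = [w]
for _ in range(2):
    w = next_word(w)
    out.append(w)
    if w not in bigram_counts:
        break  # no statistics for this word, stop generating

print(" ".join(out))
```

Note that nothing in this process checks whether the output is *true* - the model only knows which word sequences were statistically common, which is the commenter's point.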
Doctors are most certainly not choosing treatment based on what words they’ve seen together.