[–] Umbrias@beehaw.org 2 points 5 days ago (1 children)

You quite literally cannot trust them; the information entropy of what they produce is too high. I understand how much training they have on medical text; you don't understand how little that means. These models are fundamentally incapable of assessing the truth of a statement. You are using something you don't understand to get advice it cannot reliably give, you lack the expertise to judge how accurate any given answer actually is, and the topic directly affects your physical wellbeing!
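To make the entropy point concrete, here is a minimal sketch (Python, with entirely made-up probabilities for illustration): an LLM samples from a probability distribution over next tokens, and nothing in that process checks which continuation is true.

```python
import math

# Hypothetical next-token probabilities a model might assign when
# completing "the recommended dose is ..." -- these numbers are
# invented for illustration. Sampling picks one by probability;
# no step anywhere asks which answer is actually correct.
probs = {"500mg": 0.40, "250mg": 0.35, "50mg": 0.15, "5g": 0.10}

# Shannon entropy in bits: higher means probability is spread
# across mutually exclusive (and mostly wrong) answers.
entropy = -sum(p * math.log2(p) for p in probs.values())
print(f"{entropy:.2f} bits")  # ~1.80 bits across four incompatible doses
```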

"just try it bro it's good i promise" you should actually prompt an llm about a topic you know about in detail. the amount of errors are rampant, then apply that same inaccuracy to topics you know nothing about.

My next recommendation: since you are not a healthcare professional, do not give medical advice like "use an LLM," because you personally cannot verify how accurate an LLM is in this role.

[–] geneva_convenience@lemmy.ml 2 points 5 days ago (1 children)

If you want to visit a doctor for every minor thing, feel welcome. So far LLMs have correctly predicted every health issue I have had and provided better, more accurate information than the doctor visit afterwards.

This does not mean they are infallible. But you can easily check what they suggest and see whether the symptoms match other websites and the doctor's description.

[–] Umbrias@beehaw.org 1 points 4 days ago (1 children)

> If you want to visit a doctor for every minor thing, feel welcome. So far LLMs have correctly predicted every health issue I have had and provided better, more accurate information than the doctor visit afterwards.

lol no

"easily just do the same search you would have after" truly the llm is very helpful and not just an uncertainty adding middle step that through your own admission you rely on over medical professionals.

[–] geneva_convenience@lemmy.ml 1 points 4 days ago* (last edited 4 days ago) (1 children)

Do you believe doctors to be all-wise? They are similarly prone to error. Their heads are not Wikipedia.

[–] Umbrias@beehaw.org 2 points 4 days ago

Doctors carry liability, and they can regulate their own confidence and understand certainty. Also, lol, "similarly prone to error": no, human cognition is not a transformer.

Do not give medical advice. Neither you nor any LLM is licensed or capable of doing so. Yes, that means you should be held legally liable if that advice ever leads to harm, as should AI companies for convincing you and others of their grift.