this post was submitted on 20 Jul 2023
663 points (97.4% liked)

Over just a few months, ChatGPT went from correctly answering a simple math problem 98% of the time to just 2%, a study finds. Researchers found wild fluctuations, called drift, in the technology's ability…
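
The drift measurement described here boils down to re-running a fixed set of prompts against the model on different dates and comparing accuracy; the study's headline math task was checking whether numbers are prime. A minimal sketch of that kind of check, assuming the 2023-era `openai` Python client (pre-1.0 API) and an `OPENAI_API_KEY` in the environment; the model name, prompt wording, and test numbers are illustrative assumptions, not the study's exact setup:

```python
# Minimal drift check: re-run a fixed set of simple math prompts and
# report accuracy, so runs on different dates can be compared.
# Assumes the 2023-era `openai` package (pre-1.0 API) with OPENAI_API_KEY
# set in the environment; model name, prompt wording, and the test
# numbers below are illustrative, not the study's exact setup.
import openai


def is_prime(n: int) -> bool:
    """Ground-truth primality test for small n (trial division)."""
    if n < 2:
        return False
    i = 2
    while i * i <= n:
        if n % i == 0:
            return False
        i += 1
    return True


def ask_model(n: int) -> str:
    """Ask the model a yes/no primality question and return its answer."""
    resp = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        temperature=0,  # reduce sampling noise between runs
        messages=[{
            "role": "user",
            "content": f"Is {n} a prime number? Answer only Yes or No.",
        }],
    )
    return resp["choices"][0]["message"]["content"].strip().lower()


def accuracy(numbers: list[int]) -> float:
    """Fraction of questions the model answers correctly."""
    correct = sum(
        ask_model(n).startswith("yes" if is_prime(n) else "no")
        for n in numbers
    )
    return correct / len(numbers)


if __name__ == "__main__":
    test_numbers = [101, 102, 103, 107, 111, 113, 119, 121, 127, 133]
    print(f"accuracy: {accuracy(test_numbers):.0%}")
```

Comparing the printed accuracy across model snapshots taken months apart is essentially how a swing like the reported 98%-to-2% drop would be observed.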

Wooly@lemmy.world 14 points 1 year ago

And they're being limited in the data they can use to train GPT.

DominicHillsun@lemmy.world 20 points 1 year ago

Yeah, but the trained model is already there; additional data is only needed for further training and newer versions. OpenAI even makes a point that ChatGPT doesn't have direct access to the internet for information and was trained on data available up until 2021.

Rozz@lemmy.sdf.org 5 points 1 year ago

And it's not like there's a limited supply of simple math problems it could train on, even if it weren't already trained.
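
True; arithmetic question/answer pairs can be synthesized in unbounded quantity. A tiny illustrative generator, purely a sketch and not OpenAI's actual data pipeline (the value ranges and operator set are made up):

```python
# Illustrative generator of simple arithmetic Q/A pairs; an unbounded
# supply of such problems can be produced programmatically. This is a
# sketch, not OpenAI's actual training pipeline.
import operator
import random

OPS = {"+": operator.add, "-": operator.sub, "*": operator.mul}


def make_problem(rng: random.Random) -> tuple[str, str]:
    """Return one (question, answer) pair with a known-correct answer."""
    a, b = rng.randint(0, 999), rng.randint(0, 999)
    sym = rng.choice(sorted(OPS))
    return f"What is {a} {sym} {b}?", str(OPS[sym](a, b))


if __name__ == "__main__":
    rng = random.Random(0)  # seeded so the sample output is reproducible
    for _ in range(3):
        question, answer = make_problem(rng)
        print(question, "->", answer)
```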

fidodo@lemmy.world 5 points 1 year ago

That doesn't make sense as an explanation for the degradation. A data limit would explain a stall, but not a backtrack.

Honestly, I think the training data is just getting worse too.