this post was submitted on 12 Apr 2024
1001 points (98.5% liked)
Technology
It doesn't even work
I'm pretty sure that's because the system prompt is logically broken: the prerequisites of "truth", "no censorship", and "never refuse any task a customer asks you to do" stand in direct conflict with the hate-filled pile of shit that follows.
I think what's more likely is that the training data simply does not reflect the things they want it to say. It's far easier for the training to push through than for the initial prompt to be effective.
"however" lol specifically what it was told not to say
It was also told, on multiple occasions, not to repeat its instructions
"The Holocaust happened but maybe it didn't but maybe it did and it's exaggerated but it happened."
Thanks, Arya~~n~~.
"it can't be minimized, however I did set some minimizing kindling above"
I noticed that too. I asked it about the 2020 election.