this post was submitted on 17 Jul 2023
Technology
Everyone's talking about this being used for hacking. I just want it to write me code to inject into running processes for completely legal reasons, but it always assumes I'm trying to be malicious. 😭
I was using ChatGPT to design a human/computer interface to let stoners control a light show. The goal was to collect data to train an AI to make the light show "trippier".
It started complaining about using untested technology to alter people's mental state, and about how experimentation on people wasn't ethical.
I'm sure you were joking, but try https://www.jailbreakchat.com/
Not joking, actually. The problem with jailbreak prompts is that they can get your account banned. I've already had one account banned, and eventually you can no longer use your phone number to create a new one.
Oh damn, I didn't know that. Guess I'd better be careful then.
Yeah, and even if you did something illegal, it could still be a benevolent act. Like when your government goes wrong and you have to take part in a revolution: there is a lot to learn, and LLMs could help the people.