this post was submitted on 30 Sep 2023
874 points (98.8% liked)

[–] StickBugged@lemm.ee 9 points 1 year ago (1 children)

If you ask how to build a bomb and it tells you, wouldn't Mozilla get in trouble?

[–] mojo@lemm.ee 0 points 1 year ago* (last edited 1 year ago) (2 children)

Do gun manufacturers get in trouble when someone shoots somebody?

Do car manufacturers get in trouble when someone runs somebody over?

Do search engines get in trouble if they accidentally link to harmful sites?

What about social media sites getting in trouble for users uploading illegal content?

Mozilla doesn't need to host an uncensored model, but their open source AI should be capable of being trained to be uncensored. So I'm not asking them to host this themselves, which is an important distinction I should have made.

Uncensored LLMs already exist, so whatever damage they could cause is already possible.

[–] Spzi@lemm.ee 1 points 1 year ago (1 children)

Do car manufacturers get in trouble when someone runs somebody over?

Yes, if it can be shown the accident was partly caused by the manufacturer's negligence: a safety measure was missing or didn't work properly, or accidents happen suspiciously more often with models from that brand. Apart from solid legal trouble, they can get into PR trouble if enough people start to think that way, whether or not it's true.

[–] mojo@lemm.ee 1 points 1 year ago (1 children)
[–] Spzi@lemm.ee 1 points 1 year ago (1 children)

Then let me spell it out: If ChatGPT convinces a child to wash their hands with self-made bleach, be sure to expect lawsuits and a shit storm coming for OpenAI.

If that occurs, but no liability can be found on the side of ChatGPT, be sure to expect petitions and a shit storm coming for legislators.

We generally expect individuals and companies to act with the peace and safety of others in mind, including strangers and minors.

Liabilities and regulations exist for these reasons.

[–] mojo@lemm.ee 1 points 1 year ago

Again... this is still missing the point.

Let me spell it out: I'm not asking companies to host these services themselves, so they wouldn't be the ones held liable.

For this example to be relevant, ChatGPT would need to be open source and let you plug in your own model. We should have the freedom to plug in our own trained models, even uncensored ones. This is already the case with LLaMA and other AI systems, and I'm encouraging Mozilla's AI to allow the same thing.
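
To make "plug in your own model" concrete, here is a minimal sketch using llama-cpp-python to run a locally stored, user-chosen model. The file path and prompt are placeholders I made up, and this is not Mozilla's actual API; it only illustrates the kind of freedom being described.

```python
# Minimal sketch: loading whatever local model file the user chooses,
# rather than a model the vendor hosts and filters.
from llama_cpp import Llama

# Path to a GGUF model you downloaded or fine-tuned yourself (placeholder name).
llm = Llama(model_path="./models/my-own-model.gguf", n_ctx=2048)

# Run a prompt against the model you picked.
output = llm("Q: What is the capital of France? A:", max_tokens=32, stop=["Q:"])
print(output["choices"][0]["text"])
```

The point is simply that the model file is an argument the user supplies, so the hosting company never touches or distributes it.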

[–] TheBat@lemmy.world 0 points 1 year ago

Why are lolbertarians on lemmy?