Pirating AI models (lemmy.dbzer0.com)
submitted 16 Jul 2023 (last edited) by zaknenou@lemmy.dbzer0.com to c/piracy@lemmy.dbzer0.com
 

So it's convenient to use ChatGPT on OpenAI's site and to use other AI models on their official sites, but it doesn't feel very pirate-like doing this, am I wrong? Like, OpenAI staff spying on my conversations with the waifu persona I gave ChatGPT, or Midjourney's creators knowing about every picture of John Oliver I've made with their Discord bot. Also, you're only allowed to use GPT-3.5 for free and have to pay for GPT-4 ($20 a month for limited use, wtf). So are there any islands where a pirate can do what he does comfortably?

There are Telegram bots, and Quora offers multiple AI models at https://poe.com/, but I'm curious whether some compilation of useful pirated AI tools exists.

EDIT: Thank you everyone from inside and outside this instance.

PrimaCora@lemmy.fmhy.ml 20 points 1 year ago

You can't pirate their models, and even if they leaked, running them would need an expensive machine.

There are lots of open source models. They can get close but are limited by your hardware.

If you want something close to GPT, there's the Falcon 40B model. You'll need more than 24 GB of VRAM, or deep CPU offload with 128 GB of RAM, I think, maybe 64.
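For a concrete picture, here's a minimal sketch of what running Falcon 40B locally might look like with the Hugging Face transformers library, using 4-bit quantization plus automatic CPU offload to fit the kind of VRAM/RAM budget mentioned above. The model ID is the public tiiuae/falcon-40b checkpoint; the prompt and generation settings are just illustrative:

```python
# Minimal sketch: Falcon 40B with 4-bit quantization and CPU offload.
# Assumes transformers, accelerate, and bitsandbytes are installed;
# actual memory use depends on your GPU and system RAM.
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "tiiuae/falcon-40b"  # open checkpoint on the Hugging Face Hub

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=BitsAndBytesConfig(load_in_4bit=True),  # ~4 bits/weight
    device_map="auto",       # lets accelerate spill layers to CPU RAM as needed
    trust_remote_code=True,  # needed on older transformers releases for Falcon
)

prompt = "Ahoy! Explain what a language model is in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=60)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```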

With 24 GB of VRAM you can run a 30B model, and so on...

For reference, the GPT models are more like 175B parameters (GPT-3 size), so A100 NVLink territory.
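Those figures follow from simple back-of-envelope arithmetic (my own sketch, not from the comment): weight memory is roughly parameters × bits-per-weight ÷ 8, before activation and KV-cache overhead.

```python
# Back-of-envelope weight-memory estimate: params * bits / 8.
# Ignores activations and the KV cache, so add ~10-20% in practice.
def model_gb(params_billion: float, bits_per_weight: int) -> float:
    # 1e9 params * (bits/8) bytes = (bits/8) GB per billion params
    return params_billion * bits_per_weight / 8

for params in (30, 40, 175):
    for bits in (16, 4):
        print(f"{params}B @ {bits}-bit: ~{model_gb(params, bits):.0f} GB")

# 30B  @ 4-bit:  ~15 GB  -> fits in 24 GB of VRAM with room for the KV cache
# 40B  @ 4-bit:  ~20 GB  -> tight on 24 GB, hence the CPU-offload suggestion
# 175B @ 16-bit: ~350 GB -> multiple NVLinked A100s
```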