this post was submitted on 23 Jun 2023
Piracy: ꜱᴀɪʟ ᴛʜᴇ ʜɪɢʜ ꜱᴇᴀꜱ
you are viewing a single comment's thread
In addition to the online platforms linked by the other commenters, it's also pretty straightforward to run Stable Diffusion locally, if your hardware is beefy enough: https://github.com/AUTOMATIC1111/stable-diffusion-webui
Various fine-tuned checkpoints for different content and art styles can be downloaded from Civitai.
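For anyone new to it, setup is mostly just cloning the repo and running the launcher script, which installs its own dependencies on first run. A rough sketch (the checkpoint filename and download path here are just illustrative, not a specific model):

```shell
# Clone the AUTOMATIC1111 web UI
git clone https://github.com/AUTOMATIC1111/stable-diffusion-webui
cd stable-diffusion-webui

# Put any checkpoint downloaded from Civitai into the models folder
# (the .safetensors filename below is only an example)
cp ~/Downloads/some-model.safetensors models/Stable-diffusion/

# First launch creates a virtualenv and installs dependencies,
# then serves the UI locally at http://127.0.0.1:7860
./webui.sh
```

On Windows the equivalent launcher is webui-user.bat.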
Best way right here. Free, open source, and you won't get judged for your outputs.
Absolutely this. I've been messing around with it for about a week now.
Super fun and easy to set up. I used this since I wanted a Docker environment.
(Side note: does anyone know why I can't upload pictures directly from the web? I'm getting
SyntaxError: JSON.parse: unexpected character at line 1 column 1 of the JSON data
) Edit: it's because of the size...
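This isn't the specific Docker setup the commenter used (that link didn't survive), but as a minimal sketch, running the web UI in a container generally looks like the following; the image name, Dockerfile, and mount paths are all assumptions:

```shell
# Hypothetical: build an image from a local Dockerfile that wraps webui.sh
docker build -t sd-webui .

# Run with GPU access (needs the NVIDIA Container Toolkit on the host),
# publishing the UI port and mounting the models folder from the host
docker run --gpus all \
  -p 7860:7860 \
  -v "$PWD/models:/app/stable-diffusion-webui/models" \
  sd-webui
```

Keeping the models folder on the host means multi-gigabyte checkpoints survive container rebuilds.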
This is the way. The really top-tier AI art is almost guaranteed to use this; most online tools and other frontends just don't have the features. Also, here is a link to a fork of that with an improved UI (no other changes).
Beefy can mean different things to different people, too. I have a mobile 1660 Ti and it can generate images in decent time (about 40 seconds for a 20-iteration image from a prompt).
I'm slightly lacking in VRAM though; something with 8 GB of VRAM would let you use most models.
Yeah, that's fair enough on the wording.
I'm rocking a 3070 and an 11th-gen i7, but only 16 GB of RAM.
Still pretty quick, IMO.
Took longer for my browser to download the image than it took for you to generate it. :)
Fun fact: it can run on as little as 2 GB of VRAM! It works out of the box with the --lowvram parameter, and with some extra fiddling with extensions you can even generate high-resolution stuff.
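For reference, the usual place to set that flag in the AUTOMATIC1111 web UI is the webui-user.sh launcher script (webui-user.bat on Windows), which passes extra arguments through to the UI:

```shell
# webui-user.sh — flags passed to the web UI at launch.
# --lowvram trades generation speed for a much smaller VRAM footprint;
# --medvram is a middle ground for cards around 4-6 GB.
export COMMANDLINE_ARGS="--lowvram"
```

After editing the file, relaunch with ./webui.sh and the flag takes effect.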