this post was submitted on 15 Jun 2023
3 points (100.0% liked)

Stable Diffusion

1487 readers

Welcome to the Stable Diffusion community, dedicated to the exploration and discussion of the open source deep learning model known as Stable Diffusion.

Introduced in 2022, Stable Diffusion uses a latent diffusion model to generate detailed images based on text descriptions and can also be applied to other tasks such as inpainting, outpainting, and generating image-to-image translations guided by text prompts. The model was developed by the startup Stability AI, in collaboration with a number of academic researchers and non-profit organizations, marking a significant shift from previous proprietary models that were accessible only via cloud services.

founded 1 year ago

I really want to set up an instance at home. I know the MI25 can be hacked into doing this well, but I would love to know what other people are running and see if I can find a good starter kit.
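
For context, here's roughly what I mean by an instance — a minimal text-to-image sketch with the Hugging Face diffusers library (assuming a working PyTorch install with ROCm for AMD cards like the MI25, or CUDA for Nvidia; the model ID and prompt are just examples):

```python
# Minimal text-to-image sketch using Hugging Face diffusers.
# ROCm builds of PyTorch also expose the "cuda" device name, so this
# should work on AMD as well as Nvidia once the driver stack is set up.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # example model ID
    torch_dtype=torch.float16,         # half precision to cut VRAM use
)
pipe = pipe.to("cuda")
pipe.enable_attention_slicing()        # trades a little speed for lower peak VRAM

image = pipe("a photo of an astronaut riding a horse on mars").images[0]
image.save("output.png")
```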

THX ~inf

top 6 comments
[–] infinitevalence 1 point 1 year ago

Welp, at $75 I picked up an MI25 to play with. God help me!

[–] korewa@reddthat.com 1 point 1 year ago

I'm on a 3080 10 GB, but I recently experimented with LLMs and they eat up VRAM. Looking for a dual 3090 or 4090 setup to get 48 GB of VRAM (rough multi-GPU sketch below).

When is SDXL releasing for people to train?
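
On the VRAM point, spreading a model across two cards is fairly painless these days — a rough sketch with transformers + accelerate (the model ID is just a placeholder, and it assumes both GPUs are visible to PyTorch):

```python
# Rough sketch: shard a causal LLM across all visible GPUs in half precision.
# Requires the transformers and accelerate packages; the model ID is only an example.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "facebook/opt-6.7b"  # placeholder model

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # halves VRAM versus fp32
    device_map="auto",          # lets accelerate split layers across available GPUs
)

inputs = tokenizer("Stable Diffusion is", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=30)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```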

[–] fupuyifi@kbin.social 1 point 1 year ago (1 child)

NVIDIA is going to be faster and easier to get working with SD.
VRAM is going to be your friend, especially if you start working with Deforum and video.

[–] infinitevalence 2 points 1 year ago (1 children)

Hmm, not a big fan of running Nvidia on Linux... What's the minimum I should aim for, 12 GB?

[–] fupuyifi@kbin.social 1 point 1 year ago (1 child)

Why not?

I'm currently running an NVIDIA GPU on Debian with SD and I haven't had any issues.
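
If it helps, this is the quick sanity check I'd run to confirm the driver and PyTorch actually see the card (nothing distro-specific assumed):

```python
# Quick check that PyTorch can see the GPU and how much VRAM it reports.
import torch

if torch.cuda.is_available():
    props = torch.cuda.get_device_properties(0)
    print(f"GPU: {props.name}, {props.total_memory / 1024**3:.1f} GB VRAM")
else:
    print("No CUDA device visible - check the driver / CUDA install")
```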

[–] infinitevalence 2 points 1 year ago

It's more their attitude toward the FOSS community. They're kind of like Apple in that they don't really contribute back.