ChatGPT is such a disloyal, snarky piece of shit that a model 90% as good but 2000% more obedient is better in every way.
For Stable Diffusion image generation you need an NVIDIA GPU for reasonable speeds. For text, as long as you actually enable multithreading (in my case 8 cores), you can get really good performance in llama.cpp (and by extension GPT4All, since it runs on llama.cpp). My uncensored AI is fast enough to be used on demand like ChatGPT, and I use it pretty much every day.
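If anyone wants to try the multithreading bit, here's a minimal sketch using the llama-cpp-python bindings (the model path is a placeholder for whatever GGUF file you've downloaded; 8 threads matches my 8-core setup, adjust to your CPU):

```python
# Minimal sketch, assuming llama-cpp-python is installed
# (pip install llama-cpp-python) and "model.gguf" is a placeholder
# for your local model file.
from llama_cpp import Llama

llm = Llama(
    model_path="model.gguf",  # hypothetical path to your downloaded model
    n_threads=8,              # spread generation across all 8 cores
)

# Simple completion-style call; returns a dict with generated text
out = llm("Q: Name the planets in the solar system. A:", max_tokens=64)
print(out["choices"][0]["text"])
```

Leaving n_threads at the default (or 1) is the usual reason people think CPU inference is unusably slow.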