ChatGPT In Trouble: OpenAI may go bankrupt by 2024, AI bot costs company $700,000 every day
(www.firstpost.com)
Makes sense. In that case, I guess your next best option is probably to buy or rent hardware to run the local models that are suitable for chat RP.
I have definitely been considering it. My current hardware gives me about an 80-second delay when I run an LLM locally.
Same, at least for anything but the tiny ones that will fit in my limited VRAM. Hoping a GPU that's actually good for LLMs will come out in the next few years that isn't $15k and made for servers.
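For what it's worth, you don't have to fit the whole model in VRAM to get a decent speedup. A minimal sketch below, assuming llama-cpp-python and a quantized GGUF model; the model path and layer count are placeholders you'd tune for your own card.

```python
# Sketch: partially offload a quantized model to a small GPU with llama-cpp-python.
# The model path is hypothetical; lower n_gpu_layers until it fits in your VRAM,
# and the rest of the layers run on CPU.
from llama_cpp import Llama

llm = Llama(
    model_path="models/mistral-7b-instruct.Q4_K_M.gguf",  # placeholder GGUF file
    n_gpu_layers=20,  # number of layers offloaded to the GPU
    n_ctx=4096,       # context window size
)

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Stay in character as a grumpy innkeeper."}],
    max_tokens=128,
)
print(out["choices"][0]["message"]["content"])
```

Even offloading half the layers of a 4-bit 7B model usually cuts response time well under that 80-second mark, though how far depends on the card.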