[-] passepartout@feddit.org 13 points 1 week ago

If you have a supported GPU you could try Ollama (with Open WebUI); works like a charm.
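For anyone wanting to try this, a minimal sketch of the setup being recommended, assuming Ollama is already installed locally and Docker is available (the model name and port mapping here are illustrative; the Docker invocation follows Open WebUI's documented quick-start):

```shell
# Pull a model and make sure the local Ollama server can run it
ollama pull llama3

# Run Open WebUI in Docker, pointed at the host's Ollama instance,
# then browse to http://localhost:3000
docker run -d -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data \
  --name open-webui \
  ghcr.io/open-webui/open-webui:main
```

With a ROCm- or CUDA-supported GPU, Ollama picks up the accelerator automatically; otherwise it falls back to CPU.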

[-] bi_tux@lemmy.world 6 points 1 week ago

You don't even need a supported GPU; I run Ollama on my RX 6700 XT.

[-] passepartout@feddit.org 2 points 6 days ago

I have the same GPU, my friend. I was trying to say that you won't be able to run ROCm on some Radeon HD xy from 2008 :D

[-] BaroqueInMind@lemmy.one 3 points 1 week ago

You don't even need a GPU. I can run Ollama with Open WebUI on my CPU with an 8B model, fast af.

[-] bi_tux@lemmy.world 2 points 1 week ago

I tried it with my CPU (with Llama 3 7B), but unfortunately it ran really slow (I have a Ryzen 5700X).

[-] tomjuggler@lemmy.world 2 points 6 days ago

I ran it on my dual-core Celeron and... just kidding. Try the mini Llama 1B. I'm in the same boat with a Ryzen 5000-something CPU.
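The suggestion above can be tried in one line; a sketch assuming Ollama is installed (`llama3.2:1b` is Ollama's tag for a ~1B-parameter Llama model, which is far more usable on CPU-only machines than a 7B/8B model):

```shell
# Pull the small model (roughly 1.3 GB on disk)
ollama pull llama3.2:1b

# One-shot prompt from the command line; on a mid-range desktop CPU
# a 1B model typically responds at an interactive pace
ollama run llama3.2:1b "Explain what quantization does to a model."
```

Quantized variants (the default Ollama downloads are already 4-bit quantized) are what make CPU inference tolerable at all at these sizes.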

this post was submitted on 01 Oct 2024
361 points (91.1% liked)

Programmer Humor
