[-] Scoopta@programming.dev 3 points 1 week ago* (last edited 1 week ago)

Ollama is also a cool way of running multiple models locally

[-] Retro_unlimited@lemmy.world 1 points 1 week ago

That might be the other one I run; I forget because it's on my server in a virtual machine (RTX 3080 passthrough), but I haven't used it in a long time.

this post was submitted on 01 Oct 2024
361 points (91.1% liked)

Programmer Humor

19315 readers

Welcome to Programmer Humor!

This is a place where you can post jokes, memes, humor, etc. related to programming!

For sharing awful code there's also Programming Horror.


founded 1 year ago