this post was submitted on 07 Jul 2023
7 points (88.9% liked)
LocalLLaMA
Community to discuss about LLaMA, the large language model created by Meta AI.
This is intended to be a replacement for r/LocalLLaMA on Reddit.
So, looking at your "server", it seems more like a workstation. I have no experience with K80s, but from what I know, all server GPUs are designed passive, to be cooled by some loud fingerremover5000 chassis fans pushing air through them. So I think if you upgrade the fans it should be fine, since the card only draws 300 W. If cooling really is a problem, then a shroud ducting air into the card might help, but I don't think a single K80 is difficult to cool: in the data center they probably ran 4-6 of them in one 4U chassis.
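If you end up scripting the case fans yourself, a simple temperature-to-duty-cycle fan curve is usually enough. This is only a sketch: reading the temperature from `nvidia-smi` and the specific thresholds (40 °C / 80 °C) are my assumptions for a 300 W K80, not anything from this thread; tune them for your chassis.

```python
import subprocess

def gpu_temp_c() -> int:
    """Read the GPU core temperature via nvidia-smi.

    Assumes the NVIDIA driver and nvidia-smi are installed.
    """
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=temperature.gpu",
         "--format=csv,noheader,nounits"]
    )
    return int(out.split()[0])

def fan_duty(temp_c: int) -> int:
    """Map GPU temperature to a fan PWM duty cycle (0-100 %).

    Thresholds are guesses for a passively cooled 300 W card.
    """
    if temp_c < 40:
        return 30                         # idle: keep the fans quiet
    if temp_c < 80:
        # ramp linearly from 30 % at 40 °C up to 100 % at 80 °C
        return 30 + (temp_c - 40) * 70 // 40
    return 100                            # throttle territory: full blast

# example: 60 °C lands mid-ramp
# fan_duty(60) -> 65
```

How you apply the duty cycle depends on your board: on Linux it would typically go to a hwmon `pwm` sysfs file via `fancontrol`, but that wiring is chassis-specific and not shown here.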