
For example, does a 13B parameter model at Q2_K quantization perform worse than a 7B parameter model at 8-bit or 16-bit?
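As a rough frame for the question, here is a back-of-the-envelope memory comparison of those configurations. This is a sketch, not an exact format spec: the ~2.6 bits-per-weight figure for Q2_K is an approximation of llama.cpp's k-quant overhead, and real file sizes also include embeddings and metadata.

```python
def model_size_gb(n_params_billion: float, bits_per_weight: float) -> float:
    """Approximate model size in GB (1 GB = 1e9 bytes), ignoring metadata."""
    return n_params_billion * 1e9 * bits_per_weight / 8 / 1e9

# Approximate bits-per-weight; Q2_K's effective rate is an assumption here.
configs = [
    ("13B @ Q2_K (~2.6 bpw)", 13, 2.6),
    ("7B @ 8-bit", 7, 8.0),
    ("7B @ 16-bit", 7, 16.0),
]

for name, params, bpw in configs:
    print(f"{name}: ~{model_size_gb(params, bpw):.1f} GB")
```

Memory-wise the heavily quantized 13B model is the smallest of the three, so the interesting question is whether its quality degradation outweighs the extra parameters.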

[-] rufus@discuss.tchncs.de 5 points 1 year ago

Well, a few of those extra numbers are my fault. I edited my answer a few times, and Lemmy reportedly counts every edit as an additional comment (when the user and community are on different instances). I hope they fix that soon.

[-] noneabove1182@sh.itjust.works 2 points 1 year ago

ahh makes sense. i just made a post and deleted the comment i made on it, but it glitched and deleted twice, so now my post has -1 comments lmao

this post was submitted on 26 Jul 2023
19 points (100.0% liked)

LocalLLaMA


Community for discussing LLaMA, the large language model created by Meta AI.

This is intended to be a replacement for r/LocalLLaMA on Reddit.

founded 1 year ago