14 points (76.9% liked) · ... (programming.dev)
submitted 05 Nov 2023 (edited) by CoderSupreme@programming.dev to c/localllama@sh.itjust.works

Article: https://x.ai

From the announcement: "After announcing xAI, we trained a prototype LLM (Grok-0) with 33 billion parameters. This early model approaches LLaMA 2 (70B) capabilities on standard LM benchmarks but uses only half of its training resources. In the last two months, we have made significant improvements in reasoning and coding capabilities, leading up to Grok-1, a state-of-the-art language model that is significantly more powerful, achieving 63.2% on the HumanEval coding task and 73% on MMLU."
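For context on the HumanEval number: scores on that benchmark are usually reported as pass@k, the probability that at least one of k sampled completions passes a problem's unit tests. Below is a minimal Python sketch of the unbiased pass@k estimator from the original HumanEval paper (Chen et al., 2021); the per-problem sample counts are made up for illustration and are not taken from the Grok announcement.

```python
import math

def pass_at_k(n: int, c: int, k: int) -> float:
    """Unbiased pass@k estimator (Chen et al., 2021):
    1 - C(n - c, k) / C(n, k), where n is the number of sampled
    completions for a problem and c is how many of them pass."""
    if n - c < k:
        return 1.0  # every size-k subset must contain a passing sample
    # C(n - c, k) / C(n, k) expanded as a product to avoid large factorials
    return 1.0 - math.prod((n - c - i) / (n - i) for i in range(k))

# Hypothetical results for four problems: (samples drawn, samples passing).
results = [(10, 7), (10, 0), (10, 10), (10, 3)]
score = sum(pass_at_k(n, c, k=1) for n, c in results) / len(results)
print(f"pass@1 = {score:.1%}")  # mean over problems; here 50.0%
```

With k = 1 this reduces to the fraction of passing samples per problem, averaged over the benchmark, which is presumably how a single headline figure like the quoted 63.2% is computed.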

