... (programming.dev)
submitted 10 months ago* (last edited 7 months ago) by CoderSupreme@programming.dev to c/localllama@sh.itjust.works

article: https://x.ai

We trained a prototype LLM (Grok-0) with 33 billion parameters. This early model approaches LLaMA 2 (70B) capabilities on standard LM benchmarks but uses only half of its training resources. In the last two months, we have made significant improvements in reasoning and coding capabilities, leading up to Grok-1, a state-of-the-art language model that is significantly more powerful, achieving 63.2% on the HumanEval coding task and 73% on MMLU.

all 3 comments
[-] mateomaui@reddthat.com 3 points 10 months ago

Grok is an AI modeled after the Hitchhiker’s Guide to the Galaxy, so intended to answer almost anything and, far harder, even suggest what questions to ask!

hilarious, considering the widespread opinion that he completely missed the point of the book.

[-] noneabove1182@sh.itjust.works 2 points 10 months ago* (last edited 10 months ago)

While the drama around X and Musk cannot be overstated, it's still great to see more players in the open model world (assuming this gets properly opened).

One thing that'll hold it back (for people like us at least) is developer support, so I'm quite curious to see how this plays out with things like GPTQ and llama.cpp.

this post was submitted on 05 Nov 2023
14 points (76.9% liked)

LocalLLaMA


Community to discuss LLaMA, the large language model created by Meta AI.

This is intended to be a replacement for r/LocalLLaMA on Reddit.

founded 1 year ago