Elon Musk Unveils Grok-1: A 314B MoE Transformer

Elon Musk has delivered on his promise, unveiling Grok-1, a large Mixture-of-Experts (MoE) transformer. The base model, released under the Apache 2.0 license, has 314 billion parameters, with roughly 25% of its weights active for any given token. According to the initial announcement, Grok-1 achieves strong benchmark results, scoring 73% on MMLU, 62.9% on GSM8K, and 63.2% on HumanEval, marking a significant advance in the field of natural language processing.
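The ~25% activation figure follows from the MoE design: a router selects a small subset of experts (the release describes 8 experts with 2 active per token), so only that fraction of the expert weights is used for each token. The sketch below is a generic, illustrative top-2 routing layer in plain NumPy, assuming a simple softmax router and dense experts; the function and parameter names are hypothetical and do not reflect xAI's actual implementation.

```python
import numpy as np

def top2_moe_layer(x, gate_w, expert_ws):
    """Illustrative top-2 MoE feed-forward layer (not xAI's code).

    x         : (d_model,) token representation
    gate_w    : (d_model, n_experts) router weights
    expert_ws : list of (d_model, d_model) expert weight matrices
    """
    logits = x @ gate_w                    # router score for each expert
    top2 = np.argsort(logits)[-2:]         # keep only the 2 best-scoring experts
    probs = np.exp(logits[top2])
    probs /= probs.sum()                   # softmax over the selected experts
    # Only the chosen experts' parameters are touched for this token
    return sum(p * (x @ expert_ws[i]) for p, i in zip(probs, top2))

# Toy usage: 8 experts with 2 active per token -> 2/8 = 25% of expert weights used
rng = np.random.default_rng(0)
d, n_experts = 16, 8
x = rng.normal(size=d)
gate_w = rng.normal(size=(d, n_experts))
expert_ws = [rng.normal(size=(d, d)) for _ in range(n_experts)]
y = top2_moe_layer(x, gate_w, expert_ws)
print(y.shape)  # (16,)
```

The design choice this illustrates is the usual MoE trade-off: total parameter count (here, all 8 experts) can grow far beyond what is computed per token, since each token only pays for the experts its router selects.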