Falcon 2 11B Unveiled!
Exciting news from the Technology Innovation Institute as they introduce the inaugural member of the Falcon 2 family: Falcon 2 11B! This model represents a significant step forward, built on a dense decoder-only architecture and trained on a staggering 5.5 trillion tokens, with a Vision Language Model variant on the horizon.

Here's a quick overview:
- 11B parameters with a context length of 8K tokens
- Multilingual support, including English, French, Spanish, German, and Portuguese
- Scores 58.37 on the Massive Multitask Language Understanding (MMLU) benchmark and 59.74 on ARC-C
- Notably outperforms Meta's competing Llama 3 8B on the TruthfulQA and GSM8K tasks
- Commercial use is permitted, and the model is conveniently accessible on Hugging Face

While the Vision Language Model is yet to be unveiled, the debut of Falcon 2 11B marks an exciting milestone in AI advancement. Stay tuned for updates!
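Since the model is available on Hugging Face, here is a minimal sketch of loading it with the transformers library. This assumes the repo id is "tiiuae/falcon-11B" and that you have hardware with enough memory (roughly 24 GB in bfloat16); check the model card for the exact id and requirements.

```python
# Hedged sketch: load Falcon 2 11B via transformers, assuming the
# Hugging Face repo id is "tiiuae/falcon-11B".
MODEL_ID = "tiiuae/falcon-11B"

def generate(prompt: str, max_new_tokens: int = 64) -> str:
    """Generate a completion for `prompt` with Falcon 2 11B."""
    # Imported lazily: the heavy dependencies are only needed when
    # you actually run generation.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.bfloat16,
        device_map="auto",  # spread layers across available devices
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)

# Example call (downloads ~22 GB of weights on first use):
# print(generate("The Falcon 2 family of models"))
```

The generation call is left commented out because the first run downloads the full weights; uncomment it once you have the model cached locally.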