10th April 2024 - Link Blog
Mistral tweet a magnet link for mixtral-8x22b. Another open model release from Mistral using their now standard operating procedure of tweeting out a raw torrent link.
This one is an 8x22B Mixture of Experts model. Their previous most powerful openly licensed release was Mixtral 8x7B, so this one is a whole lot bigger (a 281GB download), and apparently has a 65,536 token context length, at least according to initial rumors on Twitter.
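For a rough idea of what running a model like this looks like once the weights land somewhere more convenient than a torrent, here's a minimal `transformers` sketch. The `mistralai/Mixtral-8x22B-v0.1` Hugging Face model id is an assumption on my part, not something confirmed by the magnet link itself, and at 281GB the weights realistically need several large GPUs or aggressive quantization.

```python
# Minimal sketch, assuming the weights are published on Hugging Face
# under an id like "mistralai/Mixtral-8x22B-v0.1" (assumed, not from
# the announcement). device_map="auto" shards the model across
# whatever accelerators are available.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mixtral-8x22B-v0.1"  # assumed model id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",   # spread layers across available GPUs
    torch_dtype="auto",  # use the checkpoint's native dtype
)

prompt = "Mixture of Experts models work by"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

If the rumored 65,536 token context length holds up, that would be a big jump over Mixtral 8x7B's 32k window.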