OpenLLaMA. The first openly licensed model I’ve seen trained on the RedPajama dataset. This initial release is a 7B model trained on 200 billion tokens, but the team behind it are promising a full 1 trillion token model in the near future. I haven’t found a live demo of this one running anywhere yet.
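If you want to poke at it locally, here's a minimal sketch using Hugging Face transformers. The checkpoint name is my assumption based on the openlm-research organization on the Hub; check their page for the exact preview weights, and note that the 7B model in float16 needs roughly 14GB of GPU memory:

```python
# Minimal sketch: load an OpenLLaMA checkpoint and generate text.
# "openlm-research/open_llama_7b" is an assumed repo id - verify the
# exact name of the 200B-token preview on the Hugging Face Hub.
import torch
from transformers import LlamaForCausalLM, LlamaTokenizer

model_id = "openlm-research/open_llama_7b"  # assumed checkpoint name

# The slow (sentencepiece-based) tokenizer; OpenLLaMA checkpoints ship
# LLaMA-style tokenizer files.
tokenizer = LlamaTokenizer.from_pretrained(model_id)

# device_map="auto" (requires the accelerate package) places weights on
# whatever GPU/CPU is available; float16 halves the memory footprint.
model = LlamaForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto"
)

prompt = "The RedPajama dataset is"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```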