openelm/README-pretraining.md

Apple released something big three hours ago, and I’m still trying to get my head around exactly what it is.
The parent project is called CoreNet, described as “A library for training deep neural networks”. Part of the release is a new LLM called OpenELM, which comes with completely open source training code and a large number of published training checkpoints.
I’m linking here to the best documentation I’ve found of the OpenELM training data: it looks like the bulk of it comes from RefinedWeb, RedPajama, The Pile and Dolma.
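
If you want to poke at one of those published checkpoints directly, here’s a minimal sketch. It assumes the final models are mirrored on Hugging Face under names like apple/OpenELM-270M and that they reuse the Llama 2 tokenizer, so treat the identifiers as illustrative rather than something taken from the CoreNet docs:

```python
# A quick sketch for trying a published OpenELM checkpoint.
# Assumptions: the checkpoint lives on Hugging Face as "apple/OpenELM-270M"
# and text is tokenized with the (gated) Llama 2 tokenizer.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "apple/OpenELM-270M"            # assumed smallest released model
tokenizer_id = "meta-llama/Llama-2-7b-hf"  # assumed tokenizer; repo requires access approval

tokenizer = AutoTokenizer.from_pretrained(tokenizer_id)
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

# Generate a short continuation to confirm the weights load and run.
inputs = tokenizer("Apple's CoreNet library is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

As far as I can tell the trust_remote_code flag is needed because the OpenELM architecture ships as custom modelling code alongside the weights rather than being built into transformers itself.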