Simon Willison’s Weblog


Blogmarks tagged opensearch in Aug, 2023



airoboros LMoE. airoboros provides a system for fine-tuning Large Language Models. The latest release adds support for LMoE—LoRA Mixture of Experts. GPT-4 is strongly rumoured to work as a mixture of experts—several (maybe 8?) 220B models each with a different specialty working together to produce the best result. This is the first open source (Apache 2) implementation of that pattern that I’ve seen. # 24th August 2023, 10:31 pm
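The mixture-of-experts pattern described above — several specialist models with a router choosing which one handles each request — can be sketched as a toy in Python. This is purely illustrative: the keyword sets, expert names, and `route` function are invented for the example and are not the airoboros implementation, which routes between actual LoRA adapters.

```python
# Toy sketch of mixture-of-experts routing: a router picks the
# best-matching expert for each prompt instead of running one
# monolithic model. All names here are hypothetical.

EXPERT_KEYWORDS = {
    "code": {"python", "function", "bug", "compile"},
    "math": {"integral", "prove", "equation", "sum"},
    "creative": {"story", "poem", "character"},
}

def route(prompt: str) -> str:
    """Return the expert whose keyword set best overlaps the prompt."""
    words = set(prompt.lower().split())
    scores = {name: len(words & kws) for name, kws in EXPERT_KEYWORDS.items()}
    best = max(scores, key=scores.get)
    # Fall back to a general-purpose expert when nothing matches.
    return best if scores[best] > 0 else "general"

print(route("write a python function to sum a list"))  # → code
```

In the real LMoE setup each "expert" is a LoRA adapter fine-tuned for a specialty and swapped onto a shared base model, which is far cheaper than hosting several full models.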
