Simon Willison’s Weblog


584 posts tagged “llm”

LLM is my command-line tool for running prompts against Large Language Models.

2023

Release llm 0.8.1 — Access large language models from the command-line

Datasette 1.0a4 and 1.0a5, plus weeknotes

Two new alpha releases of Datasette, plus a keynote at WordCamp, a new LLM release, two new LLM plugins and a flurry of TILs.

[... 2,709 words]

Making Large Language Models work for you


I gave an invited keynote at WordCamp 2023 in National Harbor, Maryland on Friday.

[... 14,189 words]

Release llm-anyscale-endpoints 0.2 — LLM plugin for models hosted by Anyscale Endpoints
Release llm-anyscale-endpoints 0.1 — LLM plugin for models hosted by Anyscale Endpoints
Release llm-openrouter 0.1 — LLM plugin for models hosted by OpenRouter
Release llm 0.8 — Access large language models from the command-line
Release llm 0.7.1 — Access large language models from the command-line

Datasette Cloud, Datasette 1.0a3, llm-mlc and more


Datasette Cloud is now a significant step closer to general availability. The Datasette 1.0a3 alpha release is out, with a mostly finalized JSON format for 1.0. Plus new plugins for LLM and sqlite-utils and a flurry of things I’ve learned.

[... 1,690 words]

Running my own LLM (via) Nelson Minar describes running LLMs on his own computer using my LLM tool and llm-gpt4all plugin, plus some notes on trying out some of the other plugins.

# 16th August 2023, 10:42 pm / llm, nelson-minar, local-llms, llms

Release llm-mlc 0.5 — LLM plugin for running models using MLC
Release llm-mlc 0.4 — LLM plugin for running models using MLC
Release llm 0.7 — Access large language models from the command-line

llm-mlc (via) My latest plugin for LLM adds support for models that use the MLC Python library—which is the first library I’ve managed to get to run Llama 2 with GPU acceleration on my M2 Mac laptop.

# 12th August 2023, 5:33 am / llm, projects, generative-ai, mlc, ai, llms, plugins

Release llm-mlc 0.3 — LLM plugin for running models using MLC
Release llm-mlc 0.2 — LLM plugin for running models using MLC
Release llm-mlc 0.1a0 — LLM plugin for running models using MLC

Weeknotes: Plugins for LLM, sqlite-utils and Datasette


The principal theme for the past few weeks has been plugins.

[... 1,203 words]

Catching up on the weird world of LLMs


I gave a talk on Sunday at North Bay Python where I attempted to summarize the last few years of development in the space of LLMs—Large Language Models, the technology behind tools like ChatGPT, Google Bard and Llama 2.

[... 10,489 words]

Run Llama 2 on your own Mac using LLM and Homebrew

Llama 2 is the latest commercially usable openly licensed Large Language Model, released by Meta AI a few weeks ago. I just released a new plugin for my LLM utility that adds support for Llama 2 and many other llama-cpp compatible models.

[... 1,423 words]

Release llm-llama-cpp 0.1a0 — LLM plugin for running models using llama.cpp
Release llm-gpt4all 0.1.1 — Plugin for LLM adding support for the GPT4All collection of models

LLM can now be installed directly from Homebrew (via) I spent a bunch of time on this at the weekend: my LLM tool for interacting with large language models from the terminal has now been accepted into Homebrew core, and can be installed directly using “brew install llm”. I was previously running my own separate tap, but having it in core means that it benefits from Homebrew’s impressive set of build systems—each release of LLM now has Bottles created for it automatically across a range of platforms, so “brew install llm” should quickly download binary assets rather than spending several minutes installing dependencies the slow way.

# 24th July 2023, 5:16 pm / llm, generative-ai, projects, homebrew, ai, llms

Release llm 0.6.1 — Access large language models from the command-line
Release llm-replicate 0.3 — LLM plugin for models hosted on Replicate
Release llm 0.6 — Access large language models from the command-line

Accessing Llama 2 from the command-line with the llm-replicate plugin


The big news today is Llama 2, the new openly licensed Large Language Model from Meta AI. It’s a really big deal:

[... 1,206 words]

Release llm-replicate 0.2 — LLM plugin for models hosted on Replicate
Release llm-replicate 0.1 — LLM plugin for models hosted on Replicate

Weeknotes: Self-hosted language models with LLM plugins, a new Datasette tutorial, a dozen package releases, a dozen TILs

A lot of stuff to cover from the past two and a half weeks.

[... 1,742 words]