Simon Willison’s Weblog


562 posts tagged “llm”

LLM is my command-line tool for running prompts against Large Language Models.

2024

Release llm-cmd 0.1a0 — Use LLM to generate and execute commands in your shell

Semgrep: AutoFixes using LLMs (via) semgrep is a really neat tool for semantic grep against source code—you can give it a pattern like “log.$A(...)” to match all forms of log.warning(...) / log.error(...) etc.

Ilia Choly built semgrepx—xargs for semgrep—and here shows how it can be used along with my llm CLI tool to execute code replacements against matches by passing them through an LLM such as Claude 3 Opus.
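semgrep's real engine is far more capable, but the core idea—matching code structure rather than raw text—can be sketched with Python's standard library `ast` module. This is a toy illustration of what a pattern like “log.$A(...)” matches, not how semgrep works internally:

```python
import ast

SOURCE = """
log.warning("disk almost full")
log.error("disk full")
print("not a log call")
"""

# Find every call of the form log.$A(...): any method called on a name `log`,
# regardless of which method ($A) it is.
matches = []
for node in ast.walk(ast.parse(SOURCE)):
    if (isinstance(node, ast.Call)
            and isinstance(node.func, ast.Attribute)
            and isinstance(node.func.value, ast.Name)
            and node.func.value.id == "log"):
        matches.append(node.func.attr)

print(matches)  # → ['warning', 'error']
```

Matching on the syntax tree is what lets a single pattern catch log.warning, log.error and any other log method without enumerating them.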

# 26th March 2024, 12:51 am / llm, claude, generative-ai, ai, llms, cli

llm-claude-3 0.3. Anthropic released Claude 3 Haiku today, their least expensive model: $0.25/million tokens of input, $1.25/million of output (GPT-3.5 Turbo is $0.50/$1.50). Unlike GPT-3.5, Haiku also supports image inputs.
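To put those prices in perspective, here's the arithmetic for a hypothetical request with 10,000 input tokens and 1,000 output tokens (illustrative token counts, not figures from the post):

```python
def cost_usd(input_tokens, output_tokens, input_per_million, output_per_million):
    """Cost of one API call given per-million-token prices in USD."""
    return (input_tokens * input_per_million
            + output_tokens * output_per_million) / 1_000_000

# Claude 3 Haiku: $0.25 input / $1.25 output per million tokens
haiku = cost_usd(10_000, 1_000, 0.25, 1.25)
# GPT-3.5 Turbo: $0.50 input / $1.50 output per million tokens
gpt35 = cost_usd(10_000, 1_000, 0.50, 1.50)

print(f"Haiku: ${haiku:.5f}, GPT-3.5 Turbo: ${gpt35:.5f}")
# → Haiku: $0.00375, GPT-3.5 Turbo: $0.00650
```

At these rates the same request costs a bit under 58% as much on Haiku as on GPT-3.5 Turbo.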

I just released a minor update to my llm-claude-3 LLM plugin adding support for the new model.

# 13th March 2024, 9:18 pm / llm, anthropic, claude, generative-ai, projects, ai, llms, llm-release

Release llm-claude-3 0.3 — LLM plugin for interacting with the Claude 3 family of models
Release llm-claude-3 0.2 — LLM plugin for interacting with the Claude 3 family of models

llm-claude-3. I built a new plugin for LLM—my command-line tool and Python library for interacting with Large Language Models—which adds support for the new Claude 3 models from Anthropic.

# 4th March 2024, 6:46 pm / llm, anthropic, claude, ai, llms, python, generative-ai, projects

Release llm-claude-3 0.1 — LLM plugin for interacting with the Claude 3 family of models
Release llm-mistral 0.3 — LLM plugin providing access to Mistral models using the Mistral API

llmc.sh (via) Adam Montgomery wrote this neat wrapper around my LLM CLI utility: it adds a “llmc” zsh function which you can ask for shell commands (llmc ’use ripgrep to find files matching otter’). It outputs the command and an explanation of it, then copies the command to your clipboard for you to paste and execute if it looks like the right thing.

# 16th February 2024, 6:19 pm / llm, llms, ai, generative-ai, zsh, cli

Weeknotes: a Datasette release, an LLM release and a bunch of new plugins

I wrote extensive annotated release notes for Datasette 1.0a8 and LLM 0.13 already. Here’s what else I’ve been up to these past three weeks.

[... 1,074 words]

llm-sentence-transformers 0.2. I added a new --trust-remote-code option when registering an embedding model, which means LLM can now run embeddings through the new Nomic AI nomic-embed-text-v1 model.

# 4th February 2024, 7:39 pm / llm, embeddings, plugins, projects, ai, transformers, nomic

Release llm-sentence-transformers 0.2 — LLM plugin for embeddings using sentence-transformers

llm-embed-onnx. I wrote a new plugin for LLM that acts as a thin wrapper around onnx_embedding_models by Benjamin Anderson, providing access to seven embedding models that can run on the ONNX model framework.

The actual plugin is around 50 lines of code, which makes it a nice example of how thin a wrapper that adds new models to my LLM tool can be.

# 28th January 2024, 10:28 pm / embedding, projects, ai, llm, plugins

Release llm-embed-onnx 0.1 — Run embedding models using ONNX
Release llm 0.13.1 — Access large language models from the command-line

LLM 0.13: The annotated release notes

I just released LLM 0.13, the latest version of my LLM command-line tool for working with Large Language Models—both via APIs and running models locally using plugins.

[... 1,278 words]

Release llm 0.13 — Access large language models from the command-line
Release llm-gpt4all 0.3 — Plugin for LLM adding support for the GPT4All collection of models

Talking about Open Source LLMs on Oxide and Friends


I recorded an episode of the Oxide and Friends podcast on Monday, talking with Bryan Cantrill and Adam Leventhal about Open Source LLMs.

[... 1,995 words]

2023

Many options for running Mistral models in your terminal using LLM


Mistral AI is the most exciting AI research lab at the moment. They’ve now released two extremely powerful smaller Large Language Models under an Apache 2 license, and have a third much larger one that’s available via their API.

[... 2,063 words]

Release llm-mistral 0.2 — LLM plugin providing access to Mistral models using the Mistral API
Release llm-mistral 0.1 — LLM plugin providing access to Mistral models using the Mistral API
Release llm-anyscale-endpoints 0.4 — LLM plugin for models hosted by Anyscale Endpoints
Release llm-gemini 0.1a0 — LLM plugin to access Google's Gemini family of models
Release llm-llama-cpp 0.3 — LLM plugin for running models using llama.cpp

ospeak: a CLI tool for speaking text in the terminal via OpenAI

I attended OpenAI DevDay today, the first OpenAI developer conference. It was a lot. They released a bewildering array of new API tools, which I’m only just beginning to wade my way through.

[... 1,109 words]

Release llm 0.12 — Access large language models from the command-line
Release llm 0.11.2 — Access large language models from the command-line
Release llm-anyscale-endpoints 0.3 — LLM plugin for models hosted by Anyscale Endpoints
Release llm 0.11.1 — Access large language models from the command-line