Simon Willison’s Weblog

16th October 2025

Tool: OpenAI Prompt Caching Playground — Explore OpenAI's prompt caching feature by testing different prompt structures and observing cache hit rates across multiple requests. This interactive playground lets you compose system instructions, document context, and user questions, then send them to the Chat Completions or Responses API to track how cached tokens reduce costs and improve performance.
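
If you want to watch the same cache behavior outside the playground, here's a minimal sketch using the openai Python SDK. The model name and prompt sizes here are assumptions, not taken from the playground itself; OpenAI's prompt caching only kicks in once the shared prefix reaches roughly 1,024 tokens, and the cached count is reported in the response's usage details.

```python
# Minimal sketch: observe cached prompt tokens via the OpenAI Python SDK.
# Assumes OPENAI_API_KEY is set and that "gpt-4o-mini" is available to your
# account. Prompt caching applies once the shared prefix is long enough
# (roughly 1,024 tokens), so the system prompt below is deliberately padded.
from openai import OpenAI

client = OpenAI()

# A long, stable system prompt acts as the cacheable prefix.
system_prompt = "You are a documentation assistant. " + ("Reference material. " * 400)

def ask(question: str) -> str:
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": question},
        ],
    )
    usage = response.usage
    # prompt_tokens_details may be absent on older models or SDK versions.
    details = usage.prompt_tokens_details
    cached = details.cached_tokens if details else 0
    print(f"prompt tokens: {usage.prompt_tokens}, cached: {cached}")
    return response.choices[0].message.content

# The first call warms the cache; the second should report cached tokens.
ask("Summarize the reference material in one sentence.")
ask("List three key points from the reference material.")
```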

This is a tool by Simon Willison, posted on 16th October 2025.
