Weeknotes: Datasette alphas for testing new plugin hooks
19th June 2020
A relatively quiet week this week, compared to last week’s massive push to ship Datasette 0.44 with authentication, permissions and writable canned queries. I can now ship alpha releases, such as today’s Datasette 0.45a1, which means I can preview new plugin features before they are completely ready and stable.
Datasette alphas and betas
I want to be able to iterate on plugin hooks more effectively, which means embracing release early, release often. I also want plugin authors to be able to trust Datasette not to break their work—a major focus for Datasette 1.0.
Releasing alpha preview versions can really help here. I have two plugin hooks in the pipeline for Datasette 0.45 already: startup and canned_queries. These are still under development but are now available to preview in Datasette 0.45a1. Install it like so:
pip install datasette==0.45a1
Please post any feedback on the design of these hooks to the startup hook or canned_queries hook issue threads.
Another compelling reason to ship alphas: it means I can release alpha versions of my own plugins that themselves depend on a Datasette alpha, and still have their unit tests pass in continuous integration. I expect to take advantage of that ability a lot in the future.
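To illustrate how a plugin alpha can depend on a Datasette alpha, here's a rough sketch of a setup.py for a hypothetical plugin (the name and version are made up). Because the requirement specifier names a pre-release explicitly, pip will install the Datasette alpha without needing the --pre flag, while regular installs of stable releases are unaffected:

from setuptools import setup

setup(
    name="datasette-my-plugin",  # hypothetical plugin name for illustration
    version="0.1a0",             # the plugin's own alpha release
    install_requires=[
        # Explicitly naming a pre-release lets pip resolve the Datasette alpha
        "datasette>=0.45a1",
    ],
)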
Figuring out how to safely ship an alpha took a little bit of work, because I wanted to make sure that alphas wouldn’t become the default version installed from PyPI, mess up my existing CI configuration or end up accidentally published to Docker Hub. You can follow my research on this in this issue, including my experiments shipping an alpha for datasette-render-images first.
Those new plugin hooks
startup() (documentation) is a hook that gets called on Datasette server startup, and passed the datasette object. The initial use-case was plugins that might want to validate their own configuration, but I imagine other interesting uses for it will emerge over time.
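Here's a minimal sketch of what that configuration-validation use-case might look like (the plugin name and setting are invented for illustration; plugin_config() reads the plugin's section of metadata.json):

from datasette import hookimpl

@hookimpl
def startup(datasette):
    # Hypothetical example: fail fast if this plugin is missing a setting
    config = datasette.plugin_config("datasette-my-plugin") or {}
    if "api_key" not in config:
        raise Exception("datasette-my-plugin requires the api_key setting")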
canned_queries() (documentation) lets plugin authors dynamically generate new canned queries for a given database. It gets passed the datasette object, the current database name and the currently authenticated actor, if there is one. This means you can do fun things like include a user’s own saved queries loaded from another database table:
from datasette import hookimpl


@hookimpl
def canned_queries(datasette, database, actor):
    async def inner():
        db = datasette.get_database(database)
        if actor is not None and await db.table_exists("saved_queries"):
            results = await db.execute(
                "select name, sql from saved_queries where actor_id = :id",
                {"id": actor["id"]},
            )
            return {
                result["name"]: {"sql": result["sql"]}
                for result in results
            }

    return inner
I’m using a pattern here that’s shared by a number of other Datasette plugin hooks: rather than returning the results directly, this plugin function returns an async def inner() function. The code that calls the hook knows that if an asyncio awaitable function is returned it should await it. This is my trick for using awaitable functions with Pluggy, which wasn’t initially built with async in mind.
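The general shape of that calling convention looks something like this simplified sketch (illustrative only, not Datasette’s exact internals):

import inspect

async def resolve_hook_result(value):
    # If the hook implementation returned a function, call it; if the
    # result is awaitable, await it. Plain values pass straight through.
    if callable(value):
        value = value()
    if inspect.isawaitable(value):
        value = await value
    return value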
Shooting naturalist videos with Natalie
Natalie has started a new YouTube channel to put her various science communication courses at Stanford into action. I’ve been helping out as camera-person, which has been really interesting. I’m currently shooting with FiLMiC running on an iPhone 11 Pro on a tripod, using audio from an AirPod (until we can get our hands on something better).
Natalie’s been editing the videos on her iPhone and these early results are really good! Here’s the video we shot for Sea Lion Birthday on 15th June, a day when 50% of all California Sea Lions celebrate their birthday. Watch the video to find out why.
The close-up footage of the sea lions was shot by Natalie on a Canon DSLR with a 100-400mm lens. I love that lens so much for wildlife photography.
TIL this week
Just one new TIL this week but it’s a good one: Using LD_PRELOAD to run any version of SQLite with Python. I’ve been wanting to figure out a good way to replace the SQLite version used by the Python standard library for ages—pysqlite3 helps a lot here, but I also need the ability to run arbitrary older versions to help respond to bug reports. The LD_PRELOAD trick works perfectly for that.
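A quick way to confirm the trick took effect is to ask the sqlite3 module which library it actually loaded, launching Python with LD_PRELOAD pointing at your compiled library (the path here is just an example):

# Run as: LD_PRELOAD=/path/to/libsqlite3.so python3 check_sqlite.py
import sqlite3

print(sqlite3.sqlite_version)  # version of the SQLite C library that was loaded
print(sqlite3.version)         # version of the sqlite3 Python module itself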