AI Hallucination Cases (via) Damien Charlotin maintains this database of legal cases from around the world in which a court decision has confirmed that a lawyer presented hallucinated content from generative AI.

That's an important distinction: these aren't just cases where AI may have been used; they're cases where a lawyer was caught in the act and (usually) disciplined for it.

It's been two years since the first widely publicized incident of this, which I wrote about in Lawyer cites fake cases invented by ChatGPT, judge is not amused. At the time I naively assumed:

I have a suspicion that this particular story is going to spread far and wide, and in doing so will hopefully inoculate a lot of lawyers and other professionals against making similar mistakes.

Damien's database has 116 cases from 12 different countries: United States, Israel, United Kingdom, Canada, Australia, Brazil, Netherlands, Italy, Ireland, Spain, South Africa, Trinidad & Tobago.

20 of those cases happened just this month, May 2025!

I get the impression that researching legal precedent is one of the most time-consuming parts of the job. I guess it's not surprising that increasing numbers of lawyers are turning to LLMs for this, even in the face of this mountain of cautionary stories.
