Guess we could start calling this a 'hallucitation'? Kate Crawford coins an excellent neologism for hallucinated citations in LLMs like ChatGPT.