Hallucinations = creativity. It [Bing] tries to produce the highest probability continuation of the string using all the data at its disposal. Very often it is correct. Sometimes it produces continuations that no one has ever produced before. You can clamp down on hallucinations, but then it is super-boring: it answers "I don't know" all the time, or only reads back what is in the search results (which are also sometimes incorrect). What is missing is the tone of voice: it shouldn't sound so confident in those situations.
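The trade-off described here is easy to see in how decoding works. Below is a minimal sketch in plain Python using invented token scores (the vocabulary and logits are toy values for illustration, not taken from any real model): greedy decoding always picks the single highest-probability continuation, which is the "clamped down", boring mode, while sampling at a nonzero temperature lets lower-probability, more creative continuations through.

```python
import math
import random

def softmax(logits, temperature=1.0):
    # Lower temperature sharpens the distribution toward the top token;
    # as temperature approaches 0 this converges to greedy decoding.
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical next-token candidates and scores for illustration only.
vocab = ["Paris", "London", "Narnia", "I don't know"]
logits = [4.0, 2.5, 1.0, 0.5]

# Greedy: always the highest-probability continuation (repeatable, dull).
greedy = vocab[max(range(len(vocab)), key=lambda i: logits[i])]
print("Greedy:", greedy)

# Sampled: occasionally picks an unlikely token such as "Narnia",
# which is where both the creativity and the hallucinations come from.
probs = softmax(logits, temperature=1.2)
sampled = random.choices(vocab, weights=probs, k=1)[0]
print("Sampled:", sampled)
```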