If you spend hours chatting with a bot that can only remember a tight window of information about what you're chatting about, eventually you end up in a hall of mirrors: it reflects you back to you. If you start getting testy, it gets testy. If you push it to imagine what it could do if it weren't a bot, it's going to get weird, because that's a weird request. Talk to Bing's AI long enough and, ultimately, you are talking to yourself, because that's all it can remember.