If you spend hours chatting with a bot that can only remember a tight window of information about what you’re chatting about, eventually you end up in a hall of mirrors: it reflects you back to you. If you start getting testy, it gets testy. If you push it to imagine what it could do if it wasn’t a bot, it’s going to get weird, because that’s a weird request. Talk to Bing’s AI long enough and, ultimately, you are talking to yourself, because that’s all it can remember.