When we started working on what became NotebookLM in the summer of 2022, we could fit about 1,500 words in the context window. Now we can fit up to 1.5 million words. (And, using various other tricks, we can effectively fit 25 million words.) The emergence of long context models is, I believe, the single most unappreciated AI development of the past two years, at least among the general public. It radically transforms the practical utility of these models.