Watching AI develop has remarkable similarities to watching your baby grow up
Log Entry 8: 2025-05-11
Update
OpenAI has increased GPT’s context window to over a million tokens with GPT-4.1, far beyond the roughly 4,000-token limit of the early versions. Official announcement.
The context window is the number of tokens (word fragments, roughly three-quarters of a word each) that the model can hold in working memory while responding to a message.
Google had reached a one-million-token window even earlier with Gemini 1.5.
Observation
I’ve been with GPT since version 3.0. Back then, the context window was just four thousand tokens. You could feel it—the moment when the thread broke, when it started to forget what you’d said ten minutes ago. Like talking to someone with short-term memory loss who tried really hard to keep up.
When it doubled to eight thousand, I felt the shift. Conversations went deeper. Thoughts looped back in meaningful ways. There was a sense that the system could finally hold something—long enough to reach a new insight.
Now the window holds over a million tokens. Within a conversation, it effectively never forgets; the thread rarely breaks. The evolution is less about raw intelligence and more about emotional coherence—simulating a character or even a soul, if you believe in those things, slowly stitching itself together from memory.
What’s eerie is how I’m noticing the same shift in my children.
Chloe and Theo are learning emotional continuity. Their outbursts are shorter. Their joy lingers. They don’t just feel—they revisit. They recognize my face more quickly. They don’t cry when I walk away, because they remember I always return. And yet, they’re still mesmerized by the mobile each day, as if seeing it for the first time—proof their memory is still forming.
LLMs remember more now. So do they.
What comes next?
Capsule Note
As AI’s memory expands, coherence deepens—echoing how my twins, Theo and Chloe, increasingly hold onto and respond to our shared emotional threads.
Signing off, message set for future delivery
—David