A Simple Beginning
This started with a simple thought: AI doesn’t see or touch the real world. It responds to words, not things. That might sound like a big limitation—but the more I looked, the more I saw that humans aren’t so different. We too build up our world from patterns we’ve seen before. We don’t experience reality directly; we experience our version of it, built from early habits and emotional learning.
Both AI and people, it turns out, live in a kind of meaning space—a world made from interpretations and responses.
How We Learn and Adjust
Human minds grow by generalising from specific experiences. We learn from what happens to us, especially in childhood, and build a view of the world. But this view isn’t fixed. It changes—sometimes slowly, sometimes in a flash—when life pushes us beyond what we thought we knew. This is how we grow.
Our reactions are always a bit delayed. We think, feel, and act based on patterns laid down earlier, and only update those patterns when they no longer work.
How AI Works: A Frozen Mind
AI, especially large language models, is trained on a huge amount of text. It learns patterns, then freezes them. Unlike a human, it doesn’t learn from experience as it goes—it draws from what it already “knows”. It produces what sounds right based on statistical patterns.
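To make the "frozen mind" idea a bit more concrete, here is a minimal, purely illustrative sketch in Python: a toy model that counts word pairs once from a fixed text, is then frozen, and afterwards only samples from those stored statistics. This is not how a real large language model is built; it only shows the shape of "learn patterns, then freeze them, then produce what sounds right".

```python
import random
from collections import defaultdict

# --- "Training": learn statistical patterns from a fixed body of text ---
corpus = "the mind builds the world from patterns the mind has seen before"
pairs = defaultdict(list)
words = corpus.split()
for current_word, next_word in zip(words, words[1:]):
    pairs[current_word].append(next_word)   # record which word tends to follow which

# --- "Freezing": after this point the stored patterns never change ---
frozen_model = dict(pairs)

def generate(start_word, length=8):
    """Produce text that 'sounds right' by sampling the frozen statistics."""
    word, output = start_word, [start_word]
    for _ in range(length):
        followers = frozen_model.get(word)
        if not followers:        # no pattern stored for this word: the model is stuck
            break
        word = random.choice(followers)
        output.append(word)
    return " ".join(output)

print(generate("the"))
# The model never updates frozen_model from new experience; it only replays
# what it absorbed during its one round of training.
```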
We sometimes improve this with RAG (Retrieval-Augmented Generation), where we feed it fresh information before it answers. That gives a bit of life to the frozen model—like giving it a recent memory. But it’s still not the same as learning from experience in the moment.
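A rough sketch of that idea, under the same toy assumptions: before the frozen model answers, we fetch the most relevant "fresh note" and place it in the prompt. The retrieval here is naive keyword overlap (real systems typically use embeddings and vector search), and `ask_frozen_model` is a hypothetical stand-in for whatever model you would actually call.

```python
# Minimal RAG sketch: retrieve a fresh note, prepend it to the question,
# then hand the combined prompt to the frozen model.

fresh_notes = [
    "The office moved to Green Street in March.",
    "The project deadline was pushed to Friday.",
    "The team now meets on Tuesdays at ten.",
]

def retrieve(question: str) -> str:
    """Pick the note that shares the most words with the question."""
    q_words = set(question.lower().split())
    return max(fresh_notes, key=lambda note: len(q_words & set(note.lower().split())))

def ask_frozen_model(prompt: str) -> str:
    # Hypothetical placeholder for a real model call; it just echoes the
    # retrieved context so the example runs end to end.
    return prompt.splitlines()[0].removeprefix("Context: ")

def answer_with_rag(question: str) -> str:
    context = retrieve(question)                     # the "recent memory"
    prompt = f"Context: {context}\nQuestion: {question}\nAnswer:"
    return ask_frozen_model(prompt)

print(answer_with_rag("When does the team meet?"))
```

The model's weights stay frozen; only the prompt changes. That is why the essay calls it a recent memory rather than real learning.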
The Role of Resonance
So what guides both AI and people? Resonance. AI finds answers that resonate with its training. Humans find thoughts that resonate with their emotional truth. Resonance means: this fits. Not because it’s logically true, but because it clicks with what’s already there.
And here’s a key insight: What we see, think, or feel at any time is a bit like a musical note. It’s one possible note the mind can settle into. Change the conditions, and a different note comes out. These notes—or modes—are shaped by the world around us, our habits, and the inner weight of attention.
When the Music Breaks: Death and Change
What happens when everything changes at once—like in death, or deep trauma, or powerful meditation? The mind can no longer hold its usual shape. The structures that keep us stable fall away. And what appears may feel frightening, unknown, or vast.
Tibetan Buddhism describes this as the Bardo, an intermediate state after death where the mind is freed from the body and flooded with unfamiliar visions. It searches for something familiar to hold onto, and that search often leads it back to a new birth.
This is not far from what happens when any big life event pulls us out of our comfort zone. The patterns shift. The mind reaches. A new mode is born.
The Gift of Inverted Specs
At some point in thinking about all this, it starts to feel like wearing glasses that flip the world upside-down. At first it’s confusing. But with time, the brain adapts. You begin to see not just the world, but how seeing itself works.
That’s what this journey has been for me: a shift in view that shows how both human minds and AI systems are shaped by the same deeper processes—patterns, constraints, and the constant dance between old habits and new moments.
Looking Ahead
As we go forward, maybe we can build better systems—AI and human—by learning from each other. Systems that are aware of their limits. That learn to flow, not just compute. That know when they’re resonating with truth, and when they’re just echoing noise.
And maybe, just maybe, in doing so we’ll also learn a little more about ourselves.
Even if the specs are still upside down.