The Cathedral and the Spark
On a quiet contradiction at the heart of the agentic age
I left the fireside chat with a feeling I could not name for several hours. It was not disagreement, exactly. Tom Preston-Werner and Peter Levine had been generous, lucid, occasionally electric — the sort of conversation you walk out of feeling lucky to have witnessed. And yet something underneath the room had been off, like a chord struck slightly out of tune, the dissonance only noticeable once the music stopped.
It took me until later that evening, walking, to find it. I had heard, in the same room, on the same stage, two visions of the future that cannot both be true. And no one — not the speakers, not the audience, not the murmuring agreement of LinkedIn the next morning — seemed to notice.
I want to walk you through what I heard, because I think the contradiction is not theirs alone. It is everywhere now. It is in the keynotes and the manifestos and the posts that do numbers. And once you see it, you cannot unsee it, and you begin to suspect that an entire industry is sleepwalking past the most interesting question of our time.
The first vision goes like this. Software, as we have known it, is ending. The applications you use, the services you subscribe to, the open source libraries that quietly run civilization — all of it is becoming the candle wax of a vanishing era. In the world to come, you will not download an app to track your expenses; a model will generate one, briefly, for the moment you need it, and then dissolve it back into the void. Code becomes ephemeral. Interfaces become ephemeral. Even the kernels and the silicon, eventually, will be tuned for this new physics: a foundation of use-and-discard, an economy of disposable cognition. Why preserve, when you can summon? Why archive, when you can re-derive?
It is a stunning vision, and there is something in it that is true. But sit with it a moment. Hold it in your hand and turn it slowly, the way you would turn a strange stone you found on a beach.
Now consider the second vision, often delivered by the same speakers, sometimes within the same breath. Agents — these new soft creatures we are building — must be given identity. They must have memory. They must understand their environment. They must observe constraints. They must learn to be social, to negotiate with one another, to develop something like preferences, something like character. We anthropomorphize them not from sentiment but from necessity, because nothing useful happens without it. A nameless, memoryless, contextless model is a hand without an arm; it cannot reach.
Do you feel it now? The chord out of tune?
Because these are not two complementary stories. They are opposites. One says intelligence is a fountain — bright, instantaneous, leaving no residue. The other says intelligence is a river delta — layered, sedimented, the present shaped entirely by what has accumulated. The first says: build nothing, retain nothing, the model will provide. The second says: without retention, without identity, without the slow accretion of habit and memory, there is no model worth talking to.
You cannot have both. And yet both are being sold, often by the same people, often as the same future.
Let me put it another way, because I think the strangeness deepens when you look at it from a different angle.
You and I are not ephemeral. I do not mean in the sentimental sense — though that too — but in the architectural one. Your brain is not, at this moment, generating you from scratch. It is retrieving you. The way you tie your shoes, the way you reach for a cup, the way you recognize your mother’s voice across a crowded room — none of this is being recomputed. It was computed, once, slowly, painfully, with much exploration, and then it was crystallized. Pressed into the quiet rock of procedural memory, where it waits to be summoned the next time you need it.
This is not a limitation of biology. This is biology’s most beautiful trick. You are ephemeral on the surface — every thought one-shot, every utterance unrepeatable, every motion of your hand a small disposable miracle — precisely because you are persistent in the substrate. The disposability of the surface is paid for by the durability of what lies beneath. Take away the substrate and you do not get a more fluid being. You get a being that cannot function. A being that, every morning, would have to learn again that the floor is solid and the cup has a handle and the word for mother is mother.
What we call personality is just compression. It is the long sediment of attention, preference, and habit, packed down by years of living, retrieved instantly when needed. Without it, you would not be a person. You would be weather.
And so when I hear that the future of software is pure generation — that nothing should persist, that everything should be summoned and dispersed — I want to ask, gently: have you considered how a brain works? Because we already ran the experiment. Three billion years of it. And the answer evolution arrived at, again and again, in every nervous system it bothered to build, was the same answer: make the surface cheap by making the substrate deep.
So why do intelligent people, on lit stages, in front of cameras, claim the opposite?
I have come to believe there are several reasons, and they are worth naming.
The first is that the two stories live at different layers, and the speakers conflate them. The ephemeral story is mostly true at the leaf — the specific UI for this one task, the specific glue code for this one moment, the throwaway query plan. The persistent story is true at the trunk — the model’s identity, the user’s memory, the procedural knowledge of how to actually do anything in the world. To say “the leaf is disposable” is unremarkable; leaves have always been disposable. To say “therefore the tree is disposable” is a category error so large it should be visible from orbit, and yet it is the implicit claim of half the discourse.
The second reason is more cynical, and I will not dwell on it long. The ephemeral pitch sells a market story: incumbents are doomed, foundation models capture all the value, the trillion dollars of SaaS is ours to inherit. The anthropomorphic pitch sells a product story: here is why our agent is sticky, here is why users will return, here is the moat. The first is what you say to investors. The second is what you say to customers. The contradiction is not a mistake. It is a feature of the genre.
The third reason is that “ephemeral” is doing rhetorical work that “stateless” would do more honestly. A great deal of what gets called ephemeral is really just stateless at one layer with state pushed elsewhere. The model does not remember, but the harness does. The vector store does. The tool registry does. The user’s accumulated context does. To call the whole system ephemeral because one component is stateless is like calling a CPU ephemeral because its registers get clobbered every cycle. It is true in a narrow technical sense and deeply misleading in every sense that matters.
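To make the layering concrete: here is a toy sketch, in Python, of a system that is stateless at the model layer and stateful everywhere else. Nothing in it is a real API; `call_model` stands in for any stateless model endpoint, and the `Harness` fields stand in for the history, long-term store, and tool registry named above.

```python
from dataclasses import dataclass, field

def call_model(prompt: str) -> str:
    # Stand-in for a stateless model call: same prompt in, answer out,
    # nothing remembered between calls. (Hypothetical; a real client
    # would go here.)
    return f"answer to: {prompt}"

@dataclass
class Harness:
    # All persistence lives here, not in the model.
    history: list[str] = field(default_factory=list)      # episodic memory
    facts: dict[str, str] = field(default_factory=dict)   # long-term store
    tools: dict[str, object] = field(default_factory=dict)  # tool registry

    def ask(self, question: str) -> str:
        # Assemble context from accumulated state...
        context = "\n".join(self.history[-5:])
        prompt = f"{context}\n{question}" if context else question
        # ...call the stateless component...
        answer = call_model(prompt)
        # ...and write the exchange back into the substrate.
        self.history.append(question)
        self.history.append(answer)
        return answer

h = Harness()
h.ask("what color is the sky?")
h.ask("and at sunset?")      # the second call sees the first in its context
assert len(h.history) == 4   # the harness remembered; the model did not
```

The inner function sees only the prompt it is handed. Calling the whole system ephemeral describes that inner function, not the system.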
The fourth reason, and the one I find most interesting, is that we have arrived for the first time in computing history at a moment where the generator and the generated look like the same kind of thing. Both are text. Both are “code.” When the printing press made books, no one confused the press with the book. When the compiler made binaries, no one confused the compiler with the binary. But when a language model emits a program, there is a vertiginous sense that the distinction has dissolved — that perhaps we no longer need both, that perhaps the model can simply be the program, freshly, every time. It is an illusion. The press is not the book. The model is not the program. Generators require state; their outputs do not. Forgetting this is the root error of the ephemeral creed.
Here is what I think is actually happening, beneath the contradiction.
We are watching an industry reason about intelligence from two directions at once, without yet realizing that the directions will meet. From the top, the ephemeralists are reasoning: if generation becomes free, what stops being necessary? And they conclude, breathtakingly: almost everything. From the bottom, the agent-builders are reasoning: what does this stateless model need in order to do anything useful? And they conclude, just as breathtakingly: almost everything. Identity, memory, environment, social context, constraints — the entire scaffolding of a self.
These two groups are walking toward each other in a fog, and when they meet — and they will meet, soon — they are going to find that they have been describing the same architecture from opposite ends. The surface is ephemeral. The substrate is persistent. The boundary between them is where the engineering lives. The boundary between them is, in fact, the only interesting place to be.
The ephemeralists will quietly add memory and call it caching. The anthropomorphizers will keep the surface disposable and call it generation. Neither will admit the convergence, because admitting it would mean admitting that the slogans were always wrong. But the convergence will happen anyway, because physics does not negotiate with marketing.
What we are really building, if we are honest, is something with the same shape as a mind. Crystallized procedures underneath. Ephemeral execution on top. A self that retrieves more than it generates, because retrieval is cheaper and more reliable, and a generator that fires only when retrieval fails. This is not a new architecture. It is the oldest one we know. Evolution found it. Brains run on it. The cathedral of identity, with the spark of the present moment burning briefly in its nave.
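The retrieve-first, generate-on-miss shape is, at its smallest, just a memoizing cache over an expensive generator. A minimal illustration with hypothetical names (`generate` stands in for any costly model call; the counter exists only to show how rarely it fires):

```python
calls = {"generate": 0}

def generate(task: str) -> str:
    # The expensive, ephemeral step: fires only when retrieval fails.
    calls["generate"] += 1
    return f"procedure for {task}"

crystallized: dict[str, str] = {}   # the persistent substrate

def perform(task: str) -> str:
    # Retrieval is cheap and reliable; try it first.
    if task not in crystallized:
        # Retrieval failed: pay for generation once, then crystallize.
        crystallized[task] = generate(task)
    return crystallized[task]

perform("tie shoes")
perform("tie shoes")
perform("tie shoes")
assert calls["generate"] == 1   # generated once, retrieved thereafter
```

Each procedure is paid for once and retrieved forever after. The generator is the fallback, not the default.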
I think this is what unsettled me, walking out of that fireside chat. Not that anyone said anything wrong, exactly. They are all brilliant; they are all partly right. What unsettled me was the silence around the contradiction — the way two incompatible futures were presented as one future, the way no one in the room seemed to notice that the chord was out of tune.
Because the question of what persists and what dissolves is not a small question. It is the question. It is the question every nervous system has had to answer. It is the question every civilization has had to answer with its libraries and its languages and its laws. It is the question we are now being asked to answer again, quickly, under commercial pressure, with billions of dollars riding on whichever answer flatters the current quarter.
We will not get the answer right by listening to the loudest voices. We will get it right by remembering what we already know about ourselves. That a being that generates without memory is not free; it is amnesiac. That a system that discards its history is not efficient; it is doomed to rediscover the same wheel until its budget runs out. That identity is not a constraint on intelligence — it is the ground of intelligence, the dark soil out of which any useful spark can rise.
The future of software will not be ephemeral. It will not be eternal either. It will be what minds have always been: a cathedral of slow stone, with a small bright fire burning at its altar, and the architects, if we are wise, will spend most of their time on the stone.
The fire is the easy part. The fire takes care of itself.
It is the cathedral that is hard.