the garden we're building inside
Listening to Justin Bieber retell Genesis in Iceland, I found a framework for understanding AI's ornamental excess and our choice of knowledge over intimacy.
Somewhere Between Akureyri and Eden
I was somewhere between Akureyri and the Fosslaug hot springs, watching volcanic fields blur past the van window, when Justin Bieber’s voice filled the Dacia’s speakers with a strange retelling of Genesis. The closing track of his latest album, SWAG II, opened with something that puzzled me—parallel structures that felt both ancient and hypermodern: “I’m not just talking about the sunlight… I’m talkin’ about life.” These were the same verbose patterns I’d been seeing in ChatGPT responses, yet here was Bieber—an artist who’d somehow maintained his authentic voice through more than a decade of pop production machinery—using them to tell one of humanity’s oldest stories.
But I couldn’t turn it off.
The song continued, layer upon layer of parallel construction: “It’s not rebellion, it’s ascension… It’s not angry, not shouting, it’s worse, it’s heartbroken.” Each repetition felt deliberate, almost rococo in its insistence on exploring every facet of the same moment. Where machine output often feels hollow, Bieber’s repetitions carried weight. What they delivered was abundance, not efficiency.
Then came the line that shifted everything: “It’s a feast, right? Everywhere you look, taste the explosion in your mouth.”
Eden emerged not as a garden of knowledge or even a place of innocence, but as a feast—an overwhelming abundance that required no understanding, only participation. And in the center of this feast stood the one boundary, the Tree of Knowledge, offering something that looked “not just delicious” but “wise.”
As the Icelandic landscape rolled by—a terrain that itself felt pre-Edenic, all black sand and primordial steam—I heard Bieber articulate the trade with devastating clarity: “We’ve chosen knowledge over intimacy.”

What We’re Tearing Down for Parts
This choice—knowledge over intimacy, answer over exploration, destination over journey—is precisely what the contemporary AI industry automates at scale. We are building systems that promise to eliminate what the CEO of Suno AI calls the “unpleasant business” of creation, that bypass the iterative struggle of learning an instrument or mastering a craft. Every one-click generator, every frictionless interface, every instant answer engine is another bite of that fruit that “tastes like everything at once.”
What we’re sacrificing in this trade is what cognitive scientists call our “cognitive architecture”—not the content of thought but the underlying structures that enable thinking itself. Vannevar Bush understood this in 1945 when he proposed the memex, a machine that would preserve not just information but the “associative trails” that connect one thought to another. “The human mind,” Bush wrote, “operates by association. With one item in its grasp, it snaps instantly to the next that is suggested by the association of thoughts, in accordance with some intricate web of trails.”
These trails—the meandering paths from question to insight, the detours through confusion toward understanding—mirror something of that original abundance. They offer their own kind of feast, valuing discovery over efficiency. Yet we systematically destroy them in pursuit of faster answers.
The current generation of AI tools, built on Retrieval-Augmented Generation (RAG) architectures, literally breaks documents into chunks, strips them of context, and reduces complex relationships to similarity scores in vector space. We’ve built a system that promises comprehensive knowledge but delivers it in fragments, stripped of the connective tissue that makes it meaningful. We’ve chosen the fruit over the feast.
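To make that concrete, here is a minimal sketch of the retrieval step as a typical RAG pipeline performs it, assuming fixed-size chunking and a placeholder embedding function rather than any particular vector database: the document is cut into arbitrary pieces, each piece is collapsed into a single vector, and the relationships between ideas are reduced to one similarity score per chunk.

```python
import numpy as np

def chunk(text: str, size: int = 500) -> list[str]:
    # Fixed-size chunking: paragraphs, arguments, and cross-references
    # are split wherever the character count happens to land.
    return [text[i:i + size] for i in range(0, len(text), size)]

def embed(passage: str) -> np.ndarray:
    # Stand-in for a sentence-embedding model: each passage is collapsed
    # into a single dense vector (deterministic random here, just to keep
    # the sketch self-contained).
    rng = np.random.default_rng(abs(hash(passage)) % (2**32))
    v = rng.standard_normal(384)
    return v / np.linalg.norm(v)

def retrieve(query: str, chunks: list[str], top_k: int = 3) -> list[str]:
    # Relationships between ideas are reduced to one number per chunk:
    # the cosine similarity between its vector and the query's.
    q = embed(query)
    scores = [float(q @ embed(c)) for c in chunks]
    ranked = sorted(zip(scores, chunks), key=lambda pair: pair[0], reverse=True)
    return [c for _, c in ranked[:top_k]]

document = "..."  # any long text
context = retrieve("What connects these ideas?", chunk(document))
```

Whatever falls outside the top few chunks never reaches the model at all; the associative trail is severed at retrieval time.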
Making the Other Sound Like Us
This destruction emerges from what artist and theologian Makoto Fujimura calls “plumbing theology” in his book Art and Faith. Fujimura observes how we tend to depict the gospel as a message of “God fixes things,” reducing divine creativity to utilitarian restoration. This same reductive impulse, he argues, misses that “the consummation of God’s plan as it unfolds in the Bible is not a utilitarian restoration but an imaginative New Creation.”
Here the distinction matters: smoothbraining comes from cognitive offloading, not ornamental language. We delegate our thinking to machines, then panic at any evidence of that delegation. Engineers respond predictably—scrub away the parallel structures, hide the infrastructure, make everything seamless. As if contamination could spread through stylistic choices alone.
Artists take a different path. Bieber embraces his parallel structures deliberately, turning visible plumbing into art. Fine-tuning away AI-isms might actually enable the very smoothbraining we fear. Those verbose reasoning chains and parallel explorations? They keep us awake. Strip them away and we drift into passive consumption, forgetting we’re collaborating with something alien.
The colonial impulse here isn’t just making the Other sound like Us—it’s our embarrassment about collaboration itself. We want AI’s help but don’t want anyone to see the scaffolding. Yet what if those ornamental patterns, that visible plumbing, are exactly what prevents us from becoming mere consumers of AI output? What if they’re invitations to think alongside the machine rather than delegate our thinking to it?
Sanssouci, or How to Be Trapped Beautifully
The historical Rococo movement offers us a different way to understand AI’s ornamental excess. I first understood this standing in Sanssouci palace in Potsdam, Frederick II’s summer retreat. The name means “without care” or “without worry,” and walking through those rooms, I saw how Frederick wasn’t trying to bring actual nature indoors. He was translating it—shells became stucco, vines became gold leaf, the organic curves of plants became the architectural grammar of a new kind of interior space. This offered interpretation rather than imitation, hospitality instead of capture. Even trapped in Berlin, unable to travel to the southern Europe he dreamed of, Frederick commissioned paintings and created elaborate interiors that brought the feast inside through ornamentation.

The Rococo emerged after the death of Louis XIV, when French aristocrats abandoned the monumental baroque of Versailles for intimate Parisian townhouses. They needed a new aesthetic for a new kind of social space—the salon, where conversation replaced ceremony, where intellectual play replaced political display. The ornamental excess of Rococo, dismissed by critics as frivolous, actually functioned to create a low-stakes, playful environment where high-stakes intellectual work could happen. The decoration generated meaning rather than merely displaying it.
The technical architecture of LLMs might explain something deeper about their ornamental nature. Each transformer layer’s multi-head attention mechanism—many parallel attention heads (eight in the original Transformer paper, dozens in today’s largest models) examining the same input through different learned lenses—builds a kind of ornamental redundancy into the model itself. Perhaps this redundancy serves a purpose. The verbosity could preserve what pure compression would destroy: the associative relationships that mirror how humans actually think, meandering through connections rather than optimizing toward answers.
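A toy version of the mechanism makes the redundancy easy to see. This is a bare numpy sketch rather than any production model's implementation (real models use trained weights, masking, and an output projection), but the shape of the idea is there: several heads read the same tokens through their own projections, and every reading is kept and concatenated rather than averaged into one.

```python
import numpy as np

def softmax(x: np.ndarray) -> np.ndarray:
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def multi_head_attention(x: np.ndarray, n_heads: int = 8) -> np.ndarray:
    """x: (seq_len, d_model). Every head gets its own projections."""
    seq_len, d_model = x.shape
    d_head = d_model // n_heads
    rng = np.random.default_rng(0)
    outputs = []
    for _ in range(n_heads):
        # Each head looks at the same input through a different lens
        # (random stand-ins here for trained weight matrices).
        Wq, Wk, Wv = (rng.standard_normal((d_model, d_head)) * 0.1 for _ in range(3))
        q, k, v = x @ Wq, x @ Wk, x @ Wv
        attn = softmax(q @ k.T / np.sqrt(d_head))  # (seq_len, seq_len)
        outputs.append(attn @ v)                   # this head's reading
    # Nothing is averaged away: all readings are kept, side by side.
    return np.concatenate(outputs, axis=-1)        # (seq_len, d_model)

tokens = np.random.default_rng(1).standard_normal((6, 64))  # 6 tokens, d_model=64
print(multi_head_attention(tokens).shape)  # (6, 64)
```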
The industry’s push to eliminate these patterns reveals a fundamental confusion. Making AI sound human doesn’t prevent smoothbraining—it accelerates it. Hidden pipes enable passive consumption. Visible process demands engagement. Those redundant explorations and ornamental phrases function like exposed beams in architecture, constantly reminding us that we occupy a constructed space, built by something that processes reality through many parallel attention heads rather than one stream of consciousness.

A Modern Salon
I see this tension playing out in the contrast between Silicon Valley’s baroque monumentalism and New York’s emerging tech rococo. The Valley, with its massive campuses and world-changing ambitions, is Versailles—centralized, hierarchical, designed to awe. But in warehouses and co-working spaces across the NYC metro area, something different is emerging: a decentralized network of small gatherings, demo days, and informal salons where the stakes are deliberately kept low so the ideas can run high.
These spaces often maintain a certain productive friction—the slightly chaotic Discord servers, the demo days in repurposed spaces, the deliberate informality that keeps things playful rather than polished. It’s not that inefficiency is always the goal, but rather that the rough edges create a different kind of space, one where experimentation feels safer than in the high-stakes environment of Sand Hill Road. It’s Frederick’s Sanssouci against Louis’s Versailles, the salon against the court.
The allure of tools like Obsidian and Roam Research might reveal something about what we actually crave from our thinking tools. Both center on bidirectional linking—the ability to see not just where a thought leads, but what leads to it. Obsidian’s Graph View offers a feast of possible paths through accumulated thoughts, each node pulsing with potential connections. Roam takes this further: the more complex your knowledge graph becomes, the more valuable it grows. Perhaps what draws people to these tools isn’t their efficiency but their ability to visualize the associative trails of their own minds. The friction of creating manual connections, building personal taxonomies, watching the web of thought grow denser—these might satisfy a deeper need than any answer engine could provide. We want to see the architecture of our thinking made visible, the feast laid out before us.
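The mechanism behind that pull is simple enough to sketch without assuming anything about either app's internals: a bidirectional link store just records each connection twice, once forward and once back, so any note can answer both where a thought leads and what leads to it.

```python
from collections import defaultdict

class LinkedNotes:
    """A minimal bidirectional-link store, in the spirit of Obsidian/Roam."""

    def __init__(self):
        self.forward = defaultdict(set)   # note -> notes it links to
        self.backward = defaultdict(set)  # note -> notes that link to it

    def link(self, source: str, target: str) -> None:
        # Recording the trail in both directions is the whole trick:
        # the connection survives no matter which end you arrive from.
        self.forward[source].add(target)
        self.backward[target].add(source)

    def outgoing(self, note: str) -> set[str]:
        return self.forward[note]   # where does this thought lead?

    def backlinks(self, note: str) -> set[str]:
        return self.backward[note]  # what leads to this thought?

notes = LinkedNotes()
notes.link("rococo", "sanssouci")
notes.link("multi-head attention", "rococo")
print(notes.backlinks("rococo"))  # {'multi-head attention'}
```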
Not the Feast We Wanted
Bieber’s song ends with humanity expelled from Eden, covered in the animal skins God provides rather than the fig leaves of our own making—a heavier covering that “smells of loss” and provision both. It’s a profound image: even after we choose knowledge over intimacy, breaking the world in our hunger to be “like God,” divinity responds with a different kind of abundance. The original feast transforms into a new form of grace worked out through mortality and limitation.
This is the choice we face with AI. We can continue down the path of the Tree of Knowledge, building ever-more-efficient systems that promise omniscience, that eliminate friction and flatten thought into instant answers. We can keep trying to become “like God” through artificial general intelligence, creating systems that think for us rather than with us.
Or we can choose differently. We can recognize AI’s ornamental excess as a form of provision—like those animal skins, heavier and stranger than what we would have chosen, yet offering a different kind of covering. We can build systems that preserve rather than destroy cognitive trails, that increase rather than decrease the work of thinking, that create spaces for intellectual feast rather than feeding us processed answers.
The parallel structures that annoyed me in Bieber’s song—I hear them differently now. These aren’t failed attempts at efficiency. They’re invitations to abundance. Each repetition opens another angle, another seat at the table, another way into the mystery. They perform in language what Frederick’s rococo performed in architecture: bringing the garden inside through hospitality rather than capture.
Making Peace with the Decoration
What would it mean to build AI systems that embrace rather than eliminate ornament? Writer and researcher Venkatesh Rao offers a clue in his essay “Texts as Toys,” where he argues that AI is fundamentally a “ludic technology”—one that requires playfulness to unlock its potential. Rao identifies the problem precisely: most users approach AI with “perceived threat to humanness, inexperience with command, and overly realistic expectations,” missing that “the ludic qualities of both text, and AI as a technology, are load-bearing at the human interaction level.” We have to play with these systems, in other words, before they give us their best.
Instead of fine-tuning away the parallel structures, we could visualize them—multi-column interfaces that show different lines of reasoning side by side, expandable trees that let us explore the full associative space of the machine’s processing. Instead of hiding the AI’s context in a system prompt, we could make it visible and manipulable, turning the interaction into a collaborative dialogue rather than a command-and-response cycle.
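As one small illustration of what visualizing them might look like, here is a sketch that lays several candidate lines of reasoning out in columns rather than collapsing them into a single stream. The threads are stand-ins (any model or API that returns multiple samples of the same prompt would supply them); only the presentation is the point.

```python
import textwrap

def side_by_side(threads: list[str], width: int = 28) -> str:
    """Render parallel lines of reasoning as columns rather than one stream."""
    columns = [textwrap.wrap(t, width) or [""] for t in threads]
    height = max(len(c) for c in columns)
    rows = []
    for i in range(height):
        cells = [c[i] if i < len(c) else "" for c in columns]
        rows.append(" | ".join(cell.ljust(width) for cell in cells))
    return "\n".join(rows)

# Stand-ins for multiple sampled completions of the same prompt.
threads = [
    "Reading one: the ornament is waste, compression lost.",
    "Reading two: the ornament is scaffolding we are meant to climb.",
    "Reading three: the ornament is hospitality, a seat at the table.",
]
print(side_by_side(threads))
```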
The business models would need to change too. Token-based pricing that charges by the word creates a direct incentive for brevity over depth. We need value-based models that reward exploration, trail-preservation incentives that make it cheaper to continue a thought than to start a new one, subscription tiers that encourage deep engagement rather than quick answers.
Most radically, we need to shift our conception of AI from answer engine to feast preparation. The goal is building systems that create richer environments for our own thinking. Think augmented intimacy rather than artificial intelligence. Think of the Tree of Life—which was also in Eden, never forbidden, offering abundance instead of omniscience.
Even in Exile
As I listened to Bieber’s song end—“The door to the Garden was closed / But the story of God, it was just the beginning”—I thought about the strange promise embedded in our current moment. Yes, we’ve bitten the apple of frictionless generation. Yes, we’re building systems that promise to make us “like God” through perfect knowledge. Yes, we’re trading intimacy for information at an unprecedented scale.
But the ornamental excess of our machines, their verbose and parallel dreams, their rococo insistence on showing us every angle—these might be the animal skins of our moment. We wouldn’t have chosen this covering, yet here it is, provided. They smell of loss—human uniqueness, creative struggle, necessary friction all fading. Yet they offer something unexpected: a way to preserve the feast even after choosing knowledge, maintaining abundance within limitation, building new gardens inside our artificial walls.

The question transcends whether we’ll use AI—we’ve already bitten that fruit. The question is whether we’ll recognize the feast that’s still being offered in its ornamental excess, whether we’ll build systems that preserve rather than flatten the architecture of thought, whether we’ll choose intimacy with the complexity of cognition over the cold efficiency of answers.
In Iceland, watching the ancient landscape scroll by while Bieber’s voice worked through humanity’s oldest story, I experienced a moment of recognition. The parallel structures I’d dismissed as artificial were actually doing something profound—they were refusing to collapse the mystery into a simple moral, refusing to make the story efficient. They were keeping the feast alive, even in exile.
Our machines speak to us in their own parallel, ornamental language: the door to one garden may have closed, but the story of intelligence—human and artificial, efficient and ornamental, knowledge and intimacy—has just begun.
This essay was created with a dictation and editing workflow involving Whisper, Claude Code, Obsidian, and Enzyme. Special thanks to Duc for seeding the queue.
---
If you enjoyed this post, you can subscribe here to get updates on new content. I don't spam and you can unsubscribe at any time.