built in our image

AI products extend the kind of intelligence that lives in text. The kind that holds rooms together has no tooling at all.

Every AI product I can name extends the same kind of intelligence. Vault tools surface connections between ideas. Writing assistants organize output. Research synthesizers find patterns across documents. Code tools autocomplete logic. The input is text. The output is text. The bottleneck being solved is always some version of: I have thoughts and need help structuring them, or I need help generating thoughts and then structuring them.

This works for people whose intelligence lives in text. Internal processors. Abstract connectors. People who accumulate written archives and want to navigate them. It works, in other words, for the people who built the tools.

There’s a kind of intelligence that never enters a text field. The person who tracks that someone in the group doesn’t eat shellfish. The one who notices who hasn’t spoken at dinner and shifts the conversation to draw them back in. The friend who remembers you mentioned a hard week and checks in three days later without being asked. This is real-time social field processing — reading a room, holding a network, anticipating what people need before they’ve named it. The words we have for it make it sound supplementary: emotional labor, people skills, social awareness. Something adjacent to the real thinking happening elsewhere.

No AI product extends this. The entire industry has organized itself around a specific cognitive profile — the kind that processes internally, makes meaning through language, and produces written artifacts — and called it “intelligence augmentation.” Everything else is personality.

Cognitive function theory (the engine underneath Myers-Briggs, if you can get past the internet version) maps this clearly. There are functions that process the internal world and functions that process the social field. Functions that structure through logic and functions that structure through values. Current AI products cluster almost entirely around two: abstract pattern recognition and external organization. These are the functions that make someone productive with text. The functions that make someone productive with people — relational memory, social field awareness, care coordination — have no tooling at all.

The gap isn’t technical. The technical capability to surface patterns across someone’s text messages, calendar, event attendance, and social interactions already exists. The gap is conceptual. We haven’t categorized relational intelligence as intelligence, so we haven’t built tools to augment it. CRMs look like they belong in this space, but they reduce relationships to pipeline management. Contact apps store names and numbers. Nothing does for the social field what a vault tool does for an archive of ideas: surface the patterns, reveal the threads, make the implicit visible.

In December, Spotify tells a hundred million people what they listened to all year. Nobody finds this invasive. The data was always there — every play, every skip, every 2am repeat. Wrapped turns it into recognition. Here’s who you were this year. Here’s what you reached for. The portrait is personal and shareable. For a week, everyone curates their own taste, and it costs nothing because the taste was already being generated through the living.

Something like a Wrapped for care would read the data someone is already generating — messages sent, events hosted, people remembered, check-ins made — and reflect it back with the same quality of recognition. You showed up for these people this year. Here’s how you held this group together. Here’s the friend you used to talk to weekly who went quiet in October. Not as optimization. Not as a dashboard. As memory. As evidence that the invisible work was real.

The person who would benefit most from this kind of tool probably wouldn’t describe themselves as needing an AI product. The work they do — holding community together, maintaining relational networks, showing up — feels like who they are, not like a skill to be augmented. There’s a reason for that. Care costs something, and the cost is part of what makes it real. Carrying someone in your mind all week is a different act than being reminded by an app. The person doing the carrying knows the difference.

So the design constraint is specific: the tool has to feel like memory, not strategy. Recognition, not optimization. It has to honor the labor rather than instrumentalize it. And it has to operate on data the user is already generating — behavioral data, relational data, the patterns of care that accumulate in group chats and calendars and the small acts of showing up — without requiring them to become a different kind of thinker first.

We’ve built AI in the image of one kind of mind. The kind that writes things down, processes through text, and values the connection between ideas. A significant portion of human intelligence lives in practice rather than prose. It holds rooms together. It remembers the allergy. It texts after the hard week. It deserves tooling that sees it clearly, built by people who understand that intelligence has always been wider than the archive.
