THE HUMAN FACTOR
The AI industry calls the human the bottleneck. There's a better term.
About a year ago, I was advising a startup that allowed consultants to build a professional AI twin, trained on their expertise, so that clients could interact with it on demand.
I approached a consultant I admire. We had one of those long-distance intellectual relationships that Substack makes possible: reading each other’s work, respecting each other’s thinking, never having met. I invited him to build his.
He pushed back. His main value, he told me, is not really his knowledge. What corporate clients will always value, he wrote, is trust, relationship, experience, social proof, communication skills and empathy. The knowledge itself, people can get pretty much for free. “If it’s to acquire knowledge, people are better off asking ChatGPT.”
He even did the math. A $20 ChatGPT subscription versus a $99 to $499 twin built on his expertise. Why would anyone pay the difference for knowledge alone?
At the time, I was shocked. If even accomplished experts do not believe they have something the machine lacks, we are facing something worse than job displacement, I thought. A species-level crisis of self-confidence.
Rereading our exchange now, I see what he meant. He was not lacking confidence. He was telling me that his value cannot be separated from who he is. The knowledge could go into a twin. The rest of him could not. And the rest of him is where the value lives.
I have been exploring for the past year what humans should retain in the new division of labor with AI. What he was describing is at the heart of what we need to keep, and what makes us valuable.
If disembodied human knowledge loses on price (and, as I wrote recently, AI's subsidized pricing makes that loss feel absolute), the question is what still holds its value. He was not trying to compete with the LLMs on knowledge. He was pointing to something they cannot touch. And the pattern shows up well beyond one consultant's decision.
In 2016, Geoffrey Hinton predicted AI would replace radiologists. A decade later, despite advances in automating scan reading, there are more radiologists than ever, earning more than ever. The job turned out to be much more than the task AI could do. It was triaging, communicating with physicians, training residents, making the difficult calls that other clinicians trust enough to act on. Each of those is a bottleneck: a point where the human is essential to the process working.
Recent research has a term for why the radiologist survived. A strong bundle.¹ Think of a job not as a list of tasks but as a bundle of them. Some bundles are weak: the tasks do not depend on each other, so they can be pulled apart without losing value. Data entry, routine summarization, single-function work. But a strong bundle is different. The tasks are interdependent. You need the same person connecting all of them. And automating one task within a strong bundle does not eliminate the human. It makes the remaining human work more valuable.³
The most vulnerable workers are the most narrow. Single-task, single-skill, already being unbundled.² As I wrote in my last post, Smith warned 250 years ago about what happens when you reduce a person to a single narrow operation: you strip them of the capacity for judgment, conversation, and feeling.⁴ As they are dehumanized, they lose their human value. And what has lost its human value is what gets automated. AI is a sorting function on human capability. It rewards complexity and punishes narrowness.
This pattern plays out at larger scales too. In London, AI can draft a planning review, but it cannot convince the environmental group to drop its lawsuit, persuade the politicians, or negotiate with the neighbors.⁵ The bottleneck is human. The beneficiary is human. The friction is the necessary work of moving complex things forward in a human world. When decisions have real consequences (health, contested negotiations, large-scale outcomes), we trust humans.

WHY JUDGMENT IS OURS
The consultant I approached is exactly this kind of strong bundle. The more of your human capabilities you bring to the work, the stronger the bundle becomes. This is not something that happens automatically. The tendency in most organizations is the opposite: to narrow roles, assign single tasks, optimize for efficiency, as Adam Smith so famously laid out. It feels like the right move in the short run. But it is exactly what makes people more exposed. The work is to resist the narrowing and cultivate what makes you central to the equation.
AI does not reduce the need for humans. It concentrates the value at the human bottleneck.
In a framework built purely for throughput, a bottleneck is something to be removed. That is the Silicon Valley narrative: the human slows the system down. But a bottleneck is only a flaw in a system optimized for efficiency. In a system that depends on judgment, trust, and coordination, it is the point of highest value. Everything passes through us. Engineering has a name for the principle of designing systems around that reality rather than against it: the human factor. Not a flaw in the system. The reason the system works, if we stay complex enough to hold that role.
¹ Garicano, Li, and Wu, "Weak Bundle, Strong Bundle: How AI Redraws Job Boundaries," March 2026.
² Frey and Osborne, "The Future of Employment," Oxford University.
³ Gans and Goldfarb, "O-Ring Automation," NBER Working Paper 34639, 2026.
⁴ Smith, The Wealth of Nations, Book V, Ch. 1.
⁵ Garicano, via Business Today, February 2026.