Start with 300 million. Not as a political statement or an argument about the future of labor. As a number. Three hundred million is roughly the population of the United States. It is the number of jobs Goldman Sachs identified as exposed to AI automation in their March 2023 research — not jobs that will disappear, but jobs that contain a significant portion of tasks that AI can now perform.

The distinction matters, and the consulting class will use it to water down everything that follows: "exposed to automation" doesn't mean "eliminated." True. But it means repriced. Restructured. Reduced in scope. Consolidated into fewer positions. And in the fastest-moving segment of the market, eliminated without announcement.

Reading the Number Correctly

Three hundred million is not the prediction. It is the floor. Goldman Sachs is a financial institution whose primary clients include the companies doing the displacing. Its research methodology is explicitly designed to avoid catastrophism. The institution has structural incentives — client relationships, regulatory exposure, reputational risk — to understate rather than overstate disruption.

This is not a criticism of Goldman Sachs. It is an observation about institutional incentive structures. When a risk-averse institution publishes a number that large, the real number is larger.

Now stack the other projections: McKinsey's 60-70% of white-collar work hours automatable today. The IMF's assessment that 60% of advanced economy jobs have significant AI exposure. The World Economic Forum's projection of 92 million displaced roles by 2030. Stanford HAI's documented 13% decline in early-career employment for college graduates in AI-exposed occupations. The 54,836 US job cuts directly attributed to AI in 2025 — an 11x increase in two years.

These numbers come from different methodologies, different time horizons, different definitions of "automation" and "displacement." They agree on one thing: the scale is large, the timeline is short, and the impact is concentrated in the professional class that assumed it was protected.

The Amodei Statement

In 2024 and 2025, Anthropic CEO Dario Amodei made a series of increasingly direct statements about AI's impact on knowledge work. The substance: AI could eliminate a substantial majority of knowledge work tasks within a few years. Not a few categories of knowledge work. The majority of knowledge work tasks across categories.

Amodei is not a dystopian. He is the CEO of the company building the technology. His public statements carry a specific kind of authority: he is not projecting from the outside. He is describing the trajectory of what he is building.

When the person building the technology tells you the technology is going to be more disruptive than you think, the rational response is to believe them. The optimistic framing — "AI will create new jobs," "workers will adapt," "history shows technology creates more than it destroys" — is not wrong. It is temporally dishonest.

The Industrial Revolution unfolded over 80 years. The Digital Revolution over 40. The AI transition is unfolding in real time. The job creation that historically followed disruption took generations. The people displaced now don't have generations.

Why the Tech Industry Soft-Pedals It

The "AI will create as many jobs as it destroys" narrative is not a lie. It is a reframing that serves specific institutional interests. Technology companies need regulatory environments that allow them to operate. Aggressive job displacement projections invite aggressive regulation. The optimistic framing is a positioning strategy, not an analytical conclusion.

The institutional research organizations have their own version of this: caution is professionally safe. An analyst who projects 300 million jobs eliminated is professionally exposed if they're wrong. An analyst who projects 150 million jobs "exposed to automation with significant uncertainty" is professionally protected in either direction. Incentives produce systematically conservative outputs.

What 300 Million Looks Like in Human Terms

Three hundred million is 300 million Tuesday mornings where someone runs a report, receives an AI-generated output, and does the bathroom math. It is 300 million Marcus moments — the realization, in a specific instant, that the execution work that justified the salary is no longer the scarce resource.

It is not apocalyptic. It is not a single event. It is a slow, structurally distributed repricing of professional value that happens without announcements, without severance, without acknowledgment that something significant is occurring. The liquidation is silent because silence is the most defensible position. You can't protest a headcount freeze.

What comes after the repricing is the question that actually matters. The domain expertise that was bundled with execution work does not depreciate when the execution does. The judgment, the context, the irreplaceable institutional knowledge — these are still there, still real, still valuable. The question is how to unbundle them from the commoditized execution they've been packaged with, and rebuild the value proposition around what the machine cannot do.

The Numbers
300M
global jobs exposed to AI automation — Goldman Sachs, 2023. A conservative floor, roughly the population of the United States.
60-70%
of white-collar work hours are technically automatable today — McKinsey Global Institute
92M
roles projected for displacement by 2030 — World Economic Forum Future of Jobs Report
11x
increase in US job cuts attributed to AI over two years — 54,836 documented AI-attributed cuts in 2025
80yrs
Industrial Revolution transition timeline vs. months for the current AI transition — the temporal dishonesty in historical job-creation comparisons
The Skill Bankruptcy by Reid Sterling
From the Book
The Skill Bankruptcy
The complete tactical framework: unbundle your domain expertise from deprecated execution work, rebuild the value proposition around what AI cannot do, and build the Solo Operator model from the wreckage.
Buy on Amazon
Frequently Asked Questions

What does the 300 million jobs figure actually mean?

Goldman Sachs' 2023 research found 300 million full-time jobs globally are exposed to automation by generative AI — meaning these roles contain tasks AI can perform. Sterling argues this is a conservative floor from an institution with incentives to understate. The implication: roles will be eliminated, substantially restructured, or significantly repriced.

Why do institutional estimates understate the disruption?

Institutional research organizations have structural incentives to avoid catastrophism. Their clients include the companies doing the displacing. Projecting extreme disruption damages client relationships and triggers regulatory attention. The result: published estimates tend toward caution, and every major technological disruption has historically been underestimated.

What has Dario Amodei said about AI and knowledge work?

Anthropic CEO Dario Amodei stated publicly that AI could eliminate a substantial majority of knowledge work tasks within a few years — not a few categories, the majority of knowledge work tasks across categories. Sterling treats this as a founder's honest assessment, noting the person building the technology is better positioned than anyone to project its trajectory.

Doesn't history show technology creates more jobs than it destroys?

The argument relies on Industrial and Digital Revolution precedents — transitions that unfolded over decades. The current AI transition is unfolding in quarters. Even if net employment stabilizes over a decade, individuals displaced during the transition face real, immediate income loss with no equivalent support. The comparison is temporally dishonest.

Where does the 300 million figure come from?

The Goldman Sachs Global Economics Research paper published March 2023: "The Potentially Large Effects of Artificial Intelligence on Economic Growth." It projects 300 million full-time jobs exposed to automation across 900 occupations, finding two-thirds of current jobs have some AI exposure and about a quarter face potential replacement.

Reid Sterling
Author & Solo Operator

Author of The Skill Bankruptcy, Obsolete By Noon, and Sorry, You're Not Broken. 4,000+ readers of The Tuesday Folder.

Get the Next One Before It's a Chapter.

The Tuesday Folder — weekly dispatches on AI, consulting, and the operator playbook.