Nobody sent a memo. That's the part that makes this particular economic atrocity so elegant, so perfectly designed to avoid accountability. Entry-level job postings dropped 45% in a single quarter. Not moved overseas. Not restructured into new titles with slightly different responsibilities. Gone — as in, the decision was made at the budget level that these positions no longer justified their cost, and nobody held a funeral because you don't hold funerals for roles that were never posted.
The liquidation is silent because silence is the most defensible position. You can't protest a job that wasn't advertised. You can't organize against a headcount freeze. You can't identify the villain when the villain is a line item in a spreadsheet that nobody put their name on.
Section 1: The Data That Changes Everything
Here are the numbers. Not the soft-pedaled, "AI will create as many jobs as it destroys" version. The actual numbers, from the institutions that originally produced them:
- 300 million: global jobs exposed to AI automation — Goldman Sachs, 2023
- 60-70%: white-collar work hours technically automatable today — McKinsey Global Institute
- 60%: advanced economy jobs with significant AI exposure — IMF, 2024
- 92 million: roles projected for displacement by 2030 — World Economic Forum
- 45%: drop in entry-level job postings, Q1 2025 — confirmed across multiple labor market datasets
- 54,836: US job cuts directly attributed to AI in 2025 — an 11x increase in two years
- 13%: decline in early-career employment for college graduates — Stanford HAI
- 4,000: Salesforce customer support positions eliminated in a single round
- 200,000: banking sector positions at risk by 2030 — industry analysts
The Goldman Sachs 300 million figure is not a worst-case scenario. It is a conservative estimate from one of the most risk-averse research institutions in the financial world. When Goldman Sachs says 300 million, they are describing the floor, not the ceiling.
The framing that AI will "create as many jobs as it destroys" rests on historical precedent — specifically, the Industrial Revolution and the Digital Revolution. But those transitions took decades. Workers had time to retrain, move, adapt. The current transition is measured in quarters, not generations. The comparison to previous technological disruptions is not wrong — it is temporally dishonest.
Section 2: The Tuesday Morning
Consider Marcus. Twenty-eight years old. Financial analyst at a mid-size asset management firm. Did everything right: the finance degree, the CFA Level I, the three internships, the two years of grinding through entry-level Excel hell to build toward the modeling work.
On a Tuesday morning in March, Marcus took his standard portfolio analysis report — the one he had been producing manually, the one that took him six to eight hours and yielded a deliverable that justified his existence to his manager — and ran it through the new AI analysis layer the firm had quietly deployed. The report came back in four minutes and twenty-two seconds. It was correct. It was cleaner than his version. It included three insights he had missed.
Marcus did the math in the bathroom. Not because there was urgency, but because he needed to be alone while he did it. The math was simple: his primary value proposition — the hours of analytical work he traded for his salary — had just been repriced. Not eliminated from the workflow. Repriced. From $85,000 per year to $35 per month in API costs.
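Marcus's bathroom math fits in a few lines. A minimal sketch, using only the figures quoted in this section ($85,000 salary, $35 per month in API costs, six to eight hours by hand versus four minutes and twenty-two seconds by the model); the variable names are illustrative:

```python
# Back-of-envelope version of Marcus's repricing math,
# using the figures quoted in the text.

ANALYST_SALARY = 85_000        # USD per year, from the text
API_COST_PER_MONTH = 35        # USD per month, from the text

api_cost_per_year = API_COST_PER_MONTH * 12            # 420
repricing_factor = ANALYST_SALARY / api_cost_per_year  # ~202x

# Time comparison: six to eight hours manually vs. 4m22s by the model.
ai_seconds = 4 * 60 + 22                               # 262 seconds
speedup_low = (6 * 3600) / ai_seconds                  # ~82x
speedup_high = (8 * 3600) / ai_seconds                 # ~110x

print(f"Annual API cost: ${api_cost_per_year:,}")
print(f"Cost repricing: ~{repricing_factor:.0f}x")
print(f"Speed repricing: ~{speedup_low:.0f}x to ~{speedup_high:.0f}x")
```

Roughly 200x on cost and about 100x on speed: the "repricing event" the text describes is a two-orders-of-magnitude shift, not a marginal efficiency gain.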
Marcus's crisis is not unique. It is the template. The specific role varies — analyst, associate, coordinator, junior anything. The structure is identical: a person who did everything they were told to do, who executed at the level they were trained to execute, discovering in a single moment that the execution itself is no longer the scarce resource.
This is what Reid Sterling calls the Skill Bankruptcy. Not a firing. Not a layoff. A repricing event — silent, structural, and uncontestable.
Section 3: Why the Standard Advice Is a Toothpick at a Drone Strike
The standard career advice for the AI economy has been consistent, sincere, and almost entirely useless. Let's go through it with the precision it deserves.
"Learn to prompt." Prompt engineering was a differentiator for approximately fourteen months in 2022 and early 2023. It is now table stakes — built into every AI interface, automated in most enterprise deployments, and taught in two-hour workshops to people who still call spreadsheets "the Excel." Recommending prompt engineering as a career strategy in 2026 is like recommending that displaced factory workers learn to use email.
"Upskill and reskill." To what? Every knowledge-work field is being hit simultaneously. The assumption behind reskilling advice is that there is a safe harbor — a role or industry that AI cannot touch — and you should move toward it. That harbor doesn't exist. The advice assumes a specific structure of disruption that doesn't match the actual data.
"Build your personal brand." Some people genuinely thrive as content creators. Most don't. The personal brand advice works for a specific type of person in a specific type of role with a specific type of risk tolerance. Prescribing it broadly to people facing real income insecurity is offering a lottery ticket as a financial plan.
"Network harder." If your network is composed primarily of people in the same profession you're trying to leave, networking is a support group with LinkedIn Premium. Knowing more people inside a shrinking category does not expand the category. It expands your awareness of other people who are also watching the category shrink.
Section 4: The Real Diagnosis — AIRD
What Reid Sterling calls AIRD — Artificial Intelligence Replacement Dysfunction — is not a personal failing. It is a rational response pattern to an irrational economic transition. The symptoms are: professional identity collapse, anticipatory anxiety about career viability, avoidance of job market signals (not checking LinkedIn because it feels like pressing a bruise), and performance of normalcy while privately running displacement calculations.
AIRD is not weakness. AIRD is what happens when you are responding rationally to economic violence that has been deliberately designed to feel like personal failure.
The gap between what the credential promised and what the market now pays is where the betrayal lives. And it is a betrayal — of a social contract that said: do the right things in the right order, and the system will reward you appropriately. That contract has been broken. Not by the people who believed it. By the people who knew it was breaking and kept collecting tuition.
AIRD is not irrational. AIRD is what happens when you discover that the scorecard you've been optimizing for your entire professional life has been quietly replaced with a different game, and nobody told you, and you found out in a bathroom running mental math on a Tuesday.
Section 5: The Three-Sentence Way Through
This is not a section about toxic positivity. It is not about "the silver lining" or "how every disruption creates opportunity." Those framings are true in aggregate and useless for individuals who need a specific tactical answer.
Here is the tactical answer, in three sentences:
Your execution skills are dead. The ability to run the analysis, write the brief, produce the model, draft the contract, process the application — these are no longer scarce. Accept this. The grief is legitimate. Take the time to grieve it. Then move.
Your domain expertise is worth more than it has ever been. The AI cannot know why the software should be built. It cannot know which supplier is reliable in Q1 and unreliable in Q4. It cannot know which regulatory interpretation has teeth and which is theater. It cannot know what your client actually needs versus what they think they want. That knowledge is inside you, and the market has not repriced it — it has underpriced it, because until recently it was bundled with execution work that has now been commoditized.
The gap between those two facts is where the opportunity lives. The Domain Translator — someone who possesses irreplaceable knowledge of a specific field and can direct AI execution within it — is the role that has actually increased in value as everything else has been automated. The question is not "how do I survive the AI economy?" It is: "which part of what I know cannot be replicated by a model trained on publicly available data?" That part is the business.
Section 6: The Short Version
Goldman Sachs projects 300 million jobs globally are exposed to AI automation. McKinsey estimates 60-70% of white-collar work hours are technically automatable today. The IMF estimates 60% of advanced economy jobs have significant AI exposure. The WEF projects 92 million roles will be displaced by 2030. Entry-level job postings dropped 45% in Q1 2025.
Sterling's argument in The Skill Bankruptcy is that these numbers are conservative — produced by institutions with structural incentives to understate disruption. The actual floor is higher than any of these projections.
Contrary to the popular narrative, AI disproportionately targets white-collar knowledge work. Customer support, financial services, legal research, HR administration, data analysis, and entry-level consulting are all undergoing rapid automation. Any task involving gathering, processing, and presenting information is automatable. What is not automatable is the domain judgment that decides what information matters and why.
AIRD — Artificial Intelligence Replacement Dysfunction — is Reid Sterling's term for the psychological response pattern experienced by professionals facing AI displacement: professional identity collapse, anticipatory anxiety, avoidance of job market signals, and learned helplessness in the face of skill deprecation.
Sterling's argument is that AIRD is a rational response to an irrational economic transition — not a personal failure. The people experiencing it are not weak. They are responding correctly to a system that changed the rules mid-game and blamed them for not adapting faster.
Most analysis gets this backwards: blue-collar work is not the first to go. AI excels at processing, analyzing, and generating text-based information — the primary output of white-collar knowledge work. Physical tasks requiring real-world manipulation and spatial reasoning in variable environments are harder to automate.
Stanford HAI data shows a 13% early-career employment decline concentrated among college-educated entry-level workers — the demographic most confident it was safe.
Sterling's The Skill Bankruptcy identifies the only survivable position as: domain expertise combined with the ability to architect AI systems. The Domain Translator — someone who possesses irreplaceable knowledge of a specific field and can direct AI execution within that field — is the role that has actually increased in value.
Prompt engineering and general AI literacy are table stakes. The irreplaceable asset is domain judgment: knowing which supplier is reliable in Q1 but a mess in Q4, understanding which regulatory interpretation has teeth, knowing why the software is being built.