The Economic Idea That Explains Why Schools Can't Be Automated

13 March 2026

Your hairdresser takes the same time to cut your hair as fifty years ago, but it costs ten times more. An obscure economic theory from the 1960s explains why - and what it tells us about AI in schools.
Your hairdresser takes the same amount of time to cut your hair as they did fifty years ago. No technology has changed that. And yet a haircut costs roughly ten times what it did in 1975. Why?

The answer is one of the most useful ideas in economics, and almost nobody outside academia has heard of it. It's called Baumol's Cost Disease, named after the economist William Baumol, who described it in the 1960s. The idea is simple. When technology makes some industries wildly more productive, the industries it can't touch get more expensive. Not because they got better. Because everything around them got cheaper.

Factories automated. Software scaled. Whole sectors learnt to do more with less. Wages rose across the board. And your hairdresser needed to be paid enough not to leave and do something else instead. The same thing happened to teaching, nursing, childcare, social work. Any job where a human being still needs to sit with another human being and do the work in real time.
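The mechanism is easy to see with a toy calculation. The numbers below are illustrative, not real data: assume the automatable sector's productivity (and hence wages) grows 3% a year, while a haircut still takes one hour of one person's time. The labour cost of the haircut rises in lockstep with wages, even though nothing about the haircut has changed.

```python
# Toy model of Baumol's cost disease (illustrative numbers, not real data).
# Two sectors: factory output per worker-hour grows 3% a year; a haircut
# stays at one hour of labour. Wages track factory productivity, so the
# labour cost of the haircut rises even though the haircut itself is
# exactly as "productive" as it was in year zero.

YEARS = 50
PRODUCTIVITY_GROWTH = 0.03  # assumed annual productivity gain in the automatable sector

wage = 1.0  # starting hourly wage, arbitrary units
for _ in range(YEARS):
    wage *= 1 + PRODUCTIVITY_GROWTH  # wages rise with the productive sector

haircut_cost = wage * 1.0  # the haircut still takes one hour of labour

print(f"Haircut cost after {YEARS} years: {haircut_cost:.1f}x the original")
# prints: Haircut cost after 50 years: 4.4x the original
```

In real terms that compounding alone roughly quadruples the price over fifty years; add general inflation on top and the tenfold nominal rise in the haircut example stops looking mysterious.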

The Numbers Are Stark

This isn't abstract theory. The Institute for Fiscal Studies reported in 2025 that childcare fees in England rose between 7.6% and 8.7% in a single year. Per-hour funding for three and four-year-olds is 22% lower in real terms than it was in 2017-18. Education spending has dropped from 5.6% of national income in 2010 to 4.1% in 2024. Schools are being asked to do more with less, but the work itself hasn't got any faster. A teacher still needs time to teach. A teaching assistant still needs time to sit with a struggling reader.

Healthcare tells the same story. A 2025 study across 23 OECD countries confirmed that both acute and long-term care costs are driven by Baumol's effect. Long-term care is hit hardest, because you simply cannot substitute a care worker with a machine. Not for the work that actually matters.

For two hundred years, this pattern was fairly predictable. The work that stubbornly needed a human was usually physical. You couldn't download a haircut. You couldn't automate a hug.

Then AI Changed the Question

AI is the first automation technology pushing into territory that used to be protected by its humanness. It's entering education, healthcare, legal advice, therapy, creative work. The jobs most people assumed would always need a person.

UNESCO noted in 2025 that while AI can now handle grading, compile teaching materials, and analyse student performance data, the core work of teaching still resists automation. A professor still needs roughly the same time to mentor a student as they did decades ago. The auxiliary tasks speed up. The human bit doesn't.

So does Baumol's idea still hold? It does. But the line has moved, and it's harder to see.

An AI tutor can probably explain long division better than most humans. It can generate practice questions, mark them instantly, give patient feedback at midnight. But the teacher who notices that a child stopped making eye contact three weeks ago and quietly asks if everything is alright at home - that's not a productivity problem. That's a different kind of work entirely.

Process Versus Judgment

PwC's 2025 AI Jobs Barometer draws a useful line between what they call "highly automatable" work and "highly augmentable" work. Automatable work is the process stuff - scheduling, data entry, summarising, formatting. Augmentable work is where AI supports human expertise and judgment but can't replace it. Both categories are growing, but the augmented roles are growing faster. And in the UK, workers in roles requiring AI skills are already earning an 11% wage premium.

Microsoft Research found something similar. About 80% of workers see individual tasks automated, not their whole role. AI is good at the routine and the predictable. It struggles with situations outside its training data, with common-sense reasoning, with reading the room. Managing people, exercising complex judgment, doing unpredictable physical work - these remain stubbornly human.

This matters for how we think about schools. A school isn't a content delivery system. If it were, AI would have replaced it already. What makes a school a school is the relational work. The pastoral care. The moment a child learns to share, to lose gracefully, to ask for help. None of that scales. None of it speeds up.

The Most Valuable Question

Baumol's Cost Disease used to be about sectors. Manufacturing got productive, so hairdressers got expensive. Now the split is happening inside individual jobs. Parts of a teacher's day are process - taking the register, writing reports, planning lessons from a scheme of work. AI is already making those faster. But the parts that involve reading a child, building trust, making a judgment call about when to push and when to back off - those are getting more valuable precisely because everything around them is getting cheaper.

The question used to be simple: can a machine do this job? Now it's more specific and more important. Which part of this work stubbornly needs a human, even when AI can do everything around it?

For parents, this reframes the anxiety around AI and education. The worry that AI will replace teachers misses the point. The process parts of teaching will probably shrink. But the human parts - the parts that make a school feel like a school - are exactly the work that Baumol predicted would become more expensive and more valued.

Next time your child tells you about their day, listen for which bits involved a screen and which bits involved a person who knew their name. The second category is where the irreplaceable work lives. And if Baumol was right, it's only going to matter more.