The Calculator Moment
In the 1970s, schools banned calculators. They didn't destroy maths - they changed what maths meant. We're in that same moment with AI, except this time it's everything.
In the 1970s, when electronic calculators became cheap enough to appear in classrooms, the reaction was immediate. Schools banned them. Parents protested them. Teachers warned they would produce a generation that could not think mathematically. A 1975 survey found 72% of teachers and parents disapproved of calculators in schools. The debate consumed education policy for over a decade.
We know how it ended. Calculators did not destroy mathematical thinking. They changed what mathematical thinking meant. A child no longer needed to spend twenty minutes on long division. They could spend that time on the concepts long division was supposed to serve - proportional reasoning, algebraic thinking, problem decomposition. The tool freed the learner from the mechanical to engage with the meaningful.
But here's the part of that story that gets left out.
The Foundation Came First
The transition worked because children still learnt the mechanics before they picked up the calculator. They understood what the machine was doing. They could estimate whether an answer was reasonable. They could catch errors. They had the foundation, and the tool extended it.

The parents who said calculators would make children stupid were wrong. But parents who might have said "just give them calculators and skip the maths" would have been wrong too. The answer turned out to be both. Build the foundation, then give them the tool. A UK parliamentary debate in 2011 - decades after the panic - still came back to the same consensus: children need quick recall of number facts and consistent methods of calculation before calculators enter the picture.
We are in that calculator moment again. Except this time it is not just arithmetic. It is reading, writing, research, analysis, coding, creative work, communication, problem solving. Every cognitive task that AI can now perform competently.
The Same Panic, Wider Scope
The responses from schools look familiar. Some have banned AI tools outright. Others are cautiously experimenting. The Department for Education's guidance takes a middle path - it does not ban AI but requires human oversight for all use in schools and warns that most popular AI tools are restricted to users aged 18 and over.

Meanwhile, usage is surging regardless. A 2025 HEPI survey of UK undergraduates found 92% now use AI tools, up from 66% just a year earlier. Some 88% use AI for assessment preparation. The genie is not going back in the bottle.
So the question is not whether children will use these tools. They will. The question is whether they will have the foundation to use them well.
You Can Only Spot a Bad Answer If You Know What a Good One Looks Like
This is where the calculator analogy holds up surprisingly well. Research on AI and prompt engineering consistently finds the same thing: the quality of what you get out depends on the quality of what you put in. A 2024 paper in Frontiers in Education established that effective use of AI tools requires foundational competencies - comprehension, critical reasoning, and domain knowledge. Students without that foundation produce significantly weaker results when working with AI.

A 2025 systematic review confirmed it from the other direction. Large language models are highly sensitive to how you phrase things. The vocabulary, the structure, the specificity of what you ask - all of it changes the output. Creating a useful prompt about, say, the water cycle requires that you already know something about the water cycle. The tool amplifies what you bring to it.
The Raspberry Pi Foundation made the same argument about coding in 2025. AI can generate code, but it still takes a skilled person to know whether that code is good, safe, and doing what it should. Learning the basics builds what they call "epistemic agency" - the ability to ask how and why a system works, rather than just accepting what it produces.
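A small illustration makes the point concrete. The example below is my own, not the Foundation's: two Python functions for checking leap years that read almost identically. The first is the kind of plausible-looking code an AI assistant might produce; only someone who knows the actual calendar rule can tell which one is right.

```python
def leap_year_plausible(year: int) -> bool:
    """Looks correct, but omits the divisible-by-400 exception."""
    return year % 4 == 0 and year % 100 != 0


def leap_year_correct(year: int) -> bool:
    """Gregorian rule: every 4th year, except centuries,
    except centuries divisible by 400."""
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)


# The two agree on most years...
assert leap_year_plausible(2024) == leap_year_correct(2024) == True
assert leap_year_plausible(2023) == leap_year_correct(2023) == False

# ...and silently disagree on century years like 2000.
assert leap_year_plausible(2000) is False  # the subtle bug
assert leap_year_correct(2000) is True
```

Both functions run without error, and casual testing on ordinary years will never expose the difference. Catching it is not a reading skill; it is domain knowledge - exactly the epistemic agency the foundation is supposed to build.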
Pencils Before Prompts
None of this means AI should be kept out of children's lives. That ship has sailed. But it does mean the order matters.

A child who has spent time writing by hand, working through problems on paper, reading physical books - that child is building something AI cannot provide. They are building the internal model that lets them recognise when an AI answer is wrong, when it has missed the point, when it is confidently making something up. Without that foundation, AI is not a tool. It is an oracle, and you have no way of knowing whether the oracle is lying.
This is the thing the calculator debate eventually taught us. The tool is not the enemy. Skipping the foundation is. A child who understands multiplication can use a calculator to do more interesting maths. A child who understands how to structure an argument can use AI to research and draft more ambitious essays. The foundation does not become less important when the tools get better. It becomes more important, because the gap between what the tool can do and what you can verify widens.
The curriculum was designed for a 20th century industrial economy. That economy will not exist when today's primary school children reach adulthood. But the answer is not to throw out the foundations and hand over the tools. The answer, as it was with calculators, is to make sure the foundations come first - and then teach children to use the most powerful tools any generation has ever had access to. The parents who get this right will be the ones who insist on both.