“The curriculum is partial against applied subjects”

Lord Jim Knight, a Schools Minister in the last Labour government and now a leading voice on using digital technology in the delivery of public services, spoke to The Blueprint about how the school curriculum should keep up with technology, especially artificial intelligence…

How well do you think the education delivered by schools is meeting the needs of employers?

Not very well. Obviously employers have always wanted children to leave school able to read, write and do maths well, and those remain core, as do oracy and, now, digital skills.

I think the new government’s moves around oracy are welcome. We’ll see how they are manifested in the curriculum and assessment review.

I’m keen that we move digital skills on from hiding behind the pretence that computer science is the answer, to something with wider application. Once you move beyond that, I think the general thrust of the curriculum is overly abstract. It is weighed down with huge amounts of subject knowledge, and it is partial against creative and applied subjects. It is partial towards the purely academic.

When you look at the 2002 legislation, which sets out what should be in the national curriculum, it is pretty balanced. It includes the proper place for design and technology at Key Stage 4, and it includes the creative subjects.

But the accountability system then skewed the delivery of the curriculum, and the academy system has meant that schools have dropped important subjects like design and technology and deprioritised the creative subjects.

So we now have a curriculum that’s out of balance and which only serves the purely academic route. I’m sure it’s fine for employers who just want to recruit graduates from top universities. But I think it fails everybody else.

What do you think are the key skills that young people now need for work, especially in the fast-growing digital sector and in the realm of artificial intelligence?

I think it’s really important now that all children, whether or not they’re actually going into those high-tech careers, are really strong in basic skills around data analysis. And if you are going into those technical worlds, you need to be really strong with data.

They then need good critical thinking and creative problem-solving skills, and a good sense of how to go about analysing a problem, abstracting it where that’s appropriate, which in most cases it is. Then they need to work out how to calculate with that abstraction, which is probably quite mathematical, and use critical thinking to reflect on the outcome of the calculation and whether what they’ve come up with solves the problem. Now, that manifests itself in all sorts of applications, be it in engineering, in maths, or the sciences.

What is your view on how artificial intelligence could affect the delivery of technical education and the needs of industry over the next few years?

Artificial intelligence is as big a shift for the economy and society as the invention of the internet, and it will transform everything, in ways that are probably inconceivable now.

We’re seeing with traditional AI, with machine learning, pattern recognition, those sorts of things, that it can transform some jobs, help with some technical aspects, and take on the more boring, tedious work, and I think that will continue.

Generative AI then supercharges that and starts to create the opportunity, with a lot of risk attached, for us to have really high-powered assistants that can help us with calculation, coding and some of the basic functions really well.

As long as we are quick enough to understand where it might be hallucinating, where it might have bias in the data, where the ethical constraints might lie, then the opportunity for us to be able to advance really quickly is profound.

For example, during the rapid development of vaccines we saw that the ability to use simulated, artificially intelligent environments to work through the possibilities on vaccine development enabled us to accelerate that technology really rapidly, and obviously to the world’s benefit.

I think we will see more of that, and we’ll certainly need to make sure that we are skilled for it, that we have the computing power for it, and that we have the investment for it, with the ethical safeguards that we need.

How do you think it could affect those school leavers who start off in different sectors, probably doing the menial tasks that are going to be replaced by AI?

The danger is that, as an employer, you might think it’s better to deploy machines rather than humans to do the basic tasks, and then believe you can find the supervisory talent to monitor those machines, manage the system and do the higher-level thinking, without thinking through how we develop the future middle-level talent to do that human function.

So we’re going to have to design the right incentives carefully, so that we have entry-level employment for people who can build up the experience to become the managers of the machines we will then be deploying across the economy.

What do you think can be done by educators, the government and employers to bridge the gap between the skills and knowledge being taught in schools and what industry needs?

I think within schools, we need to do some urgent work to develop strategies around using AI for teaching. That in turn means building the competence of teachers around AI and seeing it as something that might be their friend and not their enemy.

We then need to ensure that we have the computing power needed to properly deploy this within schools, and that young people are starting to be trained to think about how to deploy it and how to adapt it.

We have some great open-source models now. For example, Meta’s Llama model is open source. Others are similarly publishing AI models that people can play with, tinker with, and start to do interesting things with.

Then, instead of worrying that children will be cheating by using AI models, schools should be thinking about the future students are going to have, where people are going to be collaborating with those machines, and embedding that collaboration in the way we teach and learn, to reflect the world of work that students are moving into.

This article was first published in The Blueprint, Baker Dearing’s newsletter for external stakeholders. To receive future editions of The Blueprint, register here.
