How to improve the chances of poor children at school


Overcoming educational disadvantage is not easy. Even before they enter the school gates for the first time, a variety of factors including inheritance, social class, parenting style and family income are related to children’s capacity to learn in formal education.

On average, children who are entitled to free school meals start school in England with lower scores in reading and maths than their peers. And this trend persists to the end of primary school. This poverty gap may even widen with age: at 16, the gap between children eligible for free school meals and their peers in achieving five or more high passes at GCSE or equivalent has widened to 26%.

Little progress

Recent attempts to address this issue do not seem to have been successful, as we set out with colleagues in a new collection of research in collaboration with the Demos think tank. Between 2000 and 2007, the difference in performance between children entitled to free school meals and their peers remained essentially unchanged in both maths and English.

The lack of progress reminds us that this is not only an educational issue. Schools and teachers cannot be held solely responsible for addressing the effects of income inequality or other social problems. Nor can we blame neglectful parents, or lack of aspiration among young people. It is at least partly a political issue, with political and economic solutions. Until it is solved, schools and educators must seek to address the educational inequality that currently exists, to enable their pupils to have as fair a chance of educational success as possible.

Where’s the evidence?

A recent reassuring trend has been the cross-party consensus about the need to create and use evidence in education policy and practice. A number of initiatives and institutions have been launched to support this work, such as the What Works centres – the Early Intervention Foundation and Education Endowment Foundation (EEF).

The Teaching and Learning Toolkit produced by the EEF and the Sutton Trust is one example of what has emerged. It provides accessible evidence on which educational interventions have been tested and shown to be effective in improving attainment, and roughly how much they cost. The toolkit is now consulted by about half of the schools in England when deciding how to spend their pupil premium funding – additional money given to schools for each child entitled to free school meals.

Not such a Sure Start

But knowing what has been found to work in the past, and in well-funded and controlled trials, is only part of the solution. For example, the early education initiative Sure Start was based on evidence indicating that, on average, early years interventions have tended to be successful in improving educational outcomes for disadvantaged children.

But Sure Start has not been successful in closing the poverty gap for young children. One reason is that it can be difficult to roll out initiatives at a larger scale, often with less funding per person. When projects like this expand, the impact can dissipate as the intervention involves “conscripted” participants rather than just the volunteers taking part in the initial trial.

It may also be that the typical Sure Start intervention lacked the key ingredients that produced the effects found in the initial research. Or it is possible that those who designed the policy looked at the most successful examples of early years intervention and assumed – or hoped – that their results would simply be replicated.

Good bets, but not the only answer

Although some ideas are typically effective – such as providing feedback to pupils or developing their skills and confidence in planning, monitoring and evaluating their own learning – they do not always have these effects. They are “good bets” on average, but they also have a wide spread of impact, including negative, harmful results in some cases.

More than 90 large-scale randomised trials commissioned by the EEF should add to the evidence in the toolkit and help us understand how to scale up successful interventions. We should not underestimate the challenge here. It will be important to try to improve approaches which are less successful, as well as to replicate those which have been shown to work.

A recent example concerns the contribution of teaching assistants. The evidence a few years ago indicated that, on average, they made very little difference to the attainment of the pupils in the classes they supported. Recent evidence from trials, including two funded by the EEF, showed that where teaching assistants are trained and supported to provide intensive support to pupils in small groups or one-to-one, pupils can make an additional three months' progress in reading or mathematics. If this knowledge could be applied across the country, the benefit would be considerable.

Evidence of what has worked in other contexts is a necessary condition for reform. Our best guesses are not good enough. But such evidence on its own is not sufficient either. Those in charge of policy still tend to cherry-pick evidence to suit an agenda.

We must not fall into the trap of generalising from success stories alone: we need to take a cold, hard look at the evidence of what has and has not been successful. We need to apply this evidence critically and rigorously across the system, evaluating initiatives as they are scaled up, to ensure that we improve educational outcomes for pupils in our schools currently disadvantaged by their economic circumstances.