Education is one of the largest academic research enterprises in England, and yet in over 50 years, research into education has failed to find useful answers to many of the most basic questions about how we teach children.
The Department for Education, which recently published a proposed set of research priorities for schooling in England, is attempting to close the evidence gap. While this is a welcome endeavour, it raises questions about how far politicians should set the academic agenda.
There is still no definitive answer to whether any one type of school is better than any other, how long a school day should be, whether it matters who goes to school with whom, whether homework makes sense, or whether class size makes a difference.
Last year, it was thought that teaching assistants were an expensive waste of space; now there are indications that TAs can be deployed usefully. The situation is even worse than in epidemiology, with its vanishing breakthroughs and contradictory dietary advice.
Recently, I have conducted wide-ranging reviews of parental involvement in and expectations of education, and of student aspirations, attitudes and behaviour. On the question of whether any of these factors leads to improved attainment at school, research is largely silent.
After examining nearly 200,000 research reports, we didn’t find any that used rigorous evaluation designs, such as randomised controlled trials, to see if changing expectations or encouraging involvement from parents in their children’s education led to changes in attainment. Despite 50 or more years of publicly funded education research, there is still a lot to find out.
What’s the priority?
The Department for Education’s new research priorities include questions covering the curriculum, assessment, qualifications and pupil participation. Questions range from how ready schools are for the new national curriculum to how girls’ take-up of science, technology, engineering and maths subjects at A-level can be increased.
The proposals call for evidence of what works, largely based on measurement, and for the evidence to be engineered into a format that practitioners can read and use where appropriate. The proposals are also somewhat collaborative, inviting commentary and views in open consultation. In general, this is an excellent idea – long overdue.
In practice, of course, how well it works depends upon a number of factors, including whether it survives the next general election. Probably the two most important issues are whether the research priorities are well-judged and whether the quality of evidence sought and accepted is sufficient.
It’s worth examining two of the government’s 14 research priority documents here – on academies and on the pupil premium, the policy which pays extra money to schools for students from disadvantaged backgrounds.
The document on academies takes too much for granted. It merely assumes that academies are superior to other kinds of schools, and so distorts the research agenda into extended political activism.
It involves leading questions such as: “How are academies contributing to a wider system of school improvement?”, and “How do academy trusts use their new responsibilities and freedoms to improve the governance of academies?”.
This is typical of nearly all 14 purportedly consultative documents. Sometimes the unwarranted initial assumptions are multi-faceted, such as in the question “How can Dioceses best add value …?”.
Never mind the unproven assumptions about the superiority of academies and any added value provided by faith-based education; this question simply assumes that religious sectarianism is beneficial to learners. It is more likely to be harmful, but either way, surely we should find out.
The research document on the government’s pupil premium policy has more genuine and relevant research questions – perhaps because the policy here is more recent. These include how schools can spend the premium money most effectively and how to judge that effect, if there is any.
Another reason could be that the Education Endowment Foundation (EEF) has received public money to fund rigorous evaluations that address these questions, and the findings will slowly transform the charity’s pupil premium toolkit.
These are very promising developments, and it is to be hoped that the EEF can focus solely on uncovering the best bets about what works, and will be permitted by commentators to push political and image considerations to one side.
The end product should be a kind of à la carte menu for practitioners showing a wide range of possible targeted actions, their cost, their likely impact and an assessment of the strength of the evidence for each. This could permit the best kind of evidence-informed practice, where the best evidence is filtered into use in schools via the professional and local judgement of teachers and school leaders.
Will the evidence be suitable?
Education matters, and therefore research should matter. It is time to move beyond the mere perceptions of those involved (the largely fatuous target of most current research), and to drop the complex statistical dredging that no one understands (the equally pointless approach of much of the rest).
Instead, we need better designed studies which will lead to clearer results, assessing practical issues, so that in another 50 years the situation will be transformed. The publication of the Department for Education’s research priorities is a welcome step towards that, and its general approach document is brief but eminently sensible.
But even if the Department for Education did not want to go as far as asking “Do academies work?”, it could at least have asked: “Are we now sure that academies offer an advantage over other types of schools?”. Unfortunately, some kind of political lens seems to have disrupted this “what works” manifesto, producing a set of generally less urgent research questions in many areas.