We often say we want education to be evidence-informed. Quite right too: schools want to invest in approaches that genuinely help pupils learn. However, anyone who looks closely at how education research is produced quickly discovers how slow that process really is. That is not a secret, but a recent study shows with painful clarity just how long the delay can be in practice.
Taylor and colleagues reviewed 376 American impact studies, all of which were funded by the Institute of Education Sciences. Their question was simple: how long does it take before a school can reliably use the findings of an intervention? The answer: about eight years. Let me repeat that: eight years.
Not because researchers are doing nothing, but because every link in the chain takes time. Four years to run the study. At least another year to publish it. And then, on average, more than two and a half years for an independent quality check by a clearinghouse such as the What Works Clearinghouse (WWC).
These are US figures, and the WWC is known for being exceptionally rigorous in its reviews. Even so, the underlying dynamics are not uniquely American. The research process in many countries is not much faster: educational interventions take time to implement, publication systems are slow and selective, and replications are rare.
Education does not lend itself to fast science
The slowness is not a sign of inefficiency. It is a consequence of how education and research work. An intervention that you want to evaluate fairly must run in an authentic school context, with teachers who have full classes, timetables that do not always cooperate, and pupils who fall ill, move house or switch schools. You cannot complete a cleanly defined experiment in two months, as might be possible in other scientific fields.
And even if an intervention has been carefully studied, one study is never enough. Education is sensitive to context. What works in a school in Texas does not automatically work in a school in Finland or Singapore, which may have different curricula, pupil populations and teaching practices. That is why we need replications. Syntheses. Meta-analyses. And that pushes the horizon further and further away.
The real bottleneck lies between research and practice
What Taylor and colleagues demonstrate most clearly is the thinness of the bridge between research and practice. Researchers do their work. Schools have questions. And between the two lies a stretch of no man’s land where a great deal of knowledge gets stuck. Not because the research is wrong, but because there is no structure in place to collect it, assess it and translate it.
In England, that role is played by the Education Endowment Foundation; in the United States, by the What Works Clearinghouse. In many countries, such infrastructure does not exist, even though the need is clear: someone must help schools make sense of evidence, understand the conditions under which it is effective, and determine for whom it works best.
Practice always moves faster than science
Another tension often goes unnoticed. Research in education moves slowly, but schools must make decisions every day. And suppliers of programmes, tools and methods do not have to wait eight years. That makes education vulnerable to strong stories and quick fixes. Not because schools are naïve, but because they must act in real time.
That is why a robust knowledge infrastructure is crucial. Not to tell schools what to do, but to lower the noise level and make reliable information available more quickly.
Taylor and colleagues point to several promising ways forward, including shorter cycles of testing and adjustment, the use of preprints, registered reports, improved data standards, stronger collaboration between researchers and schools, and longer-term investment in independent quality assurance.
These are not technocratic tweaks, but ways of shortening the chain between research and practice without compromising quality. That is the real challenge: working faster while remaining trustworthy. And it will probably always be exactly that: a challenge.