They appear frequently on this blog. John Hattie made them world-famous in education. Meta-analyses are often seen as the highest level of scientific evidence. They combine the results of dozens or even hundreds of studies and try to distil one clear answer: Does this approach work or not? For anyone looking for guidance in the jungle of educational research, that sounds very appealing. One overview, one conclusion, done.
But there are other questions we can ask. One of them is simple: what does such a conclusion actually mean for classroom practice?
A recent meta-review by Pellegrini and colleagues focused precisely on that issue. The researchers analysed 103 meta-analyses of school interventions and did not assess whether they were methodologically sound, but rather whether they were relevant, applicable, and understandable for teachers and policymakers.
Their findings are not an indictment of meta-analyses. For anyone hoping that I—or the authors—would bury them here: think again. Rather, this review is an invitation to look at them differently.
To begin with, scientists rarely design meta-analyses together with education professionals. In more than 80 per cent of the cases, there was no involvement of teachers or schools in formulating the research questions or interpreting the results. That is not surprising. Meta-analyses are usually academic products. But it does help explain why they often answer scientific questions (“is there an average effect?”) rather than practical ones (“when does this work, for whom, and under what conditions?”).
In addition, many meta-analyses lack precisely the information schools need to make decisions. They almost always report grade level and subject area, but rarely things such as:
- how much training teachers need
- what the programme costs
- how intensively it must be implemented
- in what school contexts it has been studied
In other words, we often know that something works on average, but not whether it fits the reality of a particular school. This is what I sometimes call the package insert of educational interventions.
The way articles present results is also mainly geared towards researchers. Forest plots and standardised effect sizes are familiar territory in academic journals, but difficult to interpret for non-specialists. Only a small proportion of meta-analyses translate effects into more accessible measures such as percentiles or learning gains. And only about half explicitly discuss what their findings might mean for practice.
That may sound like a shortcoming, but it can also be read differently. Meta-analyses are strong at what they were originally designed to do: identifying patterns across many studies. They show that effects vary, that context matters, and that “what works” rarely has a single simple answer. In that sense, they are more like maps than route instructions. They show the landscape, but not the exact path that every school should take.
The authors of the meta-review, therefore, do not argue for fewer meta-analyses but for stronger connections with practice. Researchers can achieve this by:
- reporting relevant contextual features more often
- analysing moderators that are meaningful for schools
- visualising results in more accessible ways
- being explicit about where uncertainty remains
In this way, meta-analyses can become a more powerful tool for evidence-informed education.
Meta-analyses are therefore not an endpoint, but a starting point. They can guide thinking and discussion, but they do not replace professional judgement. They do not say “do this”, but rather: “here are the patterns—now it is up to you to translate them into your own context.”
Those who read meta-analyses in this way—not as recipe books but as frameworks for thinking—are likely to get exactly what they can genuinely offer: a better understanding of what might work, and why it sometimes does not. That is not a weakness. It is the mark of a mature science operating in a complex educational landscape.