Yesterday I had to miss the debate on meta-analyses at #rED18, but I did read the post by Robert Coe.
It’s true there has been quite a stir about Hattie and meta-analyses lately, and to me there are different aspects to the discussion.
I have noticed that when effect sizes are presented in a different way, people can spot the complexity that a single average often obscures.
Consider how Hattie presents effect sizes in his infamous lists of effects, e.g.:
And compare that with this graph, taken from Dietrichson et al. (2017):
In this second example you can see the range of effects hidden behind each average effect size. It is still an abstraction of a more complex reality, but it invites interested readers to ask why the spread of effects for small-group instruction is so large, and it shows that while coaching and mentoring students can have a positive effect, the opposite outcome is also possible.
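The point is easy to demonstrate with numbers: two interventions can report similar-looking average effect sizes while the underlying studies disagree wildly, one even including negative results. A minimal sketch in Python, using invented effect sizes purely for illustration (these are not taken from Dietrichson et al. or any real meta-analysis):

```python
from statistics import mean, stdev

# Hypothetical effect sizes (Cohen's d) from individual studies of two
# interventions -- illustrative numbers only, not real meta-analytic data.
small_group = [0.05, 0.15, 0.40, 0.55, 0.85]   # wide spread of results
coaching    = [-0.20, 0.10, 0.35, 0.60, 0.15]  # includes a negative effect

for name, effects in [("small-group instruction", small_group),
                      ("coaching/mentoring", coaching)]:
    # The headline average hides the range and spread printed alongside it.
    print(f"{name}: mean d = {mean(effects):.2f}, "
          f"range = [{min(effects):.2f}, {max(effects):.2f}], "
          f"sd = {stdev(effects):.2f}")
```

A reader shown only "mean d = 0.20" for the second set would never guess that one study found a negative effect, which is exactly the information a graph of ranges preserves and a ranked list of averages discards.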
- Dietrichson, J., Bøg, M., Filges, T., & Klint Jørgensen, A. M. (2017). Academic interventions for elementary and middle school students with low socioeconomic status: A systematic review and meta-analysis. Review of Educational Research, 87(2), 243-282.