This week I received a bit of online criticism for writing this post on a new report by John Hattie. The comments were twofold. The first concerned the statistical complaints Hattie received last year; still, having followed that debate intensively at the time, I think the conclusions remained correct.
The second remark was interesting: Hattie wrote the new report for Pearson, a company with an agenda, and it would seem unlikely that Pearson would publish something that goes against that agenda. It’s a fair point, although when I read the report, I couldn’t see anything that couldn’t be supported by evidence. There are things in the report you can disagree with, and actually I do too. I’ve written a report for a think tank myself this year, and I was surprised how much liberty I was given. They published the report even though it didn’t suit their agenda (actually, it really didn’t at all).
But it’s fair to say you need to be careful, as a new study I found via Ben Goldacre shows: head-to-head randomized trials are mostly industry sponsored and almost always favor the industry sponsor, a case of ‘who finds me bread and cheese, it’s to his tune I dance’.
I do think it’s something we all need to be careful about: both as researchers, asking ‘who do we affiliate with?’, and as readers of reports, asking ourselves ‘who is paying?’
And this last question isn’t limited to commercial companies such as Pearson; it’s also important when looking at think tanks, even ones as big as the OECD. I’d also advise you to look at where something is published (journals can have an agenda too, or be rooted in a paradigm) and who is involved.
In fact: just be careful and critical with everything you read, this text included.