This is a very depressing study: Citations Before and After Successful and Failed Replications

To be clear: the study by Paul von Hippel isn't depressing. The results are. As we discuss in our book 'The Psychology of Great Teaching', psychology as a science has seen a huge replication crisis, but as Dirk Van Damme notes in his new report:

A very serious challenge in social science research is the so-called “replication crisis,” which indicates that the results of many scientific studies are difficult or impossible to reproduce. A recent study found huge differences in the interest in replicability of research findings across disciplines, with psychology in a far better position than sociology. The replication crisis in social sciences has hit educational research very hard. Already in 2014, an analysis of the top 100 education journals revealed that only 0.13% of the published papers were replications, far less than in any other field of social research. It looks difficult for educational research to adequately respond to the replication crisis.

I left out the last sentence of that quotation, as it refers to the study that is the subject of this post: “Moreover, it does not appear that replication failure much reduced the influence of nonreplicated findings.”

Or as von Hippel summarizes it himself:

Open-science reformers have encouraged scholars to conduct more replication studies, hoping that the results will help science to correct itself. If replications carry substantial weight in the research community, then successful replication studies should bolster confidence in the replicated findings, and failed replication studies should reduce confidence. The authors of replication studies should be recognized for their contribution.

Our results, though, suggest that many replication studies carry far less weight than advocates for scientific self-correction might hope. Although the OSC’s replication project has done a great deal to increase the recognition of the replication crisis in general, it has done far less to shape confidence in specific psychological findings. On average, we found that replication success or failure had little or no impact on citations of the replicated studies. In the vast majority of cases, the original article continued to be cited approximately as much after the replication attempt as it was before, and more than 95% of articles citing the original study failed to acknowledge that a replication had even been attempted.

Abstract of the study:

In principle, successful replications should enhance the credibility of scientific findings, and failed replications should reduce credibility. Yet it is unknown how replication typically affects the influence of research. We analyzed the citation history of 98 articles. Each was published by a selective psychology journal in 2008 and subjected to a replication attempt published in 2015. Relative to successful replications, failed replications reduced citations of replicated studies by only 5% to 9% on average, an amount that did not differ significantly from zero. Less than 3% of articles citing the original studies cited the replication attempt. It does not appear that replication failure much reduced the influence of nonreplicated findings in psychology. To increase the influence of replications, we recommend (a) requiring authors to cite replication studies alongside the individual findings and (b) enhancing reference databases and search engines to give higher priority to replication studies.
