A guest post by Jeroen Janssen from Universiteit Utrecht:
According to mindset theory, pupils who have a growth mindset perform better than those with a fixed mindset. Students with a growth mindset believe that their qualities and abilities are not fixed but can improve through, for example, practice and effort. This theory has inspired educational interventions built on that principle.
Psychological Bulletin recently published a highly critical meta-analysis examining the effects of such interventions. The meta-analysis, by Macnamara and Burgoyne, shows that while the effect of these interventions on learning outcomes is prima facie significant, it is also very small (d = 0.05). Moreover, the methodological quality of the studies proved problematic: when only the highest-quality studies were analysed, the effect turned out to be even smaller and non-significant. Striking, and worrying, is the further finding that researchers with a financial interest in reporting positive findings also reported more positive results.
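To give a rough sense of how small d = 0.05 is, the sketch below converts it into a "probability of superiority" (the chance that a randomly chosen student from the intervention group outscores a randomly chosen control student) and checks the reported confidence intervals against zero. This is an illustrative calculation of my own, assuming normally distributed outcomes with equal variances; the function names are not from the paper.

```python
import math

def normal_cdf(x):
    # Standard normal CDF via the error function
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2)))

def probability_of_superiority(d):
    # P(random intervention student outscores random control student),
    # assuming normal outcomes with equal variance in both groups
    return normal_cdf(d / math.sqrt(2))

d = 0.05  # overall effect reported by Macnamara and Burgoyne
print(f"Probability of superiority for d = {d}: "
      f"{probability_of_superiority(d):.3f}")  # about 0.514, barely above chance

# A 95% CI that includes zero means the effect is not statistically
# significant at the 5% level, as in the two subset analyses:
for label, lo, hi in [("mindset-verified subset", -0.01, 0.10),
                      ("highest-quality subset", -0.06, 0.10)]:
    significant = not (lo <= 0 <= hi)
    print(f"{label}: CI [{lo}, {hi}] -> significant: {significant}")
```

In other words, at d = 0.05 the intervention student "wins" only about 51.4% of such pairwise comparisons, versus 50% for no effect at all, and both subset confidence intervals straddle zero.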
The abstract of the study:
According to mindset theory, students who believe their personal characteristics can change—that is, those who hold a growth mindset—will achieve more than students who believe their characteristics are fixed. Proponents of the theory have developed interventions to influence students’ mindsets, claiming that these interventions lead to large gains in academic achievement. Despite their popularity, the evidence for growth mindset intervention benefits has not been systematically evaluated considering both the quantity and quality of the evidence. Here, we provide such a review by (a) evaluating empirical studies’ adherence to a set of best practices essential for drawing causal conclusions and (b) conducting three meta-analyses. When examining all studies (63 studies, N = 97,672), we found major shortcomings in study design, analysis, and reporting, and suggestions of researcher and publication bias: Authors with a financial incentive to report positive findings published significantly larger effects than authors without this incentive. Across all studies, we observed a small overall effect: d¯ = 0.05, 95% CI = [0.02, 0.09], which was nonsignificant after correcting for potential publication bias. No theoretically meaningful moderators were significant. When examining only studies demonstrating the intervention influenced students’ mindsets as intended (13 studies, N = 18,355), the effect was nonsignificant: d¯ = 0.04, 95% CI = [−0.01, 0.10]. When examining the highest-quality evidence (6 studies, N = 13,571), the effect was nonsignificant: d¯ = 0.02, 95% CI = [−0.06, 0.10]. We conclude that apparent effects of growth mindset interventions on academic achievement are likely attributable to inadequate study design, reporting flaws, and bias.