2 relevant items in Best Evidence in Brief: no effect of mindfulness & effect sizes

There is a new Best Evidence in Brief (they have a blog now too) and this time it was too difficult to pick just one item, so I had to pick two.

The first one is on effect sizes in educational research:

What difference does it make?
We regularly quote effect sizes in Best Evidence in Brief as a measure of the impact of an intervention or approach. But what is the impact of a normal school year on children, and how much of that impact is due to the school? A study by Hans Luyten and colleagues, published in School Effectiveness and School Improvement, attempts to find out.
The study analyzed 3,500 students from 20 mostly independent (private) English primary schools on four different learning outcomes. These measures, part of the Interactive Computer Adaptive System (InCAS), were reading, general math, mental arithmetic, and developed ability, the last of which measures items such as vocabulary and non-verbal pattern recognition.
Children were measured on these outcomes from Years 1 to 6 (Kindergarten to 5th grade in the U.S.). Using a regression-discontinuity approach that exploited the discontinuity between the youngest students in one year and the oldest students in the year below, the researchers were able to identify the overall progress of the children and the extent to which it was attributable to the school itself.
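To make the design concrete, here is a minimal sketch of the regression-discontinuity idea on simulated data: children just either side of the school-entry cutoff are almost the same age, so the jump in scores at the cutoff estimates the effect of the extra year of schooling. The variable names and numbers below are my own illustrative assumptions, not the authors' code or data.

```python
# Illustrative sketch only (simulated data), not the study's actual analysis.
import numpy as np

rng = np.random.default_rng(0)
n = 2000
age_months = rng.uniform(-12, 12, n)              # age relative to the entry cutoff
in_higher_year = (age_months >= 0).astype(float)  # placed one year group higher

# Simulated score: smooth age trend plus a 0.5 SD jump at the cutoff.
score = 0.03 * age_months + 0.5 * in_higher_year + rng.normal(0, 1, n)

# Fit score ~ intercept + age + year-group indicator;
# the indicator's coefficient is the regression-discontinuity estimate.
X = np.column_stack([np.ones(n), age_months, in_higher_year])
coef, *_ = np.linalg.lstsq(X, score, rcond=None)
print(f"estimated effect of the extra school year: {coef[2]:.2f} SD")
```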
The results showed a declining impact of a school year as children got older. The effect size of Year 1 ranged from +1.18 for mental arithmetic to +0.80 for general math. By Year 6, effect sizes varied from +0.88 for general math to +0.49 for reading and developed ability.
The effect of schooling itself accounted for an average of between 23.5% and 43.4% of this impact across the four measures. Put another way, the effect size of schooling in Year 1 ranged from +0.55 for reading to +0.31 for developed ability. By Year 6, effect sizes had fallen to between +0.27 for general math and +0.08 for reading and developed ability.
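As a back-of-the-envelope illustration of how such a share translates into an effect size (the numbers here are made up for the example, not taken from the study):

```python
# Hypothetical arithmetic: a year's total progress of 1.00 SD, of which
# schooling accounts for 40% (within the 23.5%-43.4% range reported),
# implies a schooling-only effect size of 0.40 SD; the remainder reflects
# maturation and out-of-school learning.
total_effect_size = 1.00   # illustrative
schooling_share = 0.40     # illustrative
print(f"schooling-only effect size: {total_effect_size * schooling_share:.2f} SD")
```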
The researchers suggest that, when setting benchmarks for educational interventions, it is important to consider not only the phase of the educational career but also the specific outcome measure.

The second is on the effect of mindfulness in secondary education (check this post too):

Impact of secondary school mindfulness programs
Catherine Johnson and colleagues carried out a randomized controlled evaluation of a secondary school mindfulness program called “.b” (short for “Stop, Breathe and Be!”) to measure its impact on self-reported anxiety, depression, weight/shape concerns, well-being, and mindfulness.
Five hundred and fifty-five students in four secondary schools in South Australia participated (mean age = 13.44 years). Using a cluster (class-based) randomized controlled design, students were assigned to one of three conditions: the nine-week mindfulness curriculum, the same nine-week curriculum with parental involvement, or a business-as-usual control curriculum.
The evaluation found no differences between the mindfulness groups (with or without parental involvement) and the control group at post-intervention or at the six- and twelve-month follow-ups. The researchers conclude that further research is required to identify the optimal age, content, and length of programs delivering mindfulness to teenagers.
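For readers less familiar with the design, here is a minimal sketch of what cluster (class-based) randomization means: whole classes, rather than individual students, are assigned to a condition, so every student in a class shares the same arm. The class labels and arm names are illustrative assumptions, not taken from the study.

```python
# Illustrative sketch of cluster (class-based) randomization to three arms.
import random

random.seed(42)
classes = [f"class_{i:02d}" for i in range(1, 25)]   # hypothetical class labels
arms = ["mindfulness", "mindfulness_plus_parents", "control"]

random.shuffle(classes)
assignment = {cls: arms[i % len(arms)] for i, cls in enumerate(classes)}

for cls, arm in sorted(assignment.items()):
    print(cls, "->", arm)
```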
