Does successfully improving students’ achievement test scores lead to higher rates of national economic growth?

No, this isn’t an OECD- or PISA-bashing post, but a new study by Komatsu and Rappleye, which I found via @cbokhove, raises an important question: does successfully improving students’ achievement test scores lead to higher rates of national economic growth? This claim is based on research by Hanushek and Woessmann and is the basis for a lot of policy-influencing research and policy advice by e.g. PISA or the World Bank. But Komatsu and Rappleye now argue that this claim may rest on flawed statistics, as the abstract makes clear:

Several recent, highly influential comparative studies have made strong statistical claims that improvements on global learning assessments such as PISA will lead to higher GDP growth rates. These claims have provided the primary source of legitimation for policy reforms championed by leading international organisations, most notably the World Bank and OECD. To date there have been several critiques but these have been too limited to challenge the validity of the claims. The consequence is continued utilisation and citation of these strong claims, resulting in a growing aura of scientific truth and concrete policy reforms. In this piece we report findings from two original studies that invalidate these statistical claims. Our intent is to contribute to a more rigorous global discussion on education policy, as well as call attention to the fact that the new global policy regime is founded on flawed statistics.

They performed a replication on the same data Hanushek and Woessmann used, and their conclusion sounds rather damning:

Our primary purpose has been to report findings from two studies where our results invalidate H&W’s strong statistical claims that attempt to link student test scores and economic growth. In Study 1, we observed that the explanatory power of test scores was weak in subsequent periods: the relationship between scores and growth has been neither consistently strong nor strongly consistent. In Study 2, we observed that the relationship between changes in test scores in one period and changes in economic growth for subsequent periods were unclear at best, doubtful at worst. Combined, these two original studies do not simply challenge the key statistical claims advanced by H&W but invalidate them because they utilise the same sample, dataset and methods.
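
For readers unfamiliar with the method behind these claims: H&W’s argument rests on cross-country regressions of long-run GDP growth on mean test scores, and the replication asks whether that relationship stays stable across different time periods. Here is a minimal sketch of that kind of check, with invented data; the variable names and numbers are mine, not the authors’:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: one row per country (all values invented for illustration).
test_scores = rng.normal(500, 50, size=30)             # mean assessment score
growth_a = 0.02 * test_scores + rng.normal(0, 8, 30)   # GDP growth, period A
growth_b = rng.normal(10, 8, size=30)                  # GDP growth, period B

def slope_and_r2(x, y):
    """OLS slope and R^2 of y regressed on x."""
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return beta[1], 1 - resid.var() / y.var()

# The replication question: does the score-growth relationship hold up
# in later periods, or does its explanatory power collapse?
for name, growth in [("period A", growth_a), ("period B", growth_b)]:
    b, r2 = slope_and_r2(test_scores, growth)
    print(f"{name}: slope = {b:.3f}, R^2 = {r2:.2f}")
```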

But I have to agree with Christian Bokhove, in our tweet conversation about this article, that both scientists are quite firm in their statements. Still, I think this topic is too important not to debate further, as it is essential to present-day policy.

1 Comment

Filed under Education, Research, Review

A bit skeptical: Brain activity can be used to predict reading success up to 2 years in advance

This is a study that is both interesting and a bit… well, scary, and it makes me skeptical: by measuring brainwaves, it is possible to predict what a child’s reading level will be years in advance, according to research from Binghamton University, State University of New York. The longitudinal study of 75 children did try to measure a lot. Why am I skeptical? Well, the evidence is correlational rather than clearly causal. Also, the prediction is one thing, but what can we do with it? The researchers state that it could help to detect pupils who need extra help early, but again: isn’t that assuming we can alter something? There is also the question of how absolute the prediction is and whether this is the cheapest solution in the long run. Don’t get me wrong, my goal isn’t to undermine this study; these are just some thoughts.

From the press release:

Binghamton University researchers Sarah Laszlo and Mallory Stites measured the brain activity of children and then compared it to their report cards, their vocabulary and other signs of reading success two years later, as part of the National Science Foundation-funded Reading Brain Project. Laszlo and Stites used event-related potentials (ERPs) to determine that brain activity was different in children who showed reading success in later years than in children that did not.

“Your brain is what allows you to do everything, from math to designing buildings to making art,” said Laszlo, associate professor of psychology at Binghamton University. “If we look at what the brain is doing during reading, it is a really good predictor of how reading will develop.”

The children read a list of words silently to themselves. Every so often they would come across their own name, to make sure they were understanding the text and paying attention. Children who had better report cards tended to show different patterns of activity during both phonological (sound) and semantic (meaning) processing.

“Phonological processing is the ability to sound things out and semantic processing is knowing what words mean,” said Laszlo. “Like being able to link the word fish with a slimy creature that swims underwater.”

Other factors were included when measuring the reading success of students, such as their teachers, their parents’ encouragement, their age and the amount that they read at home.

“The thing that is really valuable about this is that once kids starting having trouble with reading, they start needing extra help, which can be hard and stigmatizing for the child and often not effective,” said Laszlo. “By using long-range predictions about success, we can give them the extra help they need before they fall behind.”

The team is currently working on a paper to record the findings from the first four years of this research. At the end of the fifth year, Laszlo and her team will look back to see what predictions can be made regarding ERPs and reading progress.

The paper, “Time will tell: A longitudinal investigation of brain-behavior relationships during reading development,” was published in Psychophysiology.

Abstract of the study:

ERPs are a powerful tool for the study of reading, as they are both temporally precise and functionally specific. These are essential characteristics for studying a process that unfolds rapidly and consists of multiple, interactive subprocesses. In work with adults, clear, specific models exist linking components of the ERP with individual subprocesses of reading including orthographic decoding, phonological processing, and semantic access (e.g., Grainger & Holcomb, 2009). The relationships between ERP components and reading subprocesses are less clear in development; here, we address two questions regarding these relationships. First, we ask whether there are ERP markers that predict future reading behaviors across a longitudinal year. Second, we ask whether any relationships observed between ERP components and reading behavior across time map onto the better-established relationships between ERPs and reading subprocesses in adults. To address these questions, we acquired ERPs from children engaging in a silent reading task and then, a year later, collected behavioral assessments of their reading ability. We find that ERPs collected in Year 1 do predict reading behaviors a year later. Further, we find that these relationships do conform, at least to some extent, to relationships between ERP components and reading subprocesses observed in adults, with, for example, N250 amplitude in Year 1 predicting phonological awareness in Year 2, and N400 amplitude in Year 1 predicting vocabulary in Year 2.
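
To make the kind of longitudinal prediction in the abstract concrete, here is a minimal sketch with simulated data. The amplitudes and effect sizes are my invention; the real study derives these from recorded waveforms and standardized reading assessments:

```python
import numpy as np

rng = np.random.default_rng(1)
n_children = 75  # matches the sample size mentioned above

# Invented Year-1 ERP amplitudes (in microvolts) and Year-2 outcomes.
n250 = rng.normal(-3.0, 1.0, n_children)   # N250: phonological processing
n400 = rng.normal(-5.0, 1.5, n_children)   # N400: semantic access
phon_awareness = 0.8 * n250 + rng.normal(0, 1, n_children)
vocabulary = 0.6 * n400 + rng.normal(0, 1, n_children)

def pearson_r(x, y):
    return np.corrcoef(x, y)[0, 1]

# Correlational prediction across the longitudinal year:
print(f"N250 (Y1) -> phonological awareness (Y2): r = {pearson_r(n250, phon_awareness):.2f}")
print(f"N400 (Y1) -> vocabulary (Y2): r = {pearson_r(n400, vocabulary):.2f}")
```

Note that, as with the study itself, nothing here is causal: a strong correlation tells you the Year-1 signal carries predictive information, not that changing it would change reading outcomes.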

1 Comment

Filed under Education, Research

A Continuum on Personalized Learning: First Draft

This is interesting, as Personalized Learning is all the hype but sometimes misses the substance and clarity that are needed.

Larry Cuban on School Reform and Classroom Practice

After visiting over three dozen teachers in 11 schools in Silicon Valley and hearing an earful about “personalized learning,” I drafted a continuum where I could locate all of the different versions of “personalized learning” I observed and have read about.

If readers have comments about what’s missing, what needs to be added or how I organized the continuum conceptually, I would surely appreciate hearing from you.

In 2016, when I visited Silicon Valley classrooms, schools and districts, many school administrators and teachers told me that they were personalizing learning. From the Summit network of charter schools to individual teachers at Los Altos and Mountain View High School where Bring Your Own Devices reigned to two Milpitas elementary schools that had upper-grade Learning Labs and rotated students through different stations in all grades, I heard the phrase often.

But I was puzzled by what I saw and heard. When asked…

View original post 2,116 more words

Leave a comment

Filed under Education

A downside of expertise? Skilled workers more prone to mistakes when interrupted

Expertise is great; it’s what makes the difference between a novice and an expert. But every wish comes with a curse, and even expertise may have a downside: according to this new study, highly trained workers in some occupations could actually be at risk of making errors when interrupted.

From the press release:

The reason: Experienced workers are generally faster at performing procedural tasks, meaning their actions are more closely spaced in time and thus more confusable when they attempt to recall where to resume a task after being interrupted.

“Suppose a nurse is interrupted while preparing to give a dose of medication and then must remember whether he or she administered the dose,” said Erik Altmann, lead investigator on the project. “The more experienced nurse will remember less accurately than a less-practiced nurse, other things being equal, if the more experienced nurse performs the steps involved in administering medication more quickly.”

That’s not to say skilled nurses should avoid giving medication, but only that high skill levels could be a risk factor for increased errors after interruptions and that experts who perform a task quickly and accurately have probably figured out strategies for keeping their place in a task, said Altmann, who collaborated with fellow professor Zach Hambrick.

Their study, funded by the Office of Naval Research, is published online in the Journal of Experimental Psychology: General.

For the experiment, 224 people performed two sessions of a computer-based procedural task on separate days. Participants were interrupted randomly by a simple typing task, after which they had to remember the last step they performed to select the correct step to perform next.

In the second session, people became faster, and on most measures, more accurate, Altmann said. After interruptions, however, they became less accurate, making more errors by resuming the task at the wrong spot.

“The faster things happen, the worse we remember them,” Altmann said, adding that when workers are interrupted in the middle of critical procedures, as in emergency rooms or intensive care units, they may benefit from training and equipment design that helps them remember where they left off.

Abstract of the study:

Positive effects of practice are ubiquitous in human performance, but a finding from memory research suggests that negative effects are possible also. The finding is that memory for items on a list depends on the time interval between item presentations. This finding predicts a negative effect of practice on procedural performance under conditions of task interruption. As steps of a procedure are performed more quickly, memory for past performance should become less accurate, increasing the rate of skipped or repeated steps after an interruption. We found this effect, with practice generally improving speed and accuracy, but impairing accuracy after interruptions. The results show that positive effects of practice can interact with architectural constraints on episodic memory to have negative effects on performance. In practical terms, the results suggest that practice can be a risk factor for procedural errors in task environments with a high incidence of task interruption.
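
The underlying logic is a temporal-distinctiveness argument: the more closely spaced the steps, the harder it is to tell the most recent one from its predecessor after a delay. A toy model makes the prediction concrete; this is my own simplification for illustration, not the model the authors fit:

```python
# Toy temporal-distinctiveness model: the memory trace of each completed
# step competes with its neighbour, with activation falling off as
# 1 / (time since the step). Faster performance shrinks the gap between
# steps, making the last step harder to single out after an interruption.

def p_correct_resume(step_interval, interruption_delay):
    """Luce-choice probability of recalling the *last* step rather than
    the one performed just before it, after an interruption."""
    act_last = 1.0 / interruption_delay
    act_prev = 1.0 / (interruption_delay + step_interval)
    return act_last / (act_last + act_prev)

for label, interval in [("novice, 4 s/step", 4.0), ("expert, 1 s/step", 1.0)]:
    p = p_correct_resume(interval, interruption_delay=15.0)
    print(f"{label}: P(resume at correct step) = {p:.2f}")
```

The expert comes out worse precisely because their steps sit closer together in time; the absolute numbers are meaningless, only the direction of the effect matters here.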

1 Comment

Filed under At work, Research

Funny on Sunday: once a teacher…

Always a teacher, even if you’re Hugh Jackman. “That awkward moment when Hugh Jackman remembers he taught you at school.”

Leave a comment

Filed under Funny

Ready for some debate? Study suggests too much structured knowledge can hurt creativity

Ok, this is a study that can stir some debate, imho. This study from the University of Toronto’s Rotman School of Management suggests that while structure organizes human activities and helps us understand the world with less effort, it can also be a killer of creativity. But there are some elements you have to take into account before jumping to conclusions: it’s not really about learning or creativity in education as such.

The study consists of three different experiments – one of them based on Lego bricks – and reaches this conclusion:

In three studies, the current research showed that individuals presented with a flat information structure were more creative compared to those presented with a hierarchical information structure. We define a flat information structure as a set of information that is presented without higher-order categories, and a hierarchical information structure as a set of information organized by higher-order categories. We found that the increased creativity in a flat information structure condition is due to an elevated level of cognitive flexibility. Additionally, exploratory analyses showed that participants in the flat information structure condition spent longer time on their tasks than those in the hierarchical information structure condition. Given that time spent could be a proxy for how cognitively persistent participants were on the task, this suggests that the absence of structure might also increase creativity through cognitive persistence.

If you want to know what a hierarchical versus a flat information structure means, check the picture in the original study; the sketch below gives the same idea in miniature.
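
A rough illustration in code form, with brick names I made up: the same six items, once grouped under higher-order categories and once as one unsorted set.

```python
# Hierarchical information structure: organized by higher-order categories.
hierarchical = {
    "red bricks":  ["red 2x4", "red 2x2"],
    "blue bricks": ["blue 2x4", "blue 1x2"],
    "plates":      ["green plate", "grey plate"],
}

# Flat information structure: the same items, no higher-order categories.
flat = ["blue 1x2", "green plate", "red 2x4", "grey plate", "red 2x2", "blue 2x4"]
```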

So people were more creative when the Lego bricks were presented in a flat information structure. Interesting, but to be frank, I think this press release makes a big jump from an experimental setting to real-life situations, e.g. in a company setting:

While most management research has supported the idea that giving structure to information makes it easier to cope with its complexity and boosts efficiency, the paper says that comes as a double-edged sword.

“A hierarchically organized information structure may also have a dark side,” warns Yeun Joon Kim, a PhD student who co-authored the paper with Chen-Bo Zhong, an associate professor of organizational behaviour and human resource management at the Rotman School.

The researchers showed in a series of experiments that participants displayed less creativity and cognitive flexibility when asked to complete tasks using categorized sets of information, compared to those asked to work with items that were not ordered in any special way. Those in the organized information group also spent less time on their tasks, suggesting reduced persistence, a key ingredient for creativity.

The researchers ran three experiments. In two, study participants were presented with a group of nouns that were either organized into neat categories or not, and then told to make as many sentences as they could with them.

The third experiment used LEGO® bricks. Participants were asked to make an alien out of a box of bricks organized by colour and shape or, in a scenario familiar to many parents, out of a box of unorganized bricks. Participants in the organized category were prohibited from dumping the bricks out onto a table.

The findings may have application for leaders of multi-disciplinary teams, which tend to show inconsistent rates of innovation, perhaps because team members may continue to organize their ideas according to functional similarity, area of their expertise, or discipline.

“We suggest people put their ideas randomly on a whiteboard and then think about some of their connections,” says Kim. Our tendency to categorize information, rather than efficiency itself, is what those working in creative industries need to be most on guard about, the researchers say.

Ehm, ok, in that last paragraph it’s almost as if the researchers are suggesting brainstorming, but we’ve known for a long time that brainstorming in itself is not the best way to get creative results. I do think this study is interesting in its own right, but we should be careful with the possible implications.

Abstract of the study:

Is structure good or bad for creativity? When it comes to organizing information, management scholars have long advocated for a hierarchical information structure (information organized around higher-order categories as opposed to a flat information structure where there is no higher-order category) to reduce complexity of information processing and increase efficiency of work. However, a hierarchical information structure can be a double-edged sword that may reduce creativity, defined as novel and useful combination of existing information. This is because a hierarchical information structure might obstruct combining information from distal conceptual categories. Thus, the current research investigates whether information structure influences creativity. We theorize that a hierarchical information structure, compared to a flat information structure, will reduce creativity because it reduces cognitive flexibility. Three experiments using a sentence construction task and a LEGO task supported our prediction.

1 Comment

Filed under Education, Research

Important new meta-analysis on the testing effect – with some surprises…

There is a new Best Evidence in Brief, and one of the studies the newsletter discusses is all about the effect of testing:

Olusola O. Adesope and colleagues conducted a meta-analysis to summarize the learning benefits of taking a practice test versus other forms of non-testing learning conditions, such as re-studying, practice, filler activities, or no presentation of the material.

Results from 272 independent effects from 188 separate experiments demonstrated that the use of practice tests is associated with a moderate, statistically significant weighted mean effect size compared to re-studying (+0.51) and a much larger weighted mean effect size (+0.93) when compared to filler or no activities.

In addition, the format, number, and frequency of practice tests make a difference for the learning benefits on a final test. Practice tests with a multiple-choice option have a larger weighted mean effect size (+0.70) than short-answer tests (+0.48). A single practice test prior to the final test is more effective than when students take several practice tests. However, the timing should be carefully considered. A gap of less than a day between the practice and final tests showed a smaller weighted effect size than when there is a gap of one to six days (+0.56 and +0.82, respectively).
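
For those wondering what a “weighted mean effect size” actually is: each study’s effect (here Hedges’ g) is weighted by its precision, typically the inverse of its sampling variance, so larger and more precise studies count more. A minimal fixed-effect sketch with invented numbers (the meta-analysis itself may well have used a random-effects model):

```python
import numpy as np

# Invented study-level effects (Hedges' g) and their sampling variances.
g = np.array([0.40, 0.65, 0.55, 0.30])
var = np.array([0.02, 0.05, 0.03, 0.04])

# Fixed-effect weighted mean: weight each study by 1/variance.
w = 1.0 / var
g_bar = np.sum(w * g) / np.sum(w)
se = np.sqrt(1.0 / np.sum(w))

print(f"weighted mean g = {g_bar:.2f} (SE = {se:.2f})")
```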

Are you as baffled by these results as I am? I checked the original article to find out more.

E.g., multiple-choice questions having a bigger effect size – while recalling is often harder than recognizing? Well, there are some studies that actually support the latter:

For instance, Kang et al. (2007) revealed that students who took a short-answer practice test outperformed students who took a multiple-choice practice test on the final test, regardless of whether the final test was short-answer or multiple-choice.

But:

On the other hand, C. D. Morris, Bransford, and Franks’s (1977) research on levels of processing suggests that retention is strongest when processing demands are less demanding. They reason that this is because less demanding retrieval practice activities allow participants to focus all of their cognitive energy on a simple task at hand, whereas deeper levels of processing require more cognitive energy and can distract participants from relevant aspects (C. D. Morris et al., 1977).

And looking at the meta-analysis, the second theory seems to be winning, as “the differences between multiple-choice and short-answer practice test formats did emerge: g = 0.70 and g = 0.48, respectively”. But it’s worth noting that the researchers do warn it’s not that simple:

…we found that multiple-choice testing was the most effective format; however, this should be interpreted with caution, since an educator’s decision to use any given format should be based on the content of the learning material and the expected learning outcomes. For example, multiple-choice tests may be especially useful for memorization and fact retention, while short-answer testing may require more higher order thinking skills that are useful for more conceptual and abstract learning content

And what about a single test being more effective than taking several practice tests? The meta-analysis does support this, but Adesope et al. can only guess why:

Thus, our findings suggest that although a single test prior to a final test may result in better performance, the timing of the test should be carefully considered. One plausible explanation is more time between the practice and final tests allows students to mentally recall and process information, leading to deeper learning. An alternative hypothesis is that multiple tests within a short time may result in test fatigue that affects performance, while retrieval practice over a distributed time period enables long-term storage.

I do think that this meta-analysis will invite other researchers to join the debate…

Abstract of the study:

The testing effect is a well-known concept referring to gains in learning and retention that can occur when students take a practice test on studied material before taking a final test on the same material. Research demonstrates that students who take practice tests often outperform students in nontesting learning conditions such as restudying, practice, filler activities, or no presentation of the material. However, evidence-based meta-analysis is needed to develop a comprehensive understanding of the conditions under which practice tests enhance or inhibit learning. This meta-analysis fills this gap by examining the effects of practice tests versus nontesting learning conditions. Results reveal that practice tests are more beneficial for learning than restudying and all other comparison conditions. Mean effect sizes were moderated by the features of practice tests, participant and study characteristics, outcome constructs, and methodological features of the studies. Findings may guide the use of practice tests to advance student learning, and inform students, teachers, researchers, and policymakers. This article concludes with the theoretical and practical implications of the meta-analysis.

1 Comment

Filed under Education, Research, Review

Some good news? “Critical thinking instruction in humanities reduces belief in pseudoscience”

Sometimes it’s quite difficult for (educational) mythbusters: what if you only make things worse? There are some studies – also mentioned in our book – that say so. But this new study – although with a very specific group and a rather small number of participants – at least suggests there are opportunities to battle pseudoscience…

From the press release:

A recent study by North Carolina State University researchers finds that teaching critical thinking skills in a humanities course significantly reduces student beliefs in “pseudoscience” that is unsupported by facts.

“Given the national discussion of ‘fake news,’ it’s clear that critical thinking – and classes that teach critical thinking – are more important than ever,” says Anne McLaughlin, an associate professor of psychology at NC State and co-author of a paper describing the work.

“Fundamentally, we wanted to assess how intentional you have to be when teaching students critical thinking,” says Alicia McGill, an assistant professor of history at NC State and co-author of the paper. “We also wanted to explore how humanities classes can play a role and whether one can assess the extent to which critical thinking instruction actually results in improved critical thinking by students.

“This may be especially timely, because humanities courses give students tools they can use to assess qualitative data and sort through political rhetoric,” McGill says. “Humanities also offer us historical and cultural perspective that allow us to put current events into context.”

For this study, the researchers worked with 117 students in three different classes. Fifty-nine students were enrolled in a psychology research methods course, which taught statistics and study design, but did not specifically address critical thinking. The other 58 students were enrolled in one of two courses on historical frauds and mysteries – one of which included honors students, many of whom were majors in science, engineering and mathematics disciplines.

The psychology class served as a control group. The two history courses incorporated instruction explicitly designed to cultivate critical thinking skills. For example, students in the history courses were taught how to identify logical fallacies – statements that violate logical arguments, such as non sequiturs.

At the beginning of the semester, students in all three courses took a baseline assessment of their beliefs in pseudoscientific claims. The assessment used a scale from 1 (“I don’t believe at all.”) to 7 (“I strongly believe.”).

Some of the topics in the assessment, such as belief in Atlantis, were later addressed in the “historical frauds” course. Other topics, such as the belief that 9/11 was an “inside job,” were never addressed in the course. This allowed the researchers to determine the extent to which changes in student beliefs stemmed from specific facts discussed in class, versus changes in a student’s critical thinking skills.

At the end of the semester, students took the pseudoscience assessment again.

The control group students did not change their beliefs – but students in both history courses had lower beliefs in pseudoscience by the end of the semester.

Students in the history course for honors students decreased the most in their pseudoscientific beliefs; on average, student beliefs dropped an entire point on the belief scale for topics covered in class, and by 0.5 points on topics not covered in class. There were similar, but less pronounced, changes in the non-honors course.
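
For what it’s worth, the analysis behind those numbers is a straightforward pre/post comparison per group and topic type. A minimal sketch of that bookkeeping, with invented ratings that mimic the reported drops:

```python
import numpy as np

rng = np.random.default_rng(2)

def mean_drop(pre, post):
    """Mean decrease on the 1-7 belief scale (positive = less belief)."""
    return float(np.mean(pre - post))

# Invented 1-7 belief ratings for an honors-course group of 20 students.
pre_covered = rng.uniform(3, 6, size=20)     # topics addressed in class
post_covered = pre_covered - 1.0             # ~1-point drop, as reported
pre_uncovered = rng.uniform(3, 6, size=20)   # topics never addressed
post_uncovered = pre_uncovered - 0.5         # ~0.5-point drop: transfer

print("covered topics:  ", round(mean_drop(pre_covered, post_covered), 2))
print("uncovered topics:", round(mean_drop(pre_uncovered, post_uncovered), 2))
```

The interesting comparison is the second line: a drop on topics never discussed in class is what suggests the critical thinking skills transferred, rather than students simply memorizing debunkings.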

“The change we see in these students is important, because beliefs are notoriously hard to change,” says McLaughlin. “And seeing students apply critical thinking skills to areas not covered in class is particularly significant and heartening.”

“It’s also important to note that these results stem from taking only one class,” McGill says. “Consistent efforts to teach critical thinking across multiple classes may well have more pronounced effects.

“This drives home the importance of teaching critical thinking, and the essential role that humanities can play in that process,” McGill says. “This is something that NC State is actively promoting as part of a universitywide focus on critical thinking development.”

The paper, “Explicitly teaching critical thinking skills in a history course,” was published March 20 in the journal Science & Education.

Abstract of the study:

Critical thinking skills are often assessed via student beliefs in non-scientific ways of thinking (e.g., pseudoscience). Courses aimed at reducing such beliefs have been studied in the STEM fields with the most successful focusing on skeptical thinking. However, critical thinking is not unique to the sciences; it is crucial in the humanities and to historical thinking and analysis. We investigated the effects of a history course on epistemically unwarranted beliefs in two class sections. Beliefs were measured pre- and post-semester. Beliefs declined for history students compared to a control class and the effect was strongest for the honors section. This study provides evidence that a humanities education engenders critical thinking. Further, there may be individual differences in ability or preparedness in developing such skills, suggesting different foci for critical thinking coursework.

1 Comment

Filed under Education, Media literacy, Myths, Research

Why do we accept so many insults in education?

The past few years I’ve seen it happen over and over again. You find yourself in a room full of teachers or principals listening to a speaker – usually from outside education – who, in a charismatic and funny way, insults the entire audience, calling them sheep, obsolete, redundant… You can see the same in often-shared videos, such as the videos by Ken Robinson or Prince EA. These videos don’t have to be even 50% correct (e.g. Robinson) or can even be completely wrong (e.g. Prince EA), but they are still massively shared, often by teachers and principals themselves.

This may partly be due to a self-critical spirit in which teachers themselves think education should do better. It could stem from their own discomfort about how certain things are going in education. True, sometimes it can be a good idea if someone from the outside looks at what you’re doing, but let this then be someone who goes beyond phraseology.

But… I see something else far too little. Much too little. I’m talking about teachers and principals who proudly share what they are doing. Who answer gurus that they don’t know what they are talking about, but that these gurus shouldn’t hesitate to come around for a few months to try teaching themselves. Who tell techno-freaks that their utopian image will be obsolete within two years and that they overlook the inequality hidden in their dreams, while education should help all children.

There are many wonderful things happening in education, enough to be proud of. There remains plenty of work to do – as in every field – so there is no reason to be complacent. But with a little more pride and self-awareness, we would perhaps be able to ensure that more people choose to become a teacher.

We won’t be able to solve the teacher shortage all by ourselves – the government also has work to do – but this is the part that we can do. Today, the Global Teacher Prize will be announced. It is a good initiative to put an emphasis on the great things education does. But let us all show that the winner is no exception.

1 Comment

Filed under Education

Funny on Sunday: The Four Horsemen of the Modern Apocalypse

No comment:

Leave a comment

Filed under Funny