A downside of expertise? Skilled workers more prone to mistakes when interrupted

Expertise is great; it’s what makes the difference between a novice and an expert. But every gift comes with a curse, and even expertise may have a downside: according to this new study, highly trained workers in some occupations could actually be at greater risk of making errors when interrupted.

From the press release:

The reason: Experienced workers are generally faster at performing procedural tasks, meaning their actions are more closely spaced in time and thus more confusable when they attempt to recall where to resume a task after being interrupted.

“Suppose a nurse is interrupted while preparing to give a dose of medication and then must remember whether he or she administered the dose,” said Erik Altmann, lead investigator on the project. “The more experienced nurse will remember less accurately than a less-practiced nurse, other things being equal, if the more experienced nurse performs the steps involved in administering medication more quickly.”

That’s not to say skilled nurses should avoid giving medication, but only that high skill levels could be a risk factor for increased errors after interruptions and that experts who perform a task quickly and accurately have probably figured out strategies for keeping their place in a task, said Altmann, who collaborated with fellow professor Zach Hambrick.

Their study, funded by the Office of Naval Research, is published online in the Journal of Experimental Psychology: General.

For the experiment, 224 people performed two sessions of a computer-based procedural task on separate days. Participants were interrupted randomly by a simple typing task, after which they had to remember the last step they performed to select the correct step to perform next.

In the second session, people became faster, and on most measures, more accurate, Altmann said. After interruptions, however, they became less accurate, making more errors by resuming the task at the wrong spot.

“The faster things happen, the worse we remember them,” Altmann said, adding that when workers are interrupted in the middle of critical procedures, as in emergency rooms or intensive care units, they may benefit from training and equipment design that helps them remember where they left off.

Abstract of the study:

Positive effects of practice are ubiquitous in human performance, but a finding from memory research suggests that negative effects are possible also. The finding is that memory for items on a list depends on the time interval between item presentations. This finding predicts a negative effect of practice on procedural performance under conditions of task interruption. As steps of a procedure are performed more quickly, memory for past performance should become less accurate, increasing the rate of skipped or repeated steps after an interruption. We found this effect, with practice generally improving speed and accuracy, but impairing accuracy after interruptions. The results show that positive effects of practice can interact with architectural constraints on episodic memory to have negative effects on performance. In practical terms, the results suggest that practice can be a risk factor for procedural errors in task environments with a high incidence of task interruption.

Filed under At work, Research

Funny on Sunday: once a teacher…

Always a teacher, even if you’re Hugh Jackman. “That awkward moment when Hugh Jackman remembers he taught you at school.”

Filed under Funny

Ready for some debate? Study suggests too much structured knowledge can hurt creativity

Ok, this is a study that can stir some debate, imho. This study from the University of Toronto’s Rotman School of Management suggests that while structure organizes human activities and helps us understand the world with less effort, it can also be a killer of creativity. But there are some elements you have to take into account before jumping to conclusions: it’s not really about learning or creativity in education as such.

The study consists of three different experiments – one of them based on using Lego bricks – and reaches this conclusion:

In three studies, the current research showed that individuals presented with a flat information structure were more creative compared to those presented with a hierarchical information structure. We define a flat information structure as a set of information that is presented without higher-order categories, and a hierarchical information structure as a set of information organized by higher-order categories. We found that the increased creativity in a flat information structure condition is due to an elevated level of cognitive flexibility. Additionally, exploratory analyses showed that participants in the flat information structure condition spent longer time on their tasks than those in the hierarchical information structure condition. Given that time spent could be a proxy for how cognitively persistent participants were on the task, this suggests that the absence of structure might also increase creativity through cognitive persistence.

If you want to know what hierarchical information structure versus flat information structure means, just check this picture from the original study:

So people were more creative when the Lego bricks were given in a flat information structure. Interesting, but to be frank, I think there is a big jump from an experimental setting to real-life situations – e.g. a company setting – in this press release:

While most management research has supported the idea that giving structure to information makes it easier to cope with its complexity and boosts efficiency, the paper says that comes as a double-edged sword.

“A hierarchically organized information structure may also have a dark side,” warns Yeun Joon Kim, a PhD student who co-authored the paper with Chen-Bo Zhong, an associate professor of organizational behaviour and human resource management at the Rotman School.

The researchers showed in a series of experiments that participants displayed less creativity and cognitive flexibility when asked to complete tasks using categorized sets of information, compared to those asked to work with items that were not ordered in any special way. Those in the organized information group also spent less time on their tasks, suggesting reduced persistence, a key ingredient for creativity.

The researchers ran three experiments. In two, study participants were presented with a group of nouns that were either organized into neat categories or not, and then told to make as many sentences as they could with them.

The third experiment used LEGO® bricks. Participants were asked to make an alien out of a box of bricks organized by colour and shape or, in a scenario familiar to many parents, out of a box of unorganized bricks. Participants in the organized category were prohibited from dumping the bricks out onto a table.

The findings may have application for leaders of multi-disciplinary teams, which tend to show inconsistent rates of innovation, perhaps because team members may continue to organize their ideas according to functional similarity, area of their expertise, or discipline.

“We suggest people put their ideas randomly on a whiteboard and then think about some of their connections,” says Kim. Our tendency to categorize information, rather than efficiency itself, is what those working in creative industries need to be most on guard about, the researchers say.

Ehm, ok, in that last paragraph it’s almost as if the researchers are suggesting brainstorming, but we’ve known for a while that brainstorming in itself is not the best way to get creative results. I do think that this study is interesting in its own right, but that we should be careful with the possible implications.

Abstract of the study:

Is structure good or bad for creativity? When it comes to organizing information, management scholars have long advocated for a hierarchical information structure (information organized around higher-order categories as opposed to a flat information structure where there is no higher-order category) to reduce complexity of information processing and increase efficiency of work. However, a hierarchical information structure can be a double-edged sword that may reduce creativity, defined as novel and useful combination of existing information. This is because a hierarchical information structure might obstruct combining information from distal conceptual categories. Thus, the current research investigates whether information structure influences creativity. We theorize that a hierarchical information structure, compared to a flat information structure, will reduce creativity because it reduces cognitive flexibility. Three experiments using a sentence construction task and a LEGO task supported our prediction.

Filed under Education, Research

Important new meta-analysis on the testing effect – with some surprises…

There is a new Best Evidence in Brief and one of the studies the newsletter discusses is all about the effect of testing:

Olusola O. Adesope and colleagues conducted a meta-analysis to summarize the learning benefits of taking a practice test versus other forms of non-testing learning conditions, such as re-studying, practice, filler activities, or no presentation of the material.

Results from 272 independent effects from 188 separate experiments demonstrated that the use of practice tests is associated with a moderate, statistically significant weighted mean effect size compared to re-studying (+0.51) and a much larger weighted mean effect size (+0.93) when compared to filler or no activities.

In addition, the format, number, and frequency of practice tests make a difference for the learning benefits on a final test. Practice tests with a multiple-choice option have a larger weighted mean effect size (+0.70) than short-answer tests (+0.48). A single practice test prior to the final test is more effective than when students take several practice tests. However, the timing should be carefully considered. A gap of less than a day between the practice and final tests showed a smaller weighted effect size than when there is a gap of one to six days (+0.56 and +0.82, respectively).
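As an aside, for readers wondering how such a “weighted mean effect size” is computed: in a meta-analysis, each study’s effect size is typically weighted by the inverse of its sampling variance, so that more precise studies count more. A minimal sketch in Python – with made-up effect sizes and variances for illustration, not the actual data from Adesope et al.:

```python
# Inverse-variance weighted mean effect size, the basic fixed-effect
# meta-analytic estimate. The numbers below are invented illustrations,
# NOT values from the Adesope et al. meta-analysis.

def weighted_mean_effect(effects, variances):
    """Weight each study's effect size (e.g. Hedges' g) by the inverse
    of its sampling variance, then take the weighted average."""
    weights = [1.0 / v for v in variances]
    return sum(w * g for w, g in zip(weights, effects)) / sum(weights)

# Three hypothetical studies comparing practice tests with re-studying:
effects = [0.40, 0.55, 0.60]     # effect size per study
variances = [0.02, 0.04, 0.05]   # sampling variance per study

print(round(weighted_mean_effect(effects, variances), 2))  # → 0.48
```

Real meta-analyses usually go further (random-effects models add a between-study variance component to each weight), but the weighted averaging itself is this simple.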

Are you as baffled by these results as I am? I checked in the original article to find out more.

E.g. multiple-choice questions having a bigger effect size – while recalling an answer is usually harder than recognizing one? Well, there are some studies that actually support the intuition that the harder recall format should win:

For instance, Kang et al. (2007) revealed that students who took a short-answer practice test outperformed students who took a multiple-choice practice test on the final test, regardless of whether the final test was short-answer or multiple-choice.

But:

On the other hand, C. D. Morris, Bransford, and Franks’s (1977) research on levels of processing suggests that retention is strongest when processing demands are less demanding. They reason that this is because less demanding retrieval practice activities allow participants to focus all of their cognitive energy on a simple task at hand, whereas deeper levels of processing require more cognitive energy and can distract participants from relevant aspects (C. D. Morris et al., 1977).

And looking at the meta-analysis, the second theory seems to be winning, as “the differences between multiple-choice and short-answer practice test formats did emerge: g = 0.70 and g = 0.48, respectively”. But it’s worth noting that the researchers do warn it’s not that simple:

…we found that multiple-choice testing was the most effective format; however, this should be interpreted with caution, since an educator’s decision to use any given format should be based on the content of the learning material and the expected learning outcomes. For example, multiple-choice tests may be especially useful for memorization and fact retention, while short-answer testing may require more higher order thinking skills that are useful for more conceptual and abstract learning content

And what about a single test being more effective than taking several practice tests? The meta-analysis does support this, but Adesope et al. can only guess why:

Thus, our findings suggest that although a single test prior to a final test may result in better performance, the timing of the test should be carefully considered. One plausible explanation is more time between the practice and final tests allows students to mentally recall and process information, leading to deeper learning. An alternative hypothesis is that multiple tests within a short time may result in test fatigue that affects performance, while retrieval practice over a distributed time period enables long-term storage.

I do think that this meta-analysis will invite other researchers to join the debate…

Abstract of the study:

The testing effect is a well-known concept referring to gains in learning and retention that can occur when students take a practice test on studied material before taking a final test on the same material. Research demonstrates that students who take practice tests often outperform students in nontesting learning conditions such as restudying, practice, filler activities, or no presentation of the material. However, evidence-based meta-analysis is needed to develop a comprehensive understanding of the conditions under which practice tests enhance or inhibit learning. This meta-analysis fills this gap by examining the effects of practice tests versus nontesting learning conditions. Results reveal that practice tests are more beneficial for learning than restudying and all other comparison conditions. Mean effect sizes were moderated by the features of practice tests, participant and study characteristics, outcome constructs, and methodological features of the studies. Findings may guide the use of practice tests to advance student learning, and inform students, teachers, researchers, and policymakers. This article concludes with the theoretical and practical implications of the meta-analysis.

Filed under Education, Research, Review

Some good news? “Critical thinking instruction in humanities reduces belief in pseudoscience”

Sometimes it’s quite difficult for (educational) mythbusters: what if you only make things worse? There are some studies – also mentioned in our book – that say so. But this new study – although with a very specific group and a rather small number of participants – at least suggests there are opportunities to battle pseudoscience…

From the press release:

A recent study by North Carolina State University researchers finds that teaching critical thinking skills in a humanities course significantly reduces student beliefs in “pseudoscience” that is unsupported by facts.

“Given the national discussion of ‘fake news,’ it’s clear that critical thinking – and classes that teach critical thinking – are more important than ever,” says Anne McLaughlin, an associate professor of psychology at NC State and co-author of a paper describing the work.

“Fundamentally, we wanted to assess how intentional you have to be when teaching students critical thinking,” says Alicia McGill, an assistant professor of history at NC State and co-author of the paper. “We also wanted to explore how humanities classes can play a role and whether one can assess the extent to which critical thinking instruction actually results in improved critical thinking by students.

“This may be especially timely, because humanities courses give students tools they can use to assess qualitative data and sort through political rhetoric,” McGill says. “Humanities also offer us historical and cultural perspective that allow us to put current events into context.”

For this study, the researchers worked with 117 students in three different classes. Fifty-nine students were enrolled in a psychology research methods course, which taught statistics and study design, but did not specifically address critical thinking. The other 58 students were enrolled in one of two courses on historical frauds and mysteries – one of which included honors students, many of whom were majors in science, engineering and mathematics disciplines.

The psychology class served as a control group. The two history courses incorporated instruction explicitly designed to cultivate critical thinking skills. For example, students in the history courses were taught how to identify logical fallacies – statements that violate logical arguments, such as non sequiturs.

At the beginning of the semester, students in all three courses took a baseline assessment of their beliefs in pseudoscientific claims. The assessment used a scale from 1 (“I don’t believe at all.”) to 7 (“I strongly believe.”).

Some of the topics in the assessment, such as belief in Atlantis, were later addressed in the “historical frauds” course. Other topics, such as the belief that 9/11 was an “inside job,” were never addressed in the course. This allowed the researchers to determine the extent to which changes in student beliefs stemmed from specific facts discussed in class, versus changes in a student’s critical thinking skills.

At the end of the semester, students took the pseudoscience assessment again.

The control group students did not change their beliefs – but students in both history courses had lower beliefs in pseudoscience by the end of the semester.

Students in the history course for honors students decreased the most in their pseudoscientific beliefs; on average, student beliefs dropped an entire point on the belief scale for topics covered in class, and by 0.5 points on topics not covered in class. There were similar, but less pronounced, changes in the non-honors course.

“The change we see in these students is important, because beliefs are notoriously hard to change,” says McLaughlin. “And seeing students apply critical thinking skills to areas not covered in class is particularly significant and heartening.”

“It’s also important to note that these results stem from taking only one class,” McGill says. “Consistent efforts to teach critical thinking across multiple classes may well have more pronounced effects.

“This drives home the importance of teaching critical thinking, and the essential role that humanities can play in that process,” McGill says. “This is something that NC State is actively promoting as part of a universitywide focus on critical thinking development.”

The paper, “Explicitly teaching critical thinking skills in a history course,” was published March 20 in the journal Science & Education.

Abstract of the study:

Critical thinking skills are often assessed via student beliefs in non-scientific ways of thinking (e.g., pseudoscience). Courses aimed at reducing such beliefs have been studied in the STEM fields with the most successful focusing on skeptical thinking. However, critical thinking is not unique to the sciences; it is crucial in the humanities and to historical thinking and analysis. We investigated the effects of a history course on epistemically unwarranted beliefs in two class sections. Beliefs were measured pre- and post-semester. Beliefs declined for history students compared to a control class and the effect was strongest for the honors section. This study provides evidence that a humanities education engenders critical thinking. Further, there may be individual differences in ability or preparedness in developing such skills, suggesting different foci for critical thinking coursework.

Filed under Education, Media literacy, Myths, Research

Why do we accept so many insults in education?

The past few years I’ve seen it happen over and over again. You find yourself in a room full of teachers or principals listening to a speaker – usually from outside education – who, in a charismatic and funny way, insults the entire audience, calling them sheep, obsolete, redundant… You can see the same in widely shared videos, such as those by Ken Robinson or Prince EA. These videos may be barely 50% correct (e.g. Robinson) or even completely wrong (e.g. Prince EA), but they are still massively shared, often by teachers and principals themselves.

This may partly be due to a self-critical spirit in which teachers themselves think education should do better. It could stem from their own discomfort about how certain things happen in education. True, sometimes it can be a good idea to have someone from the outside look at what you’re doing, but let this then be someone who goes beyond phraseology.

But… I see something else far too little. Much too little. I’m talking about teachers and principals who proudly share what they are doing. Who answer gurus that they don’t know what they are talking about, and that these gurus should not hesitate to come around for a few months to try teaching themselves. Who tell techno freaks that their utopian image will be obsolete within two years and that they overlook the inequality hidden in their dreams, while education should help all children.

There are many wonderful things happening in education, enough to be proud of. Plenty of work remains to be done – as in every field – so there is no room for complacency. But with a little more pride and self-awareness, we would perhaps be able to ensure that more people choose to become a teacher.

We won’t be able to solve the teacher shortage all by ourselves – the government also has work to do – but it is the part that we can do. Today, the Global Teacher Prize will be announced. This is a good initiative to put an emphasis on the great things education does. But let us all show that the winner is no exception.

Filed under Education

Funny on Sunday: The Four Horsemen of the Modern Apocalypse

No comment:

Filed under Funny

Does phonics help or hinder comprehension?

Sometimes it seems the reading wars just can’t be solved, despite the many clear pieces of evidence such as mentioned in this great post.

thinkingreadingwritings

A recent TES article headlined “Call for researchers to highlight negative ‘side effects’ of methods like phonics” drew a predictable response. Though the article supplied not one piece of evidence to support the assertion that phonics had “negative side effects”, and despite the academic quoted having zero background or expertise in reading science, tweets and comments celebrated this damning of the barbaric practice of phonics in schools.

Both the article and the responses illustrate the strong prejudices that have to be overcome before early reading instruction is universally of sufficient quality to ensure that we really are a literate society – i.e. one in which all school leavers have good, not just functional or non-functional, reading and writing skills. But – does phonics help or hinder comprehension? Is it merely, as Michael Rosen and his followers have characterised it, “barking at print”? It seems to me that this question is at…

View original post 1,789 more words

Filed under Education

Interesting: the famous Milgram experiment has been successfully replicated

This topic has been fascinating me for quite a while now: are the insights from the famous Milgram experiment valid or not? I have been questioning this because there has been criticism lately:

…several scholars raised new criticisms of the research based on their analysis of the transcripts and audio from the original experiments, or on new simulations or partial replications of the experiments. These contemporary criticisms add to past critiques, profoundly undermining the credibility of the original research and the way it is usually interpreted. That Milgram’s studies had a mighty cultural and scholarly impact is not in dispute; the meaning of what he found most certainly is.

BPS Digest sums up the most important modern criticisms:

  • When a participant hesitated in applying electric shocks, the actor playing the role of experimenter was meant to stick to a script of four escalating verbal “prods”. In fact, he frequently improvised, inventing his own terms and means of persuasion. Gina Perry (author of Behind The Shock Machine) has said the experiment was more akin to an investigation of “bullying and coercion” than obedience.
  • A partial replication of the studies found that no participants actually gave in to the fourth and final prod, the only one that actually constituted a command. Analysis of Milgram’s transcripts similarly suggested that the experimenter prompts that were most like a command were rarely obeyed. A modern analogue of Milgram’s paradigm found that order-like prompts were ineffective compared with appeals to science, supporting the idea that people are not blindly obedient to authority but believe they are contributing to a worthy cause.
  • Milgram failed to fully debrief his participants immediately after they’d participated.
  • In an unpublished version of his paradigm, Milgram recruited pairs of people who knew each other to play the role of teacher and learner. In this case, disobedience rose to 85 per cent.
  • Many participants were sceptical about the reality of the supposed set-up. Restricting analysis to only those who truly believed the situation was real, disobedience rose to around 66 per cent.

But now there is a new – successful, but again partial – replication of the famous experiment; the research appears in the journal Social Psychological and Personality Science. Still, one could ask whether some of the earlier criticisms don’t also apply to the new replication – imho they do.

From the press release:

“Our objective was to examine how high a level of obedience we would encounter among residents of Poland,” write the authors. “It should be emphasized that tests in the Milgram paradigm have never been conducted in Central Europe. The unique history of the countries in the region made the issue of obedience towards authority seem exceptionally interesting to us.”

For those unfamiliar with the Milgram experiment, it tested people’s willingness to deliver electric shocks to another person when encouraged by an experimenter. While no shocks were actually delivered in any of the experiments, the participants believed them to be real. The Milgram experiments demonstrated that under certain conditions of pressure from authority, people are willing to carry out commands even when it may harm someone else.

“Upon learning about Milgram’s experiments, a vast majority of people claim that ‘I would never behave in such a manner,’” says Tomasz Grzyb, a social psychologist involved in the research. “Our study has, yet again, illustrated the tremendous power of the situation the subjects are confronted with and how easily they can agree to things which they find unpleasant.”

While ethical considerations prevented a full replication of the experiments, researchers created a similar set-up with lower “shock” levels to test the level of obedience of participants.

The researchers recruited 80 participants (40 men and 40 women), with an age range from 18 to 69, for the study. Participants had up to 10 buttons to press, each a higher “shock” level. The results show that the level of participants’ obedience towards instructions is similarly high to that of the original Milgram studies.

They found that 90% of the people were willing to go to the highest level in the experiment. In terms of differences in people’s willingness to deliver shocks to a man versus a woman, “It is worth remarking,” write the authors, “that although the number of people refusing to carry out the commands of the experimenter was three times greater when the student [the person receiving the “shock”] was a woman, the small sample size does not allow us to draw strong conclusions.”

In terms of how society has changed, Grzyb notes, “half a century after Milgram’s original research into obedience to authority, a striking majority of subjects are still willing to electrocute a helpless individual.”

Abstract of the study:

In spite of the over 50 years which have passed since the original experiments conducted by Stanley Milgram on obedience, these experiments are still considered a turning point in our thinking about the role of the situation in human behavior. While ethical considerations prevent a full replication of the experiments from being prepared, a certain picture of the level of obedience of participants can be drawn using the procedure proposed by Burger. In our experiment, we have expanded it by controlling for the sex of participants and of the learner. The results achieved show a level of participants’ obedience toward instructions similarly high to that of the original Milgram studies. Results regarding the influence of the sex of participants and of the “learner,” as well as of personality characteristics, do not allow us to unequivocally accept or reject the hypotheses offered.

Filed under Psychology, Research

How to get kids to eat more healthy in school?

When I went to school, there were still nuns who cooked dinner for us, and I remember one nun in particular who could look at you in a way that made sure you ate all of your vegetables. Well, that was one way of making sure we all ate more healthily at school. A Brigham Young University researcher is now trying to nail down how to get kids to eat more salad without that special look.

From the press release:

Thanks to a national initiative, salad bars are showing up in public schools across the country. Now a Brigham Young University researcher is trying to nail down how to get kids to eat from them.

BYU health sciences professor Lori Spruance studies the impact of salad bars in public schools and has found one helpful tip: teens are more likely to use salad bars if they’re exposed to good, old-fashioned marketing. Students at schools with higher salad bar marketing are nearly three times as likely to use them.

“Children and adolescents in the United States do not consume the nationally recommended levels of fruits and vegetables,” Spruance said. “Evidence suggests that salad bars in schools can make a big difference. Our goal is to get kids to use them.”

Some 4,800 salad bars have popped up in public schools around the country according to the Let’s Move Salad Bars to Schools initiative. About 50 percent of high school students have access to salad bars at schools, 39 percent of middle school kids and 31 percent of elementary school children.

Spruance’s study, published in Health Education and Behavior, followed the salad bar usage of students in 12 public schools in New Orleans. Spruance and coauthors from Tulane University administered surveys to the students and tracked the school environment through personal visits.

Not only did they find better marketing improved salad bar usage among secondary school students, but they also found female students use salad bars more often than male students, and children who prefer healthy foods use them more frequently.

“The value of a salad bar program depends on whether students actually use the salad bar,” Spruance said. “But few studies have examined how to make that happen more effectively.”

Some examples of successful salad bar marketing efforts included signage throughout the school promoting the salad bar, information in school publications and newsletters, and plugs for the salad bar on a school’s digital presence.

Spruance suggests that schools engage parents in their efforts to improve the school food environment–such as reaching out to parents through newsletters or parent teacher conferences. Of course, she says, offering healthy options at home makes the biggest difference.

“It takes a lot of effort and time, but most children and adolescents require repeated exposures to food before they will eat them on their own,” Spruance said. “If a child is being exposed to foods at home that are served at school, the child may be more likely to eat those fruits or vegetables at school.”

Spruance’s research builds off of previous studies that show students are more likely to use salad bars if they are included in the normal serving line.

Guess the nun with that special look is maybe still the best option…

Abstract of the study:

Background. Consumption levels of fruits and vegetables (F/V) among children/adolescents are low. Programs like school-based salad bars (SB) provide children/adolescents increased F/V access.

Aims. The purpose of this study was to examine the relationship between SB use and individual and school-level factors among elementary and secondary school students in New Orleans public schools.

Method. Twelve schools receiving SB units from the Let’s Move Salad Bars to Schools Campaign participated in this study. Self-reported data were collected from students (n = 1,012), administrators (n = 12), and food service staff (n = 37). School environmental data were obtained through direct observation. Generalized estimating equation regression methods were used to develop a multilevel model including both school-level (e.g., length of lunchtime, SB marketing, vending machines) and individual-level (e.g., sex, food preferences, nutrition knowledge) effects.

Results. Female students had higher odds of using the SB compared to males. Students with healthier food preferences had higher odds of using the SB than those who reported less healthy food preferences. Within the multilevel model for all students, only sex and healthy food preferences remained significant. In a multilevel model assessing secondary students only, student encouragement toward others for healthy eating and school-based SB marketing were significantly related to SB use.

Conclusions. Little research has examined factors related to school-based SB use. These findings suggest recommendations that may help improve student use of SBs. For example, increasing the promotion of SB, particularly in secondary schools, might encourage their use among students.

Filed under Education, Research