Category Archives: Technology

Will the effect last? Math on a tablet helps low-performing second graders… for a while

There is a new Best Evidence in Brief which includes, among others, this study showing how important it can be to check the lasting effects of what you do.

Published in the Journal of Educational Psychology, Martin Hassler and colleagues carried out a randomized controlled trial of a mathematics intervention on tablets (iPads).
The trial involved 283 low-performing second graders spread across 27 urban schools in Sweden. The children were randomized to four groups:
  • A math intervention called Chasing Planets, consisting of 261 planets on a space map, each with a unique math exercise (addition or subtraction up to 12). Students practiced for 20 minutes a day.
  • The math intervention combined with working memory training, where students spent an additional 10 minutes each day on working memory tasks.
  • A placebo group who practiced mostly reading tasks on the tablet (again for 20 minutes each day), including Chasing Planets-Reading, which had a similar format to the math intervention.
  • A control group who received no intervention, not even on improving their skills on the tablets.
The intervention lasted for around 20 weeks, with children completing nine measures at pre- and post-test, and then after 6 and 12 months.
Both math conditions scored significantly higher (effect size = +0.53 to +0.67) than the control and placebo groups on the post-test of basic arithmetic, but not on measures of arithmetic transfer or problem solving. There was no additional benefit of the working memory training. The effects faded at the 6-month follow-up (effect size = +0.18 to +0.28) and even more so after 12 months (effect size = +0.03 to +0.13).
IQ was a significant moderator of direct and long-term effects, such that children with lower IQ benefited more than higher IQ students. Socioeconomic factors did not moderate outcomes.

1 Comment

Filed under Education, Research, Technology

Good, short video on why algorithms aren’t objective at all: “an algorithm is an opinion embedded in math”

1 Comment

Filed under Media literacy, Technology

New meta-analysis begs: Don’t throw away your printed books in education

I just found a new meta-analysis, soon to be published, in which Pablo Delgado, Cristina Vargas, Rakefet Ackerman & Ladislao Salmerón examine the effects of reading media on reading comprehension. Well, the title gives away the conclusion, I guess.

But this is the longer version:

The results of the two meta-analyses in the present study yield a clear picture of screen inferiority, with lower reading comprehension outcomes for digital texts compared to printed texts, which corroborates and extends previous research (Kong et al., 2018; Singer & Alexander, 2017b; Wang et al., 2007). These results were consistent across methodologies and theoretical frameworks.

And while the effects are relatively low, the researchers do warn:

Although the effect sizes found for media (-.21) are small according to Cohen’s guidelines (1988), it is important to interpret this effect size in the context of reading comprehension studies. During elementary school, it is estimated that yearly growth in reading comprehension is .32 (ranging from .55 in grade 1, to .08 in grade 6) (Luyten, Merrel & Tymms, 2017). Intervention studies on reading comprehension yield a mean effect of .45 (Scammacca et al., 2015). Thus, the effects of media are relevant in the educational context because they represent approximately 2/3 of the yearly growth in comprehension in elementary school, and 1/2 of the effect of remedial interventions.
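To see where that "2/3" and "1/2" come from, the ratios can be checked directly. The three effect sizes below are taken from the quote above; treating standardized effect sizes as directly comparable across studies is the meta-analysts' framing, not mine:

```python
# Putting the reported media effect size in context
# (all three numbers come from the quoted passage).
media_effect = 0.21          # |g| for screens vs. paper
yearly_growth = 0.32         # mean yearly growth in comprehension, elementary school
intervention_effect = 0.45   # mean effect of reading-comprehension interventions

print(round(media_effect / yearly_growth, 2))        # ~0.66, i.e. roughly 2/3
print(round(media_effect / intervention_effect, 2))  # ~0.47, i.e. roughly 1/2
```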

The analysis also has some clear practical consequences:

A relevant moderator found for the screen inferiority effect was time frame. This finding sheds new light on the mixed results in the existing literature. Consistent with the findings by Ackerman and Lauterman (2012) with lengthy texts, mentioned above, Sidi et al. (2017) found that even when performing tasks involving reading only brief texts and no scrolling (solving challenging logic problems presented in an average of 77 words), digital-based environments harm performance under time pressure conditions, but not under a loose time frame. In addition, they found a similar screen inferiority when solving problems under time pressure and under free time allocation, but framing the task as preliminary rather than central. Thus, the harmful effect of limited time on digital-based work is not limited to reading lengthy texts. Moreover, consistently across studies, Ackerman and colleagues found that people suffer from greater overconfidence in digital-based reading than in paper-based reading under these conditions that warrant shallow processing.

Our findings call to extend existing theories about self-regulated learning (see Boekaerts, 2017, for a review). Effects of time frames on self-regulated learning have been discussed from various theoretical approaches. First, a metacognitive explanation suggests that time pressure encourages compromise in reaching learning objectives (Thiede & Dunlosky, 1999). Second, time pressure has been associated with cognitive load. Some studies found that time pressure increased cognitive load and harmed performance (Barrouillet, Bernardin, Portrat, Vergauwe, & Camos, 2007). However, others suggested that it can generate a germane (“good”) cognitive load by increasing task engagement (Gerjets & Scheiter, 2003). In these theoretical discussions, the potential effect of the medium in which the study is conducted has been overlooked. We see the robust finding in the present meta-analyses about the interaction between the time frame and the medium as a call to theorists to integrate the processing style adapted by learners in specific study environments into their theories.

What I really appreciate is that the researchers also checked for publication bias, and, good news, the different indicators used suggested no risk of publication bias.

There is only a small bit of irony… I read the study online, and you are reading this online too.

Abstract of the meta-analysis:

With the increasing dominance of digital reading over paper reading, gaining understanding of the effects of the medium on reading comprehension has become critical. However, results from research comparing learning outcomes across printed and digital media are mixed, making conclusions difficult to reach. In the current meta-analysis, we examined research in recent years (2000-2017), comparing the reading of comparable texts on paper and on digital devices. We included studies with between-participant (n = 38) and within-participant designs (n = 16) involving 171,055 participants. Both designs yielded the same advantage of paper over digital reading (Hedge’s g = -.21; dc = -.21). Analyses revealed three significant moderators: (1) time frame: the paper-based reading advantage increased in time-constrained reading compared to self-paced reading; (2) text genre: the paper-based reading advantage was consistent across studies using informational texts, or a mix of informational and narrative texts, but not on those using only narrative texts; (3) publication year: the advantage of paper-based reading increased over the years. Theoretical and educational implications are discussed.

Leave a comment

Filed under Education, Research, Review, Technology

Oh, I missed the new Hype Cycle for education. Well…

It’s a yearly tradition for Gartner to publish a string of hype cycles, including one for education in July. And I admit: I didn’t pay attention to it.

So, there is a new one, but besides the many issues one can have with this company's hype cycle, I do think this edition is pretty bland, as if anybody with a bit of knowledge about EdTech could have written it.

  • On the Rise
    • AV Over IP in Education
    • Social CRM: Education
    • Li-Fi
    • Emotion AI
    • Virtual Reality/Augmented Reality Applications in Education
  • At the Peak
    • Blockchain in Education
    • Artificial Intelligence Education Applications
    • Design Thinking
    • Exostructure Strategy
    • Classroom 3D Printing
    • Digital Assessment
    • SaaS SIS
  • Sliding Into the Trough
    • Education Analytics
    • Competency-Based Education Platforms
    • Bluetooth Beacons
    • Semantic Knowledge Graphing
    • Citizen Developers
    • Digital Credentials
    • Alumni CRM
    • Master Data Management
    • Adaptive Learning Platforms
  • Climbing the Slope
    • Student Retention CRM
    • IDaaS
    • Enterprise Video Content Management
  • Entering the Plateau
    • Integration Brokerage


Filed under Education, Technology, Trends

Do read this great little tweet tirade on #edtech predicting the future and cognitive science by Benjamin Riley (Deans for Impact)

When I read the first tweet of this thread by Benjamin Riley, I had the feeling we were on to something good. And Benjamin didn’t disappoint. I won’t make a habit of posting something like this on this blog, but I did want to share it here, as I know that many of my readers would otherwise miss it:

And thus the conclusion?


Filed under Education, Technology

What works and doesn’t work with instructional video, a new short overview

There is a special issue of Computers in Human Behavior on learning from video, and in their editorial, Fiorella and Mayer give an overview of the effective and ineffective methods trialed in the special issue:

What are the effective methods?

…two techniques that appear to improve learning outcomes with instructional video are segmenting—breaking the video into parts and allowing students to control the pace of the presentation—and mixed perspective—filming from both a first-person perspective and third-person.

And what isn’t worth the effort?

…some features that do not appear to be associated with improved learning outcomes with instructional video are matching the gender of the instructor to the gender of the learner, having the instructor’s face on the screen, inserting pauses throughout the video, and adding practice without feedback.

Abstract of the editorial:

In this commentary, we examine the papers in a special issue on “Developments and Trends in Learning with Instructional Video”. In particular, we focus on basic findings concerning which instructional features improve learning with instructional video (i.e., breaking the lesson into segments paced by the learner; recording from both first- and third-person perspectives) and which features or learner attributes do not (i.e., matching the instructor’s gender to the learner’s gender; having the instructor’s face on the screen; adding practice without feedback; inserting pauses throughout the video; and spatial ability). In addition, we offer recommendations for future work on designing effective video lessons.

Leave a comment

Filed under Education, Research, Review, Technology

Again: the bad effect of checking your phone in class

There has been quite some debate because of the French decision to ban mobile phones from schools from this school year on. While I’m also a bit critical – I think we should rather teach children how to deal with phones and focus – I do understand where this thinking stems from: studies like this one, which found a negative correlation between mobile phone use and grades.

From the press release:

Students perform less well in end-of-term exams if they are allowed access to an electronic device, such as a phone or tablet, for non-academic purposes in lectures, a new study in Educational Psychology finds.

Students who don’t use such devices themselves but attend lectures where their use is permitted also do worse, suggesting that phone/tablet use damages the group learning environment.

Researchers from Rutgers University in the US performed an in-class experiment to test whether dividing attention between electronic devices and the lecturer during the class affected students’ performance in within-lecture tests and an end-of-term exam.

118 cognitive psychology students at Rutgers University participated in the experiment during one term of their course. Laptops, phones and tablets were banned in half of the lectures and permitted in the other half. When devices were allowed, students were asked to record whether they had used them for non-academic purposes during the lecture.

The study found that having a device didn’t lower students’ scores in comprehension tests within lectures, but it did lower scores in the end-of-term exam by at least 5%, or half a grade. This finding shows for the first time that the main effect of divided attention in the classroom is on long-term retention, with fewer targets of a study task later remembered.

In addition, when the use of electronic devices was allowed in class, performance was also poorer for students who did not use devices as well as for those who did.

The study’s lead author, Professor Arnold Glass, added: “These findings should alert the many dedicated students and instructors that dividing attention is having an insidious effect that is impairing their exam performance and final grade.

“To help manage the use of devices in the classroom, teachers should explain to students the damaging effect of distractions on retention – not only for themselves, but for the whole class.”

Abstract of the study:

The intrusion of internet-enabled electronic devices (laptop, tablet, and cell phone) has transformed the modern college lecture into a divided attention task. This study measured the effect of using an electronic device for a non-academic purpose during class on subsequent exam performance. In a two-section college course, electronic devices were permitted in half the lectures, so the effect of the devices was assessed in a within-student, within-item counterbalanced experimental design. Dividing attention between an electronic device and the classroom lecture did not reduce comprehension of the lecture, as measured by within-class quiz questions. Instead, divided attention reduced long-term retention of the classroom lecture, which impaired subsequent unit exam and final exam performance. Students self-reported whether they had used an electronic device in each class. Exam performance was significantly worse than the no-device control condition both for students who did and did not use electronic devices during that class.

Leave a comment

Filed under Education, Research, Social Media, Technology

Don’t multitask during class (again)

Yesterday I discovered this small randomized controlled trial via a tweet by Daniel Willingham. The study confirms what we’ve seen in different other studies: multitasking in class is bad for learning.

The results showed that when students were given the opportunity of non-lecture-related multitasking using mobile phones writing/sending SMSs and looking at Facebook profiles/reading news feed/looking at shared multimedia/reading wall messages during the lecture, their grade performance was hindered compared to traditional pen and paper note-taking.

Although I have to correct myself; it would be more accurate to say: not multitasking is better for learning.

Although there was a significant difference between participants on the traditional pen and paper note-taking lectures (no technology multitasking) and social media and SMS multitasking groups in terms of academic achievement, students in multitasking with social media and SMS groups also improved their pretest results.

As said, the study isn’t that big, with 122 participants spread over 3 groups, but it adds to the existing body of knowledge.

Abstract of the study:

The purpose of this study is to investigate whether off-task multitasking activities with mobile technologies, specifically social networking sites and short messaging services, used during real-time lectures have an effect on grade performance in higher education students. Two experimental groups and one control group were used in this research. While participants in experimental groups 1 and 2 were allowed to navigate Facebook and to exchange short messaging service messages via mobile phones during real time in class lecturing, the control group participants were allowed to take notes using only pen and paper in the same lecturing conditions during three consecutive experimental sessions. The results showed that when students were given the opportunity of non-lecture-related multitasking using mobile phones writing/sending short messaging services and looking at Facebook profiles/reading news feed/looking at shared multimedia/reading wall messages during the lecture, their grade performance was hindered compared to traditional pen and paper note-taking. Engaging in social media use while trying to follow instruction may reduce learners’ capacity for cognitive processing causing poor academic performance.

1 Comment

Filed under Education, Research, Technology

New study in Nature: playing a lot of violent games doesn’t make players more violent

It’s a very popular idea, dating back to the theories and studies by Bandura: seeing violence teaches people to act violently. And more recently there was an American president linking computer games to school shootings. This new study shows that this may be unwarranted.

In this study by Kühn et al. published in Nature, the researchers did a randomized controlled trial with three groups:

  • 1 group who played Grand Theft Auto intensively during 2 months
  • 1 group who played The Sims 3 intensively during 2 months
  • 1 group who didn’t play games at all.

And what did the researchers find?

Within the scope of the present study we tested the potential effects of playing the violent video game GTA V for 2 months against an active control group that played the non-violent, rather pro-social life simulation game The Sims 3 and a passive control group. Participants were tested before and after the long-term intervention and at a follow-up appointment 2 months later. Although we used a comprehensive test battery consisting of questionnaires and computerised behavioural tests assessing aggression, impulsivity-related constructs, mood, anxiety, empathy, interpersonal competencies and executive control functions, we did not find relevant negative effects in response to violent video game playing. In fact, only three tests of the 208 statistical tests performed showed a significant interaction pattern that would be in line with this hypothesis. Since at least ten significant effects would be expected purely by chance, we conclude that there were no detrimental effects of violent video gameplay.
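The "at least ten significant effects would be expected purely by chance" claim follows directly from the significance level. A quick sketch, using the 208 tests and the conventional alpha = .05 from the quote, and assuming independent tests (a simplification):

```python
# Under the null hypothesis, the number of false positives across n
# independent tests at significance level alpha is Binomial(n, alpha).
from math import comb

n, alpha = 208, 0.05
expected = n * alpha
print(expected)  # 10.4 false positives expected by chance alone

# Probability of seeing 3 or fewer significant tests purely by chance:
p_at_most_3 = sum(comb(n, k) * alpha**k * (1 - alpha)**(n - k) for k in range(4))
print(round(p_at_most_3, 3))
```

Under these assumptions the 3 observed significant interactions are, if anything, fewer than chance alone would predict, which is the researchers' point.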

Will this study end all discussions? No, I’m sure it won’t, even though this study is very relevant. It’s worth noting that the average age of the participants was 28, which is why I would suggest that a replication with younger participants would be a very good idea.

Abstract of the study:

It is a widespread concern that violent video games promote aggression, reduce pro-social behaviour, increase impulsivity and interfere with cognition as well as mood in its players. Previous experimental studies have focussed on short-term effects of violent video gameplay on aggression, yet there are reasons to believe that these effects are mostly the result of priming. In contrast, the present study is the first to investigate the effects of long-term violent video gameplay using a large battery of tests spanning questionnaires, behavioural measures of aggression, sexist attitudes, empathy and interpersonal competencies, impulsivity-related constructs (such as sensation seeking, boredom proneness, risk taking, delay discounting), mental health (depressivity, anxiety) as well as executive control functions, before and after 2 months of gameplay. Our participants played the violent video game Grand Theft Auto V, the non-violent video game The Sims 3 or no game at all for 2 months on a daily basis. No significant changes were observed, neither when comparing the group playing a violent video game to a group playing a non-violent game, nor to a passive control group. Also, no effects were observed between baseline and posttest directly after the intervention, nor between baseline and a follow-up assessment 2 months after the intervention period had ended. The present results thus provide strong evidence against the frequently debated negative effects of playing violent video games in adults and will therefore help to communicate a more realistic scientific perspective on the effects of violent video gaming.


Filed under Media literacy, Psychology, Research, Technology

What are the effects of giving each child a laptop?

This study, which I found via Gabriel Bouchaud, examines the possible effects of the One Laptop per Child program in Peru. Contrary to my normal procedure, I want to start with the abstract, as it already summarizes a lot:

This paper presents results from a large-scale randomized evaluation of the One Laptop per Child program, using data collected after 15 months of implementation in 318 primary schools in rural Peru. The program increased the ratio of computers per student from 0.12 to 1.18 in treatment schools. This expansion in access translated into substantial increases in use of computers both at school and at home. No evidence is found of effects on test scores in math and language. There is some evidence, though inconclusive, about positive effects on general cognitive skills.

This doesn’t sound that bad. The pupils use computers more – no surprise, given that they didn’t own a computer before – but does it have an effect on education?

Well? The computers were packed with over 200 books, but… the pupils didn’t start to read more. They didn’t spend more time on education. And… they didn’t really seem to do better in class.

Or as the researchers summarize:

In general, we do not find conclusive evidence indicating clear changes in behavior in these dimensions. Regarding study time at home, we document some positive effects on whether the student studied at home the prior day. Nonetheless, results indicate small effects on whether the student studied one or more hours daily the prior week. In terms of reading, results suggest some negative effects on whether the student read a book the prior day but small effects on whether the student read a book the prior week. Overall, we find no statistically significant effects on the specific outcomes analyzed or on the learning behavior summary measure.

…did increased computer access affect academic and cognitive skills? Table 9 shows that there are no statistically significant effects on the academic achievement summary measure when focusing on all students and also when restricting to those in the interviewed sample (columns 1 and 2, respectively). Small standard errors allow ruling out modest effects. Also, there are no statistically significant effects on either math or language achievement for both samples.

Our results suggest that computers by themselves, at least as initially delivered by the OLPC program, do not increase achievement in curricular areas.

I’m a bit in a bind here. I do not want to state that it is a bad idea to give computers to children in need. But this study does show that just giving kids a computer – or putting them in a wall – doesn’t do much.


1 Comment

Filed under Education, Research, Technology