Found this summary here.
I’m taking a bit of a leap here, as my colleagues in Leiden are much bigger experts on science communication than I am. I mean, they do research on science communication while I just communicate about science. Still, I thought this study was pretty interesting and relevant. The study shows that building relationships between scientists and communities founded on shared values does work.
An excerpt from the press release:
Bring science to people where they are. That’s the driving philosophy that propels U biology professor Nalini Nadkarni to stretch the possibilities of science communication and bring the beauty of science to people and places that others have overlooked.
Building public trust in science is about more than just providing information and improving science literacy, she says. It’s about building relationships between scientists and communities that are founded on shared values. It’s called the “Ambassador Model”, and Nadkarni now has the data to say that the approach works, at relatively low cost and with high effectiveness.
In two recent studies, one published today in BioScience and another published in 2018 in Science Communication, Nadkarni and her colleagues present evidence-based conclusions about the effectiveness of science engagement in two programs: The INSPIRE program, which brings science lectures to prisons, and the STEM Ambassador Program, which trains scientists to engage members of the public in discussions about science.
“Our goal is to help people realize that all citizens are interested in, capable of understanding and full of wonder at science, if it is presented in places and ways that are accessible to them,” Nadkarni says.
*DO NOTE: I was only able to read the 2018 study, as the 2019 one isn’t accessible yet.
This study has more to it than the fun fact in the title of this post. But that fact does seem to be something you’d share in a conversation:
…from infancy to young adulthood, learners absorb approximately 12.5 million bits of information about language — about two bits per minute — to fully acquire linguistic knowledge. If converted into binary code, the data would fill a 1.5 MB floppy disk.
For people who don’t know what a floppy disk was: it’s the icon in Word for saving something, and a predecessor of the cloud, but stored locally. And small, well, the storage room on the disk was.
And now the more important insight from the study taken from the press release:
The findings, published today in the Royal Society Open Science journal, challenge assumptions that human language acquisition happens effortlessly, and that robots would have an easy time mastering it.
“Ours is the first study to put a number on the amount you have to learn to acquire language,” said study senior author Steven Piantadosi, an assistant professor of psychology at UC Berkeley. “It highlights that children and teens are remarkable learners, absorbing upwards of 1,000 bits of information each day.”
For example, when presented with the word “turkey,” a young learner typically gathers bits of information by asking, “Is a turkey a bird? Yes, or no? Does a turkey fly? Yes, or no?” and so on, until grasping the full meaning of the word “turkey.”
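That yes/no framing maps directly onto the information-theoretic definition of a bit: each answer to a balanced yes/no question conveys one bit, so pinning down one meaning among N equally likely candidates takes about log2(N) answers. A minimal sketch of that arithmetic (my own illustration, not code from the study):

```python
import math

# Each balanced yes/no answer conveys one bit, so distinguishing one
# meaning among N equally likely candidates takes about log2(N) answers.
for candidates in (2, 16, 1024):
    print(f"{candidates} candidates -> {math.log2(candidates):.0f} yes/no questions")
```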
A bit, or binary digit, is a basic unit of data in computing, and computers store information and calculate using only zeroes and ones. The study uses the standard definition of eight bits to a byte.
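The headline numbers are easy to check with that eight-bits-to-a-byte conversion. A back-of-the-envelope sketch (my own arithmetic, not code from the paper):

```python
TOTAL_BITS = 12.5e6   # the study's estimate of learned linguistic information
YEARS = 18            # infancy to young adulthood

megabytes = TOTAL_BITS / 8 / 1e6           # 8 bits per byte, 10^6 bytes per MB
bits_per_day = TOTAL_BITS / (YEARS * 365)

print(f"{megabytes:.2f} MB")               # roughly the 1.5 MB of a floppy disk
print(f"{bits_per_day:.0f} bits per day")  # the abstract's 'nearly 2000 bits'
```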
“When you think about a child having to remember millions of zeroes and ones (in language), that says they must have really pretty impressive learning mechanisms.”
Piantadosi and study lead author Frank Mollica, a Ph.D. candidate in cognitive science at the University of Rochester, sought to gauge the amounts and different kinds of information that English speakers need to learn their native language.
They arrived at their results by running various calculations about language semantics and syntax through computational models. Notably, the study found that linguistic knowledge focuses mostly on the meaning of words, as opposed to the grammar of language.
“A lot of research on language learning focuses on syntax, like word order,” Piantadosi said. “But our study shows that syntax represents just a tiny piece of language learning, and that the main difficulty has got to be in learning what so many words mean.”
That focus on semantics versus syntax distinguishes humans from robots, including voice-controlled digital helpers such as Alexa, Siri and Google Assistant.
“This really highlights a difference between machine learners and human learners,” Piantadosi said. “Machines know what words go together and where they go in sentences, but know very little about the meaning of words.”
As for the question of whether bilingual people must store twice as many bits of information, Piantadosi said this is unlikely in the case of word meanings, many of which are shared across languages.
“The meanings of many common nouns like ‘mother’ will be similar across languages, and so you won’t need to learn all of the bits of information about their meanings twice,” he said.
Abstract of the study:
We introduce theory-neutral estimates of the amount of information learners possess about how language works. We provide estimates at several levels of linguistic analysis: phonemes, wordforms, lexical semantics, word frequency and syntax. Our best guess is that the average English-speaking adult has learned 12.5 million bits of information, the majority of which is lexical semantics. Interestingly, very little of this information is syntactic, even in our upper bound analyses. Generally, our results suggest that learners possess remarkable inferential mechanisms capable of extracting, on average, nearly 2000 bits of information about how language works each day for 18 years.
Both Casper and I are, we have to admit, a bit of Trekkies. So this study is a bit different from the typical research I discuss on this blog. But do forgive me, there is a relation with education: elementary mathematics brings Star Trek’s Holodeck closer to reality.
From the press release:
For many years we have been hearing that holographic technology is one step closer to realizing Star Trek’s famous Holodeck, a virtual reality stage that simulates any object in 3D as if it were real. Sadly, 3D holographic projection has never been realized. A team of scientists from Bilkent University, Turkey, has again raised our hopes of Holodeck realization by showing the first realistic 3D holograms that can be viewed from any angle. Dr. Ghaith Makey, the lead author of the paper appearing in the April 2019 issue of Nature Photonics today, says “Our technique can work in real time, and will surely pave the way to dynamic 3D video holography. Soon, it may be possible to create a simple version of a Holodeck”.
3D holographic projection relies on back-to-back stacking of a large number of 2D images. The problem is, they cross-talk! Interference between the images makes the 3D projection fuzzy and far from realistic looking. The expectations have always been placed on the development of optical technology. Prof. F. Ömer Ilday, the co-leader of the project, disagrees: “The reason of this cross-talk is the mathematics, not shortcomings of the physical components. Any pair of high-dimensional mutually random vectors tend to be orthogonal. This basic result is a consequence of the Central Limit Theorem and the Law of Large Numbers. We use this property, together with a neat, but straightforward wavefront engineering trick to add random phase to each image, to eliminate cross-talk without using any additional optics”. Prof. Onur Tokel, the other co-leader, adds “It was not possible to simultaneously project a 3D object’s back, middle and front parts. We solve this issue through a simple connection between the equations developed by Jean-Baptiste Joseph Fourier and Augustin-Jean Fresnel in the early days of the field. Using this math property, we advance the state-of-the-art from the projection of 3-4 images to 1000 simultaneous projections!”
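The near-orthogonality claim in that quote is easy to verify numerically: the cosine between two independent random vectors concentrates around zero as the dimension grows, roughly at a 1/√dim rate. A quick sketch (my own illustration, not the authors’ code):

```python
import math
import random

def cosine_of_random_pair(dim, rng):
    """Cosine of the angle between two independent Gaussian random vectors."""
    a = [rng.gauss(0, 1) for _ in range(dim)]
    b = [rng.gauss(0, 1) for _ in range(dim)]
    dot = sum(x * y for x, y in zip(a, b))
    norms = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norms

rng = random.Random(0)
for dim in (10, 1_000, 100_000):
    # |cosine| shrinks roughly like 1/sqrt(dim): near-orthogonality
    print(dim, abs(cosine_of_random_pair(dim, rng)))
```

This is the mathematical reason why, in the quoted argument, giving each image plane an independent random phase makes the planes approximately non-interfering once the hologram resolution (the dimension) is high enough.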
“The best part is that 3D projection performance will become better and better with increasing hologram resolution because the orthogonality result becomes exact for infinite dimensions — as display technologies continue to improve and support higher-resolution, they enable ever more realistic looking holograms using our technique,” says Dr. Makey. The team believes that the full potential of holography may be unleashed if such a 3D capability is at hand. “Our holograms already surpass all previous digitally synthesized 3D holograms in every quality metric. Our method is universally applicable to all types of holographic media, be they static or dynamic holograms. Therefore, opportunities are vast. Immediate applications may be in 3D displays for medical visualization or air traffic control, but also laser-material interactions and microscopy” says Prof. Serim Ilday of the Bilkent team.
“The most important concept associated with holography has always been the third dimension. This is even more clear with our new 3D projection capability. Many challenges remain, but we are one step closer to the visions defined by Holodeck in Star Trek; or Holovision of Isaac Asimov in the Foundation novels. Even Jules Verne touched upon the idea, in his book Carpathian Castle published in 1892,” adds Dr. Tokel.
Abstract of the paper, published in Nature Photonics:
Holography is the most promising route to true-to-life three-dimensional (3D) projections, but the incorporation of complex images with full depth control remains elusive. Digitally synthesized holograms, which do not require real objects to create a hologram, offer the possibility of dynamic projection of 3D video. Despite extensive efforts aimed at 3D holographic projection, however, the available methods remain limited to creating images on a few planes, over a narrow depth of field or with low resolution. Truly 3D holography also requires full depth control and dynamic projection capabilities, which are hampered by high crosstalk. The fundamental difficulty is in storing all the information necessary to depict a complex 3D image in the 2D form of a hologram without letting projections at different depths contaminate each other. Here, we solve this problem by pre-shaping the wavefronts to locally reduce Fresnel diffraction to Fourier holography, which allows the inclusion of random phase for each depth without altering the image projection at that particular depth, but eliminates crosstalk due to the near-orthogonality of large-dimensional random vectors. We demonstrate Fresnel holograms that form on-axis with full depth control without any crosstalk, producing large-volume, high-density, dynamic 3D projections with 1,000 image planes simultaneously, improving the state of the art for the number of simultaneously created planes by two orders of magnitude. Although our proof-of-principle experiments use spatial light modulators, our solution is applicable to all types of holographic media.
Found this cartoon via Larry Cuban who collected more cartoons on leadership.
Doing research in education is difficult, very difficult. Trust me. So while this review may come as a shock, I’m not that shocked.
I’m performing an RCT myself at the moment, and doing it in real-life situations is proving to be very hard.
From the press release:
Educational trials aimed at boosting academic achievement in schools are often uninformative, new research suggests.
The new study, published in the journal Educational Researcher, found that 40% of large-scale randomised controlled trials (RCTs) in the UK and the US failed to produce any evidence as to whether an educational intervention helped to boost academic attainment or not.
The researchers evaluated 141 trials involving more than one million students, which tested schemes ranging from whether providing free school breakfasts raises grades in Maths and English, to whether playing chess improves numeracy skills.
The trials, which were carried out by the charitable organisation the Education Endowment Foundation (EEF) in the UK and the National Center for Education Evaluation and Regional Assistance (NCEE) in the US, are expensive – with costs often exceeding £500,000.
The authors of the study argue that more research is urgently needed to understand why RCTs in education are so often uninformative.
Lead author of the research, Dr Hugues Lortie-Forgues, from the Department of Education at the University of York, UK, said: “Just like in medicine, trials of educational interventions are an important way to allow policy makers and teachers to make informed decisions about how to improve education. However, many of these trials are currently not fulfilling their main aim of demonstrating which interventions are effective and which are not.”
“Further research to investigate the reasons for this should be a priority. These organisations are trying to achieve something positive and reform is urgently needed to help them to do so.”
In recent years there have been a growing number of RCTs conducted in education. For example, in the UK, the Education Endowment Foundation (EEF) has commissioned more than 191 trials since 2012.
The researchers cite possible reasons why current trials may be ineffective, including:
- The interventions being tested may not be suitable for trial in the first place.
- Interventions may not be being correctly implemented during trials – for example due to inadequate training of teachers in the methods being tested.
- The trials themselves may be poorly designed.
The authors suggest a series of changes that could make the trials more informative, including higher standards when considering which new initiatives are trialled.
“Rigorous Large-Scale Educational RCTs Are Often Uninformative: Should We Be Concerned?” is published in the journal Educational Researcher. The study was carried out in collaboration with researchers at Loughborough University.
Abstract of the review:
There are a growing number of large-scale educational randomized controlled trials (RCTs). Considering their expense, it is important to reflect on the effectiveness of this approach. We assessed the magnitude and precision of effects found in those large-scale RCTs commissioned by the UK-based Education Endowment Foundation and the U.S.-based National Center for Educational Evaluation and Regional Assistance, which evaluated interventions aimed at improving academic achievement in K–12 (141 RCTs; 1,222,024 students). The mean effect size was 0.06 standard deviations. These sat within relatively large confidence intervals (mean width = 0.30 SDs), which meant that the results were often uninformative (the median Bayes factor was 0.56). We argue that our field needs, as a priority, to understand why educational RCTs often find small and uninformative effects.
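The abstract’s numbers fit a standard back-of-the-envelope calculation: for a two-arm trial, the standard error of a standardized mean difference (Cohen’s d) is roughly √(2/n) with n participants per arm, so the 95% confidence interval has width about 2 × 1.96 × √(2/n). A sketch of that arithmetic (my own, not from the paper; real educational trials randomize whole schools, which shrinks the effective n and widens the interval further):

```python
import math

def ci_width_cohens_d(n_per_arm):
    """Approximate width of a 95% CI for Cohen's d under simple randomization."""
    se = math.sqrt(2 / n_per_arm)   # rough SE of a standardized mean difference
    return 2 * 1.96 * se

# Around 350 effective participants per arm gives the abstract's ~0.30 SD width,
# five times the mean effect size of 0.06 SDs.
print(round(ci_width_cohens_d(350), 2))
```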
There is a new Best Evidence in Brief with among others, this study:
An article published in Educational Research Review examines the effects of self-assessment on self-regulated learning (SRL) and self-efficacy in four meta-analyses. To understand the impact of students’ assessment of their own work, Ernesto Panadero and colleagues from Spain analyzed 19 studies comprising 2,305 students from primary schools to higher education. The meta-analyses only included studies published in English that contained empirical results of self-assessment interventions in relation to SRL and/or self-efficacy, had at least one control group, and had been peer-reviewed. The findings indicated that:
- Self-assessment had a positive effect on SRL strategies serving a positive self-regulatory function for students’ learning, such as meta-cognitive strategies (ES = +0.23).
- Self-assessment had a negative effect on “negative SRL,” which is associated with negative emotions and stress and is thought to be adverse to students’ learning (ES = -0.65).
- Self-assessment was also positively associated with SRL even when SRL was measured qualitatively (ES = +0.43).
- Self-assessment had a positive effect on self-efficacy (ES = +0.73), the effect being larger for girls.

The authors suggest that self-assessment is necessary for productive learning but note that the results have yet to identify the most effective self-assessment components (e.g., monitoring, feedback, and revision) in fostering SRL strategies or self-efficacy.
We also discuss Growth Mindset in our new Myths book, out later this year, but Carl has done a great job.
This appeared in Aeon, March 11, 2019
“Dr Carl Hendrick is the co-author of What Does This Look Like In The Classroom: Bridging The Gap Between Research And Practice (2017). He holds a PhD in education from King’s College and lives in Berkshire, England where he teaches at Wellington College. He is currently writing a book with Professor Paul Kirschner on foundational works in education research.”
Over the past century, a powerful idea has taken root in the educational landscape. The notion of intelligence as something innate and fixed has been supplanted by the idea that intelligence is instead something malleable; that we are not prisoners of immutable characteristics and that, with the right training, we can be the authors of our own cognitive capabilities.
Nineteenth-century scientists including Francis Galton and Alfred Binet devoted their own considerable intelligence to a quest to classify and understand human cognitive ability. If…
This is an English study, but I do recognize elements in it that I’ve also seen in other countries.
From the press release:
Perceptions about the social mix and environment of local mainstream schools motivate parents to choose Free Schools for their children, a new study published in the Cambridge Journal of Education finds. A ‘traditional’ approach to education and smaller class sizes also make such schools more appealing to parents.
Dr. Rebecca Morris of the University of Warwick and Dr. Thomas Perry of the University of Birmingham surveyed 346 Free School and non-Free School parents of Year 7 children, then conducted 20 follow-up interviews with Free School parents. The data was collected in 2013-2014, three years after the introduction of the English Free Schools policy that allowed the establishment of new autonomous schools, funded by the state but proposed, developed and run by external sponsors.
The researchers found that academic quality and school performance were the central focus for both Free School and non-Free School parents in choosing their child’s school. However, as the newly-opened Free Schools had no ‘hard’ performance data or inspection reports available at the time, the study highlighted how parents used proxies – environment and ethos, curriculum, size and social mix – to assess potential academic quality and school suitability for their child.
‘Not liking other schools’ was an ‘important’ or ‘very important’ motivator for 80.1% of Free School parents, compared with 60.4% of non-Free School parents. While negative perceptions of other local state schools led some parents to choose a Free School, others drew positive comparisons with private or grammar schools, models that they perceived to be successful and desirable.
The avoidance of certain areas or groups of children was also important to some Free School parents. A muddied distinction between school performance and student composition emerged, with parents, in some cases, understanding the two issues synonymously.
Almost two thirds (61.0%) of Free School parents said that a traditional approach to schooling – the promotion of traditional values, an academic curriculum, a smart school uniform and strict discipline – was ‘very important’ to them, compared with just over one third (34.3%) of non-Free School parents. School size was also ‘very important’ to 61.0% of Free School parents but just 24.3% of non-Free School parents.
“Since the introduction of the Free Schools programme there have been concerns that the new schools are more likely to attract more advantaged parents and have the potential to contribute to further social segregation between schools,” the authors said.
“The preferences of many parents for features which make Free Schools socially distinctive or for having an advantaged social intake lend support to these concerns. There is a danger that such impressions of social distinction contribute to a less inclusive school environment and lead to increased clustering of certain groups of children within different schools.”
Abstract of the study:
This study examines parental choice preferences following the introduction of the Free Schools policy in England. It reports on two phases of data collection: first, the analysis of factors that Free School and non-Free School parents reported as important in informing their choices; and, second, findings from semi-structured interviews with parents of children attending a Free School. The findings show that Free School parents’ choices were mostly influenced by similar factors to those of parents elsewhere. There were, however, some notable exceptions, particularly in relation to school size, ethos and curriculum. The analyses also highlight how the lack of information available led parents to use proxies to assess potential academic quality and school suitability for their child. The tensions that exist for parents in exercising choice within this new context and the implications for school intakes and diversity within the system are discussed.