Two weeks from now, all educationalists will know where to go next for inspiration. But perhaps we should take a closer look at the PISA results – to be published on December 6 – before booking our plane tickets.
Just some extra input to the discussions to come:
The test results of the Programme for International Student Assessment (PISA), which often inform the development of education policies in various countries, tend to receive rather simplified interpretations. In particular, analysis of PISA data does not reflect the entire ‘package’ of school students’ knowledge in one key area – mathematics. This is the conclusion of researchers from the National Research University Higher School of Economics (Russia), Stanford University (USA), and Michigan State University (USA).
The Organisation for Economic Co-operation and Development (OECD) has administered the PISA assessments since 2000. This monitoring can uncover changes and developments in education systems throughout the world, while also evaluating the effectiveness of strategic decision-making with respect to education. However, this type of analysis does not include data on teachers or on the ‘educational histories’ of children and young people. As a result, the OECD’s findings may be misinterpreted, hindering countries’ ability to properly develop educational policies.
PISA measures the knowledge of 15-year-old students in mathematics and the natural sciences, as well as their ability to work with various types of text. However, according to the researchers in Russia and the USA, this monitoring provides only a ‘snapshot’ of students’ knowledge: it does not consider earlier assessments of their abilities and grades, or the conditions affecting their education (e.g., the cultural standards and progress of an entire class, the education level of a student’s parents, a teacher’s professional abilities and qualifications, etc.). Yet all of these factors may push students’ PISA scores up or down. Citing the results of math tests, the researchers showed that such additional factors can significantly alter the conclusions drawn from the OECD’s monitoring data.
During math tests, students usually must solve three types of problems. The most important for PISA evaluations are assignments in applied mathematics (e.g., with real-life applications such as measuring a frame for a photograph). A great deal of attention is paid to ‘real life’ problems, since PISA tests aim to determine how deeply students can apply their knowledge to solve practical problems. The second type is text-based problems containing a large amount of additional information. To solve these, students must read through the text, determine what is most important, and then quickly adapt to the unfamiliar format of the given question. The third type is ‘formal’ mathematical problems (algebra and geometry), which require the application of complex formulas. PISA tests pay less attention to such problems, even though ‘formal’ mathematics is much more difficult than applied mathematics, often elevating mathematical thinking to an entirely new level. Notably, students who spend more time solving such problems usually achieve better PISA results than those whose teachers focus on practical ‘real life’ examples.
In addition, the cultural sophistication of a student’s family is a key factor (e.g., standard of living, linguistic abilities, how leisure time is spent, etc.). Students from better-educated families with access to many books at home usually show the best test results, and such students also enjoy conditions that enable them to excel academically. Classes with many students from educated families are more likely to acquire the skills needed to solve formal algebraic and geometric tasks, and are thus better positioned to succeed in PISA testing.
Better-qualified teachers tend to be found in classes with a higher average level of culture, and such teachers tend to give their students more assignments in formal mathematics. In their analysis, the researchers also found that teachers with a background in mathematics rather than pedagogy tend to give better instruction, helping to boost the average grades of high school seniors.
Thus, the students best prepared for PISA tests are those who have successfully mastered formal mathematics. The researchers stress that cramming for questions on PISA tests is simply not enough. To ensure a more objective assessment of students’ knowledge and abilities in international monitoring, the factors influencing their success should be considered more fully. At the same time, in its recommendations, the OECD overestimates the role of schools in enhancing academic achievement, while underestimating other ‘areas of influence’ that affect evaluations of students’ knowledge.
The results of this analysis were published in the article ‘Revisiting the Relationship Between International Assessment Outcomes and Educational Production: Evidence From a Longitudinal PISA-TIMSS Sample’ (http://aer.sagepub.com/content/53/4/1054.full.pdf+html) in the American Educational Research Journal. The article’s authors are: Andrey Zakharov (Deputy Head of the HSE Institute of Education/International Laboratory for Education Policy Analysis), Martin Carnoy (Academic Supervisor and Leading Research Fellow at the HSE Institute of Education/International Laboratory for Education Policy Analysis, Distinguished Professor at Stanford University), Tatiana Khavenson (Research Fellow at the HSE Institute of Education/International Laboratory for Education Policy Analysis), Prashant Loyalka (Professor at Stanford University), and William H. Schmidt (Distinguished Professor at Michigan State University).
Abstract of the article:
International assessments, such as the Program for International Student Assessment (PISA), are being used to recommend educational policies to improve student achievement. This study shows that the cross-sectional estimates behind such recommendations may be biased. We use a unique data set from one country that applied the PISA mathematics test in 2012 in ninth grade to all students who had taken the Trends in International Mathematics and Science Study (TIMSS) test in 2011 and collected information on students’ teachers in ninth grade. These data allowed us to more precisely estimate the effects of classroom variables on students’ PISA performance. Our results suggest that the positive roles of teacher “quality” and “opportunity to learn” in improving student performance are much more modest than claimed in PISA documents.
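The statistical point behind the abstract – that cross-sectional estimates of teacher effects can be biased when prior achievement is omitted – can be illustrated with a small simulation. The sketch below is not the authors’ model; it is a hypothetical data-generating process (all coefficients invented for illustration) in which better teachers sort into classes with stronger prior achievement, mirroring the sorting described in the article. Regressing the final score on teacher quality alone then overstates the teacher effect, while controlling for a prior (TIMSS-like) score recovers it:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Hypothetical data-generating process (illustrative assumptions only):
prior = rng.normal(0, 1, n)                         # prior (TIMSS-like) achievement
teacher = 0.6 * prior + 0.8 * rng.normal(0, 1, n)   # better teachers sort to stronger classes
true_effect = 0.2
score = prior + true_effect * teacher + rng.normal(0, 1, n)  # later (PISA-like) score

# Cross-sectional estimate: regress score on teacher quality alone
b_cross = np.cov(score, teacher)[0, 1] / np.var(teacher, ddof=1)

# Longitudinal ("value-added") estimate: also control for prior achievement
X = np.column_stack([np.ones(n), teacher, prior])
b_va = np.linalg.lstsq(X, score, rcond=None)[0][1]

print(f"true teacher effect:      {true_effect:.2f}")
print(f"cross-sectional estimate: {b_cross:.2f}")  # inflated by sorting on prior ability
print(f"prior-controlled estimate:{b_va:.2f}")     # close to the true effect
```

The gap between the two estimates is the omitted-variable bias: without the prior score, the teacher variable absorbs credit for the stronger students it was assigned, which is the mechanism the longitudinal PISA-TIMSS sample lets the authors correct for.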