Building trust in exams

by Andreas Schleicher
Director, Directorate for Education and Skills


Quality high-stakes exams have always been one of the most reliable predictors of the performance of an education system. They signal what matters for educational success and they ensure fairness and transparency in the gateways to the next stage of education or to the workforce. But getting the design of exams wrong can hold education systems back, narrowing the scope of what is valued and what is taught, or encouraging shortcuts, cramming and integrity violations. 

So if you are searching for promising practices in this area, it is worth having a look at Russia. Surprised? True, for a long time Russians had lost trust in exam scores and degrees because of fraud and misconduct in examinations. But for well over a decade, Russia has worked persistently to address these issues, and its unified state exam now offers one of the most advanced and transparent ways of assessing students' learning outcomes at school.

For a start, Russia has not fallen into the trap, common to many exam systems, of sacrificing validity for efficiency and relevance for reliability. So you find no bubble sheets and few multiple-choice tasks. Instead, the tasks are open-ended and often involve essays, focused on the acquisition of advanced knowledge, complex higher-order thinking skills and, increasingly, on the application of those skills to real-world problems. Many of those tasks probe for understanding and prompt further thinking, asking students questions such as: Who is correct? How do you know? Can you explain why he or she is correct?

But the biggest accomplishment of Russia’s unified state exam has been in re-establishing trust in education and examination. Trust cannot be legislated. And trust doesn’t just happen: it is always intentional, and it is at least as much a consequence of the design of an exam system as a pre-existing condition for its conduct. So how did Russia go about that? For a start, it has invested in state-of-the-art test security that is now available across the country. The exam papers are packaged and printed in real time at the point of delivery, in the classroom under the eyes of the students and the examiners - and under the eyes of a 360-degree camera that monitors and records the entire exam process.

At the end, the exam papers are scanned, digitised and anonymised, once again under the eyes of the students. Where more complex responses, such as essays, cannot be scored automatically by machines, they are marked centrally by independent and specially trained experts, with extensive checks for inter-rater reliability. Of course, there is always some judgement involved in scoring essays. So how can students trust that they were graded fairly? Actually, they can have a look for themselves. The fully marked exam papers are posted online, and all students can review their results. They can also contest the marking if they are not happy, something which a few percent of them do each year. Schools, too, can see and track their exam scores. So if Russian students, teachers, school leaders and employers are now much more confident in schooling and examination, this has not come about by chance.

Has it led to improvements in outcomes? Not yet: exam results have been flat over recent years. This shows how much Russia still has to do to feed data back into helping students learn better, teachers teach better, and schools become more effective. But it also shows that the exams have so far proved resilient against one of the most common diseases of assessment - grade inflation.

Links:
www.oecd.org/russia 
