
Adaptive Comparative Judgement – A groundbreaking tool for assessment and learning

June, 2015


What’s comparative judgement?
We make relative judgements more intuitively than absolute ones. For example, ‘Which of these books is heavier?’ is easier to answer than ‘How much does each book weigh?’

Psychometrics pioneer Louis Leon Thurstone formalised this insight in his Law of Comparative Judgement. We took his model and applied it to our technology, along with some very clever mathematical algorithms. Our ‘Collaborate’ and ‘Measure’ features, which are powered by Adaptive Comparative Judgement, were born.
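Thurstone’s law itself is a simple statement: the probability that A is judged better than B depends only on the distance between their positions on a latent quality scale. A minimal sketch in Python (the function name and the unit dispersion are illustrative, not Digital Assess’s implementation):

```python
from math import erf, sqrt

def p_prefer(mu_a, mu_b, sigma_diff=1.0):
    """Thurstone's law of comparative judgement: the probability that
    A is judged greater than B, given latent quality estimates mu_a
    and mu_b and the dispersion of their difference (sigma_diff)."""
    z = (mu_a - mu_b) / sigma_diff
    # Standard normal CDF via the error function.
    return 0.5 * (1 + erf(z / sqrt(2)))
```

With equal qualities the model predicts a coin flip (`p_prefer(0, 0)` is 0.5), and the probability rises smoothly as the quality gap widens.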

How our technology works
We teamed up with mathematicians and researchers at the University of Cambridge and Goldsmiths, University of London to develop and build the technology, based on user needs. After developing the core functionality, we then went on to work with accreditation bodies and academic institutions to successfully test the application of this groundbreaking approach.
Users access the ‘Collaborate’ and ‘Measure’ features from Digital Assess. They are presented with successive pairs of work and, in each case, asked to judge which of the pair better meets the stated assessment criteria. They can also add anonymous feedback on the work, which can be used to support summative judgements or to give students constructive feedback as part of a formative, assessment-for-learning approach.

The technical bit
Behind the scenes, the algorithm analyses the judgements dynamically as they are being made to generate a scaled rank order based on the collective consensus of all of the judges. Work can then be accurately graded according to the scaled ranking and shared with each learner, along with the feedback.
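Digital Assess has not published its exact algorithm, but pairwise-judgement systems of this kind are commonly modelled with the Bradley-Terry family: each piece of work gets a latent ‘strength’, every judgement is treated as a noisy comparison of strengths, and the ‘adaptive’ part selects the next pair whose current strengths are closest, where a judgement is most informative. A sketch under those assumptions (all names are illustrative):

```python
def fit_bradley_terry(judgements, n_items, iters=200):
    """Estimate a strength for each item from pairwise judgements.

    judgements: list of (winner, loser) index pairs.
    Returns a list of strengths; sorting by strength gives the
    scaled rank order. Uses the classic MM (minorise-maximise) update.
    """
    wins = [0] * n_items
    pair_counts = [[0] * n_items for _ in range(n_items)]
    for w, l in judgements:
        wins[w] += 1
        pair_counts[w][l] += 1
        pair_counts[l][w] += 1

    p = [1.0] * n_items
    for _ in range(iters):
        new_p = []
        for i in range(n_items):
            # Sum of comparison counts weighted by current strengths.
            denom = sum(pair_counts[i][j] / (p[i] + p[j] + 1e-12)
                        for j in range(n_items) if j != i)
            new_p.append(wins[i] / denom if denom else p[i])
        total = sum(new_p) or 1.0
        p = [x * n_items / total for x in new_p]  # normalise scale
    return p

def next_pair(p, compared):
    """Adaptive pair selection: choose the not-yet-compared pair whose
    current strengths are closest, where a judgement discriminates most."""
    best, best_gap = None, float('inf')
    for i in range(len(p)):
        for j in range(i + 1, len(p)):
            if (i, j) in compared:
                continue
            gap = abs(p[i] - p[j])
            if gap < best_gap:
                best, best_gap = (i, j), gap
    return best
```

Re-fitting after each batch of judgements and feeding the updated strengths back into `next_pair` gives the dynamic behaviour described above: the rank order sharpens as the collective consensus accumulates.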

The outcome
The approach offers exceptional accuracy for both peer-to-peer and educator assessment – up to 98% reliability for open-ended assessments, well beyond what conventional marking typically achieves. It also benefits the learning process itself: our customers, including The University of Edinburgh, found that input from multiple judges gives learners a deeper understanding of the subject matter and stimulates analytical thinking.

At the same time, learners gain a clearer sense of what they are being assessed on by taking part in the assessment process – learning by ‘showing, not telling’.

“The University is exploring approaches to developing assessment literacy, a key component in taking greater ownership of learning. We have found Adaptive Comparative Judgment to be highly promising. It has already enabled our students to summatively and anonymously peer assess to a very high degree of reliability.”

Ian Pirie
Assistant Principal of Learning and Development, University of Edinburgh

Speaking about the feature’s formative peer-review application: “It is a crowd-sourced, social media style feedback and assessment tool, which is really innovative and powerful… It gives you insight and perspective on your work that you wouldn’t normally get. You get to see the ways that other students have tackled the same problems. It helps you develop work and general analytic skills, which will prove really useful in the workplace.”

Briana Pegado
Final-year student and 2015 Students’ Association President, University of Edinburgh

Find out who’s using this technology