Blog

TAG wins silver in 2014 eLearning Awards

eLearning Silver Award 2014

TAG Assessment wins silver in the category ‘Best use of social and collaborative technologies for learning’, for the University of Edinburgh’s use of CrowdAssess to support peer assessment and attainment.

Competition for this year’s E-Learning Awards was turned up a notch. An independent judging panel, chaired by Tony Frascina, whittled down more than 250 entries to produce an exceptionally diverse shortlist that really showcased the amazing range that e-learning now covers. Competition was so fierce that the judges decided to split a couple of the categories into UK and International sectors. The 2014 awards evening, held at the Marriott Hotel Grosvenor Square in London in November, celebrated the strength and depth of e-learning that is taking place across the globe. Some 550 people attended the biggest gala evening to date. TAG Assessment won Silver in the category ‘Best use of social and collaborative technologies for learning’, for the University of Edinburgh’s use of CrowdAssess to support peer assessment and attainment. LV= was awarded Gold and Bespeak – Slagerspassie (The Butcher Inspiration Platform) won Bronze in the category.

Blog

The State of the UK Exams System

The Need for Change

With massive growth in spend and little improvement in standards, governments are under pressure. Karim Derrick looks at the state of the UK exams system and discusses why fundamental changes need to take place.

Over the past two decades, there has been a seismic shift in the way we approach education. Thanks to technology, activities that could never be conducted efficiently and cost-effectively in a classroom setting – such as science experiments – can now be simulated using computer software. Lessons can be introduced and enhanced with lively digital content thanks to interactive whiteboards. Schools have moved from having a single archaic computer to having suites of ‘connected’ devices, which students can access throughout much of their day. Now, teachers and learners can use technology to work in a way that is interactive and collaborative. Overall, it has helped to completely reshape teaching practices, classroom management, pupil engagement and empowerment, and curriculum drivers.

The business of education has grown significantly over this period: more than $4.4 trillion is now spent globally on education, an increase of 84% since 2000. Yet in that time there has been little improvement in attainment. Student dropout rates in higher and further education are increasing, pointing to a huge disconnect. We seem to have less to show for our increasing spend on education, certainly in terms of student engagement and attainment.

The cost of learning

This has brought us to a watershed moment, especially within Higher Education. Soaring costs for university courses have meant that students are becoming more demanding. They want individualised learning options and are increasingly looking online for equally effective and more affordable degree options. As a result, universities are having to implement new strategies to engage and retain their student cohort. Vocational education is also undergoing radical change, driven by employer needs, with a new focus on apprenticeships and core subject knowledge. The aim is to produce employable school leavers at a time when youth unemployment is at a historic high. These changes are being driven not only by the impact of ubiquitous technology, but also by the demands for skills and knowledge placed on potential employees by the global economy.

Education is changing. But has the UK examination system kept pace with this? Are we asking the right questions? Are we leveraging technology effectively? Are we capturing the right skills and competencies that will be of use to learners as they move from school to Further or Higher Education and then into work?

Delivering 21st century skills

The primary challenges facing today’s education system are the validity and reliability of examinations. Can we make our assessment judgements robust, reliable and meaningful?

Businesses are looking beyond degrees and qualifications for employees who are critical thinkers, skilled with technology, and capable of working both collaboratively and independently. They also want to see a validation of skills as well as proof of a prospective employee’s knowledge. But these skills are not being evidenced or assessed by our current UK examinations system; in fact, within our existing qualifications there has recently been a shift away from coursework. So how can we measure scientific, design or IT capability without students actually conducting experiments, or designing or building something? It is that proof of capability and competence that a potential employer truly wants to see.

Just as technology has had a significant impact in the classroom, it can have a similar impact on the way we assess. Today, we can use personal devices to capture video and audio evidence of a student’s capability, processes, planning, brainstorming, and critical thinking. Furthermore, collaborative technologies can help to foster peer assessment, not to mention self-assessment and analytical skills. All of this is crucial for improving student attainment and driving better engagement. The same readily available video and audio technologies can be used to prove authenticity, and if we start focusing more on process rather than just on the end product, they can also safeguard against cheating. Examinations can and should be much more than an end-of-qualification, multiple-choice, tick-box summation. With existing technology, we can make them more about capturing a process, evidencing a wide range of capabilities and other 21st century skills, and demonstrating knowledge.

Vocational learning and qualifications are now at the forefront of providing evidence of meaningful skills, knowledge and competencies. Thanks to FELTAG’s work and recommendations, the Department for Business, Innovation and Skills has announced a move towards ensuring that more and more vocational qualifications become e-qualifications. This does not mean that learners taking these qualifications will merely sit online multiple choice tests that are marked automatically. Instead, these e-qualifications will capture evidence of pupil knowledge and skills wherever these may be demonstrated. Pupils will also be able to use online technology to interact with their teachers and peers, and to manage and submit this evidence securely. All of this work can then be submitted directly to the Awarding Body, whose verifiers and moderators can view the evidence in order to mark it. Technology means that we no longer have to rely on sit-down multiple choice exams.

Making assessment reliable

Most fundamentally, technology can help to ensure that assessment judgements are made as reliable as possible. A number of UK awarding bodies are already piloting a new approach to assessment known as Adaptive Comparative Judgement (ACJ). So far, it has delivered summative assessments with high reliability without compromising validity. By harnessing assessors’ own ability to make professional judgements, and by deploying an adaptive, collaborative, web-enabled version of the well-established Law of Comparative Judgement (Thurstone, 1927), ACJ provides a highly reliable way of assessing student work. Deployments in a range of different contexts around the world – Awarding Organisations, Ministries of Education and government agencies – have shown that extraordinarily high assessment reliability can be obtained in areas where it is not normally possible, such as the assessment of essays and media-rich or open-ended portfolios of evidence. And because the technology is delivered online, assessors can access and mark student examination work quickly and cost-effectively.
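To make the underlying idea concrete, the sketch below is a minimal, illustrative model of comparative judgement – it is not TAG’s CrowdAssess or any awarding body’s ACJ engine, and every name in it is invented for the example. It fits a simple Bradley-Terry model (a close relative of Thurstone’s 1927 formulation) to a set of pairwise ‘script A is better than script B’ decisions, then suggests the next most informative pair to put in front of an assessor, which is the ‘adaptive’ part.

```python
# Illustrative comparative-judgement sketch (not a production ACJ engine).
# Pairwise "winner beats loser" decisions are turned into a rank order by
# fitting a simple Bradley-Terry model with gradient ascent.

import math
import random
from collections import defaultdict

def fit_bradley_terry(scripts, judgements, iterations=200, lr=0.05):
    """Estimate a quality score per script from (winner, loser) pairs."""
    theta = {s: 0.0 for s in scripts}
    for _ in range(iterations):
        grad = defaultdict(float)
        for winner, loser in judgements:
            # Modelled probability that the winner beats the loser.
            p = 1.0 / (1.0 + math.exp(theta[loser] - theta[winner]))
            grad[winner] += 1.0 - p
            grad[loser] -= 1.0 - p
        for s in scripts:
            theta[s] += lr * grad[s]
    return theta

def next_pair(theta, judged_pairs):
    """Adaptive step: pair the two not-yet-compared scripts whose current
    estimates are closest, since that comparison is the most informative."""
    ranked = sorted(theta, key=theta.get)
    best, best_gap = None, float("inf")
    for a, b in zip(ranked, ranked[1:]):
        if frozenset((a, b)) in judged_pairs:
            continue
        gap = abs(theta[a] - theta[b])
        if gap < best_gap:
            best, best_gap = (a, b), gap
    return best

if __name__ == "__main__":
    scripts = [f"script_{i}" for i in range(6)]
    # Simulated assessor decisions: lower-numbered scripts are "better".
    judgements = [(a, b) for a in scripts for b in scripts
                  if a < b and random.random() < 0.9]
    theta = fit_bradley_terry(scripts, judgements)
    print(sorted(theta, key=theta.get, reverse=True))  # estimated rank order
    print(next_pair(theta, {frozenset(p) for p in judgements}))
```

Running the example prints an estimated rank order for the simulated scripts; a real deployment would add stopping rules, reliability statistics and moderation workflows on top of this core loop.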

Fundamentally, our examinations system needs to move forward. There have been many ‘tweaks’ to our formal exams over the last few years, but have they really delivered positive change? Perhaps the way to achieve improved assessment is not to emulate the past, but to embrace the future. Technology can help. But what we choose to assess, and how we choose to assess it, are absolutely key.

Karim Derrick, Chief Operating Officer

News

What the Education System can Learn from Us

Think Journal

This summer, just as every year, the UK’s 19th Century exams system has been under national scrutiny. But while our children in schools sit in exam halls and attempt to demonstrate the outcomes of their learning with a three-hour written paper, here in the world of work-based L&D there are shining examples all around us of how it should and could be done.

In the professional world of training, from centres of learning to corporate and government workplaces, there exist different tried and tested assessment methods that represent a far more robust, valid and accurate way of assessing people’s skills, knowledge and competencies.

The contrast between the assessment practices still in use in schools and the methods we use in the professional space is a stark one. So why does it matter so much? Because it correlates with the often glaring gap between the skills our young people leave school with and the requirements of UK employers.

Thankfully, bodies such as the eAA (e-Assessment Association) and ETAG (the Government’s Education Technology Action Group) are working to close this gap. We would argue that we don’t need to look too far for evidence of how improvements can (and do) work.

Examples

A major development in recent assessment technology is the idea of adaptive testing. This is a computer-based testing model that automatically pinpoints areas where learners can improve. As the difficulty of each test is tailored to the student’s ability, the candidate has a vastly enhanced learning experience.

If the student gets a question wrong, an easier one is generated, or if they get one right, a harder one comes up. All questions come from the same “item bank”, so the results can be graded and standardised nationally – and because questions or tasks from this item bank are randomised (as they are in the UK’s Driver Theory Test), there is no chance for cheats to predict, copy or share test content.
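As a purely illustrative sketch – not the engine behind the Driver Theory Test or any awarding body’s platform – the loop below shows the mechanics just described: each question is drawn from a shared item bank, a correct answer moves the candidate on to a harder item, a wrong answer to an easier one, and randomised selection means two candidates rarely see the same questions. The item bank, difficulty scale and scoring rule are all invented for the example.

```python
# Illustrative adaptive-testing loop (invented item bank and scoring rule).

import random

# Hypothetical item bank: each item has an id and a difficulty on a 1-10 scale.
ITEM_BANK = [{"id": f"Q{i:03d}", "difficulty": 1 + i % 10} for i in range(200)]

def pick_item(bank, target_difficulty, used):
    """Randomly choose an unused item close to the target difficulty, so two
    candidates sitting the same test rarely see the same questions."""
    candidates = [it for it in bank
                  if it["id"] not in used
                  and abs(it["difficulty"] - target_difficulty) <= 1]
    return random.choice(candidates) if candidates else None

def run_adaptive_test(answer_fn, length=20, start_difficulty=5):
    """answer_fn(item) -> True/False stands in for the candidate's response."""
    difficulty = start_difficulty
    used, score = set(), 0.0
    for _ in range(length):
        item = pick_item(ITEM_BANK, difficulty, used)
        if item is None:
            break
        used.add(item["id"])
        if answer_fn(item):
            score += item["difficulty"]        # harder items are worth more
            difficulty = min(10, difficulty + 1)
        else:
            difficulty = max(1, difficulty - 1)
    return score, difficulty                   # final difficulty tracks ability

if __name__ == "__main__":
    # Simulated candidate who reliably answers items up to difficulty 6.
    simulated = lambda item: item["difficulty"] <= 6 or random.random() < 0.25
    print(run_adaptive_test(simulated))
```

A production adaptive test would estimate ability with an item-response model rather than a simple up/down rule, but the principle of drawing graded questions from a shared, standardised bank is the same.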

This technique is already widespread in the US and Australia, and is used here on IT courses and modules on quality management by NHS Scotland. Provided there are enough subject matter experts available to create a large enough item bank, this makes it possible for cohorts of thousands of learners to take their tests whenever they are ready, rather than waiting for an annual, end-of-course exam window as they do in schools.

This concept of test-when-ready is also used to great effect by the SQA (Scottish Qualifications Authority) and delivered by eAA member Calibrand, for the Diploma in Professional Financial Advice.

This qualification was introduced by the financial services regulator (FCA) to improve standards and is a requirement to practise in the industry. Busy financial advisers have online access to all learning and mock exam materials 24/7, and can take modules and practice tests as many times and in whatever order they need, until they are confident enough to pass the real test under high-stakes exam conditions.

Calibrand has developed this course on the principle that advisers don’t want expensive textbooks or a formal, costly and restrictive programme; instead, they want low-cost, readily available learning and assessment materials so they can carry on doing their job and learn as quickly and easily as possible at other times. They need accreditation not only from an academic awarding body but also from their employment sector regulator, the FCA, allowing them to practise and thrive. Many of these principles can be applied to the education sector.

Away from the office-based environment too, advances have been made to ensure a better way of measuring other vocational skills. For example, with on-screen testing, technology can allow an emphasis on non-written items such as pictures, diagrams and drag-and-drop tasks, so that language and literacy are not barriers.

Of course human input is still vital for assessing certain types of answers, but technology has allowed us to move beyond just multiple choice, and towards other mechanisms that can still be automatically marked.

In more practical and vocational subjects in schools, the use of this technology would also bring the significant benefit that teachers are saved a dramatic amount of admin and marking time.

All of the major awarding bodies have already adopted this method, specifically when assessing competency and capability rather than knowledge. Again, we see it in action in the government-owned Driver Theory Test, which is currently delivered by eAA member Pearson VUE.

To take on-screen testing even further, we are now seeing simulations being used successfully to immerse the learner in a real-life scenario and assess how they apply their knowledge in the appropriate context. This practical assessment of skills is delivered using web- and tablet-based technologies.

Simulations are already used to examine candidates for IT roles such as database engineer, and European Union customs services use them to assess staff for front-line jobs. The method is also ideal for many engineering and agricultural jobs, particularly as both industries increasingly use robots and drones.

Another area where schools can learn from work-based learning programmes, from apprenticeships to sector skills councils, is in supporting coursework. The concept of the e-Portfolio, as promoted by providers such as TAG Assessment, streamlines the vocational learning experience by allowing a student’s work to be assessed, verified, graded and given feedback on remotely by the learning provider or a third party. e-Portfolios also mean students have an up-to-date, interactive representation of their achievements as they develop their skills.

This approach is useful wherever a portfolio of evidence is needed to demonstrate practical skills or on-the-job training. The majority of colleges use e-portfolios, as do the major awarding bodies and sector skills councils, including the Construction Industry Training Board.

Finally, perhaps the biggest lesson the schools system can learn from the world of work is how to harness the idea of work-readiness. Educators need to be able to measure employability skills, and innovative, evidence-based e-assessment technologies are proving to be the best way to capture and assess evidence of student knowledge, understanding and practical ability directly relevant to the workplace.

Awarding bodies and institutions including OCR and Edinburgh University are now offering new qualifications that focus on real employability skills from ICT, numeracy and literacy to entrepreneurial and business skills that translate directly to the real world.

The key is to directly address the increasing demand from employers for the skills they need. Should schools scrap written exams altogether? Of course not – in some academic subjects they may still be entirely appropriate. But we need to find a balance. And to find that mix of 21st Century skills – the balance between knowledge and capabilities – we need scalable, reliable, robust and appropriate assessments.

Matt Wingfield, Chief Business Development Officer, Digital Assess, and Chairman of the e-Assessment Association