Skills Foresight Report for Hair and Beauty Published


Following the successful launch of the Skills Foresight Report for Hair and Beauty at the House of Commons on 19th July, the report has now been released and a copy can be found here.

Key findings include:

  • Roles are becoming much more technical, requiring greater knowledge and skill, and salons are offering more holistic services that provide a whole new customer experience. Micro-needling and chemical peels are two examples of qualifications that reach degree-level standard.
  • There is demand for greater regulation to raise standards in the industry, both via Government-enforced measures such as the Skills Plan and the Apprenticeship Levy, and through industry-led initiatives such as a mandatory register for hairdressers and barbers.
  • Career routes into and through the sector are not confined to traditional job roles. New and interesting routes have opened, including pathways into careers such as aesthetic nursing, non-surgical beauty technician and trichologist, a variety of roles supporting rehabilitation and palliative care, and dedicated film industry roles such as cosmetologists and semi-permanent makeup artists.

Launched at the same event was the Hair Council report on the Case for Mandatory Registration for Hairdressing and Barbering, a copy of which can be found here.



Universities worldwide join Digital Assess to pioneer CompareAssess, a new formative assessment method

UK-based education technology company Digital Assess has partnered with universities on three continents* to trailblaze a formative assessment method that has been proven to improve student satisfaction and boost attainment by as much as 14%.

Digital Assess’ online assessment portal CompareAssess harnesses the assessment method known as Adaptive Comparative Judgement (ACJ), and enables students to receive fast, meaningful feedback through a hassle-free peer review system.

Adaptive Comparative Judgement, which was developed by Digital Assess alongside leading academics Professor Richard Kimbell (Goldsmiths, University of London) and Dr Alastair Pollitt (Cambridge), is based on the Law of Comparative Judgement, which shows that people are better at making comparative, paired judgements than absolute ones.

Professor Richard Kimbell explained, “It is easy to compare the temperature of two rooms and declare which is warmer, but it is much more difficult to say with absolute certainty what the precise temperature is in each.”

He went on to say, “Through an anonymous peer review system, CompareAssess engages students in collaborative critique and therefore develops critical thinking. This method helps students to recognise what areas they can improve, and how. Understanding the assessment process and how others perceive their work subsequently influences students’ approach to future assignments, and university academics in the UK have reported an improvement of up to 14% in attainment, especially with difficult-to-influence, middle-ability students.”

CompareAssess, which is accessible to schools, colleges and universities worldwide, enables students to compare one piece of work to another side by side and choose which better meets the desired learning outcomes. Rather than handing students isolated feedback against a complex mark scheme, institutions can opt to make the assessment process more transparent: students come to understand what a good, stand-out piece of work actually looks like, and there is less room for ambiguity.
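The basic mechanic can be illustrated with a short sketch. This is not Digital Assess’s published algorithm, just a minimal, hypothetical illustration of how repeated side-by-side judgements (“which of these two is better?”) can be turned into a ranked ordering; the `judge` callback stands in for a student’s paired decision.

```python
def rank_by_paired_judgements(items, judge, rounds=None):
    """Turn repeated side-by-side judgements into a ranked list.

    `judge(a, b)` returns whichever of the two pieces of work the
    reviewer considers better.  Each round compares neighbouring items
    in the current ordering and promotes the preferred one, so the
    ordering gradually settles into a spectrum of quality, best first.
    """
    order = list(items)
    rounds = rounds or len(order)          # enough passes to settle fully
    for _ in range(rounds):
        for i in range(len(order) - 1):
            a, b = order[i], order[i + 1]
            if judge(a, b) == b:           # reviewer prefers the second piece
                order[i], order[i + 1] = b, a   # promote it one place
    return order

# With a numeric stand-in for "quality", the ranking recovers the order:
ranking = rank_by_paired_judgements([62, 88, 45, 71],
                                    judge=lambda a, b: max(a, b))
# → [88, 71, 62, 45]
```

In practice each individual judgement is easy, which is the whole point: no single reviewer ever has to assign an absolute mark, yet the accumulated comparisons still yield a full ranking.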

The efficiency and speed with which students receive feedback is also increased. Students can access feedback almost instantly, eliminating the lengthy waiting periods which the National Student Survey shows are one of the main causes of university student dissatisfaction. The quality of feedback is improved too: it incorporates a range of perspectives rather than subjecting work to the personal bias of a single marker.

By recently embedding the ACJ tool into an easily accessible online portal, Digital Assess has ensured that anyone, anywhere, can use it and benefit from the innovative assessment approach it offers.

Matt Wingfield, Chief Business Development Officer at Digital Assess, commented:

“Universities around the world are already using CompareAssess to transform their assessment process and improve the way their students receive and engage with feedback. It enables students to truly understand assessment mentality and how this is applied to their own work. The ability of CompareAssess to improve student satisfaction means that not only is it pedagogically sound, but it also makes good business sense in terms of retaining and attracting new students. In developing this new, user-friendly portal, we have now made it much easier for anyone to utilise the ACJ tool. We hope that many more institutions worldwide will follow the lead of the pioneers and use the tool as their formative assessment method of choice.”

Although designed with assessment in mind, CompareAssess is not limited to peer-review contexts and can be applied to a range of other disciplines and scenarios that involve making a comparative decision. For example, the ACJ tool was used to develop a free online resource, Classical 100, which was used in primary schools to encourage young learners to engage with classical music. Developed by ABRSM in partnership with Classic FM and Decca Classics, Classical 100 used the ACJ tool’s iterative, adaptive algorithm to judge and rank the music according to its suitability for classroom scenarios.

*The international trailblazer group pioneering CompareAssess – the Adaptive Comparative Judgement Tool includes the University of Edinburgh, the University of Manchester, Purdue University (USA), Edith Cowan University (Australia) and the University of South Australia.

Original article can be found here.


Comparative Tech and Meaningful Assessment

For students, the process of assessment should always be an integral and valuable part of the learning experience. Receiving feedback that is both informative and meaningful enables students to consistently improve their work. Yet for many, the traditional assessment model fails to meet these objectives.

The problem is entrenched within the very language of assessment. Students often find themselves struggling to interpret their feedback according to an abstract mark scheme written in inaccessible ‘marker speak’, which is often not directly applicable to the work they have produced. It gives them no clear indication of what specifically needs to be improved and how, or where they sit in relation to their peers. In short, the assessment is often ‘done to them’ rather than the student being an active participant in the assessment process.

None the wiser

Instead of assessment being a helpful process, students are often simply made aware that their work hasn’t met certain criteria, while remaining none the wiser as to what a ‘good’ piece of work actually looks like. If a student cannot decode the mark scheme and see how it relates to their work, which frankly requires substantial effort on their part, they will never improve. How can they do better next time if they don’t engage with or respond to the feedback given?

The assessment process doesn’t have to be this way. An intuitive, scalable, digitised approach, which incorporates formative assessment software based on Adaptive Comparative Judgement (ACJ), has the power to truly engage learners and transform the entire assessment process. It has also been proven to significantly increase student satisfaction and attainment; the latter by as much as 14%.

Developed by academics from the University of Cambridge and Goldsmiths, University of London, the ACJ approach is based on the Law of Comparative Judgement, first devised in the 1920s, which shows that people are better at making comparative, paired judgements than absolute ones. For example, it is easy to compare the temperature of two rooms and say which is warmer, but much more difficult to say with absolute certainty what the precise temperature is in each.

World of comparisons

Applying the same logic to assessment, learners should be able to log in to a digital platform, look at two pieces of work side by side and identify for themselves which better meets the desired learning outcomes. The software then generates repeated rounds of comparisons using an iterative, adaptive algorithm, enabling students to rank the work and recognise a spectrum of quality. Instead of receiving their own feedback in isolation, students see much more clearly exactly how they are performing and what precise improvements could be made. You can hear a student’s perspective by watching this short student interview here.
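One way to picture what “iterative and adaptive” can mean here is the sketch below. It is an assumption for illustration, not CompareAssess’s actual implementation: the neighbour-pairing rule and the Elo-style rating update are both stand-ins for whatever model the real software uses. The adaptive step is that each round pairs pieces of work whose current ratings are closest, so later comparisons concentrate on work that earlier rounds could not separate.

```python
import math

def adaptive_judgement_rank(items, judge, rounds=20, k=1.0):
    """Rank distinct, hashable items via adaptive rounds of paired judgements.

    Hypothetical sketch: ratings start equal; each round sorts items by
    current rating and pairs neighbours (the adaptive step), then nudges
    ratings with an Elo-style update based on which piece the `judge`
    callback preferred.  `k` is the update step size.
    """
    rating = {item: 0.0 for item in items}
    for _ in range(rounds):
        ordered = sorted(items, key=rating.get)   # pair closest-rated work
        for a, b in zip(ordered[::2], ordered[1::2]):
            # Logistic expectation: how likely `a` was to win given ratings.
            expected_a = 1 / (1 + math.exp(rating[b] - rating[a]))
            outcome_a = 1.0 if judge(a, b) == a else 0.0
            delta = k * (outcome_a - expected_a)  # surprises move ratings more
            rating[a] += delta
            rating[b] -= delta
    return sorted(items, key=rating.get, reverse=True)  # best first
```

With a judge that reliably prefers the stronger piece, the strongest and weakest work settle at the ends of the ranking within a few rounds, while the middle is refined by exactly the comparisons that remain informative.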

As such, educational institutions should seek to embrace anonymous online peer review systems like the one mentioned here, in which students are actively able to participate in the assessment process. Engaging students in collaborative critique makes the logic of assessment much more transparent, and a clear understanding of how their work is perceived by others will undoubtedly influence the way they tackle future assignments.

Numerous institutions worldwide are already trailblazing the ACJ approach through this innovative software, but it has yet to be used to its full potential. If the technology exists to make the assessment process clearer and more accessible, and to generate more meaningful feedback, why not embrace its widespread application?

Matt Wingfield, Chief Business Development Officer

Link to the original story can be found here.