Author(s): Sylvia Vitello (presenting), Prerna Carroll, Jackie Greatorex, Jo Ireland

Conference: ECER 2016, Leading Education: The Distinct Contributions of Educational Research and Researchers

Network: 02. Vocational Education and Training (VETNET)

Format: Paper

Session Information

02 SES 12 A, Recent Developments in Assessment in VET

Paper Session

Time: 2016-08-26, 09:00-10:30

Room: Vet-Theatre 116

Chair: Marthe Geiben

Contribution

Employers’ Views On Assessment Design In Vocational Qualifications: A Preliminary Study


There have been recurrent debates about assessment design in vocational qualifications, both in the UK and internationally, dividing opinion on which methods of assessment should be used (e.g., portfolios) and who should mark and set them. In England, the education system is currently undergoing governmental reforms. The Department for Education (DfE) now requires vocational qualifications taken by 14- to 19-year-olds to contain a certain amount of external assessment and to involve employers in the delivery of the course and/or assessment (DfE, 2015a, 2015b).

Employers are key stakeholders in vocational qualifications; their opinions affect a candidate’s entry into and progression in the labour market (Wolf, 2011). To ensure that qualifications are valued by employers, it is important to explore current employers’ views on assessment design and examine the extent to which they align with current practices, reforms and theoretical perspectives. It is also useful to understand the extent to which their opinions converge with assessment practices internationally, as this would help to gauge the feasibility of adopting their preferred approaches.

A diverse range of assessment methods has been used in vocational qualifications in the UK. Methods have often been chosen because of specific theoretical perspectives on the construct of vocational understanding and on pedagogy. For example, performance assessment has been a dominant method in work-based qualifications (Johnson, 2008). These qualifications are rooted in a performance-based view of vocational competence, which holds that a person’s competence can be assessed exclusively through their performance on occupational tasks (e.g., Jessup, 1991). This approach to vocational qualifications has been criticised for being reductionist in its view of vocational competence and assessment methods. For example, Hodkinson (1992) argues for an interactive view of vocationally related understanding that is assessed with a range of assessment methods, including tests of performance as well as learning logs, journals and written tests. Accordingly, VET courses in many countries across Europe, the Americas and Australia include a combination of written, practical and oral assessments, although the specific tasks vary across countries and qualifications.

Opinions on assessment design may also be influenced by the distinction between skills and knowledge. Bathmaker (2013) notes that vocational education in England has focused on skills but that “there is now a growing interest amongst researchers in the question of ‘knowledge’ in vocational education” (p. 88). This shift in emphasis may have implications for the kinds of assessments that employers value.

Less research attention has focused on who should mark and set VET assessments. Currently, many vocational courses in the UK are teacher-assessed. The UK government has recently specified that a certain amount of assessment should be marked and set externally; that is, it should not be marked or set by the centre that delivers the course to the students. Internationally, VET systems employ a diverse range of markers and setters. Some countries use a single type of marker and setter for all their assessment methods (i.e., all externally or all internally assessed), whereas, in other countries, externality varies across assessment methods.

The aim of this study was to conduct a preliminary exploration of employers’ opinions on different assessment methods, markers and setters. Employers’ perceptions of assessment may influence their opinions about the qualifications, in turn affecting the value of such qualifications in the labour market. The findings should help to stimulate discussion about assessment design and about how to ensure that employers value the qualifications offered to students. Methodologically, this preliminary study also provided insight into how feasible it is to engage employers in research on vocational qualifications.


Method

A questionnaire with both closed and open-format questions was employed. The questionnaire was completed by four employers working in the Information Technology (IT) sector and three working in the Health and Social Care (HSC) sector. The employers were given assessment criteria from vocational qualifications in IT or HSC, according to their profession, and were asked to select the single best method, marker and setter for each criterion and to give reasons for their choices. The IT version of the questionnaire included 18 assessment criteria and the HSC version included 10; thus, in total, the group of IT employers made 72 judgements and the HSC employers made 30. Employers were presented with a range of methods, markers and setters to choose from, drawn from those that have been employed in vocational qualifications in the UK and internationally.

The data for each sector were analysed separately. Responses to the closed selection questions were analysed by calculating the frequency with which each method, marker and setter option was selected. Thematic analysis was used to analyse the reasons employers gave for their selections (Braun & Clarke, 2006).
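For illustration only, a minimal sketch of the frequency calculation for the closed selection questions is given below in Python; the response values are hypothetical and invented solely to show the counting step, not drawn from the study data.

    from collections import Counter

    # Hypothetical closed-question responses: the assessment method selected
    # for each (employer, assessment criterion) judgement in one sector.
    method_selections = [
        "written questions", "practical task", "written questions",
        "portfolio", "oral questioning", "written questions",
    ]

    # Frequency with which each method option was selected.
    method_frequencies = Counter(method_selections)

    for option, count in method_frequencies.most_common():
        print(f"{option}: {count}")

The same counting step would apply, in this sketch, to the marker and setter selections for each sector.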


Expected Outcomes

Each IT employer chose a range of different methods. The HSC employers were more selective in their choices, although this could be due to the sample of assessment criteria. Many employers, especially in IT, showed a strong appreciation for methods typically associated with assessing theoretical knowledge, such as written questions. Employers gave a variety of reasons for their choices of method, commenting, for example, on the content of the assessment criteria, learning outcomes, student needs and validity issues.

Employers preferred a more limited range of markers and setters, and their preference for internal versus external options varied across sectors. Each IT employer chose a combination of internal and external markers and setters. The most popular markers were company employees (internal and external) and occupationally experienced teachers, which suggests that occupational experience may be particularly important to these employers. The most popular setters were experts from an awarding body and teachers.

The HSC employers almost always chose internal markers, specifically internal company employees or teachers, but preferred setters to be experts from an awarding body or teachers. This suggests that externality may be more important for the setting process.

The employers, again, commented on a variety of factors when justifying their choice of marker and setter, including their perceived reliability and validity, personal characteristics, the assessment criteria and student learning needs.

Together, the findings showed that employers valued a range of assessment methods, wanted occupational experience especially in markers, and had somewhat different preferences for markers and setters. The diverse range of reasons given suggests that these employers have a multi-faceted view of assessments and their advantages. Although these employers provided insightful responses, it proved difficult to recruit a larger sample, which would be necessary to understand how representative their views are. Further research is needed to understand the barriers preventing employer engagement in research.


References

Bathmaker, A.-M. (2013). Defining ‘knowledge’ in vocational education qualifications in England: An analysis of key stakeholders and their constructions of knowledge, purposes and content. Journal of Vocational Education & Training, 65(1), 87-107.

Braun, V., & Clarke, V. (2006). Using thematic analysis in psychology. Qualitative Research in Psychology, 3(2), 77-101.

DfE. (2015a). Technical awards for 14 to 16 year olds. 2017 and 2018 performance tables: technical guidance for awarding organisations. England, United Kingdom: Department for Education.

DfE. (2015b). Vocational qualifications for 16 to 19 year olds. 2017 and 2018 performance tables: technical guidance for awarding organisations. England, United Kingdom: Department for Education.

Hodkinson, P. (1992). Alternative models of competence in vocational education and training. Journal of Further and Higher Education, 16(2), 30-39.

Jessup, G. (1991). Outcomes: NVQs and the Emerging Model of Education and Training. London: Falmer Press.

Johnson, M. (2008). Assessing at the borderline: Judging a vocationally related portfolio holistically. Issues in Educational Research, 18(1), 26-43.

Wolf, A. (2011). Review of vocational education: The Wolf report. London: Department for Education and Department for Business, Innovation & Skills.


Author Information

Sylvia Vitello (presenting)
Cambridge Assessment, Cambridge, United Kingdom
Prerna Carroll
Cambridge Assessment, Cambridge, United Kingdom
Jackie Greatorex
Cambridge Assessment, Cambridge, United Kingdom
Jo Ireland
Cambridge Assessment, Cambridge, United Kingdom