Here we define the key characteristics of soft skills assessment design and assessment exercise design.

Soft Skills Assessment Exercise Design

  • Individual assessment exercise design
    • Group Exercise design
    • In-tray Exercise design
    • Fact Find Exercise design
    • Role Play design / Interactive Exercise design
  • Competency-based Interviews and Interview Guide Design
  • Development Centre design and development exercise design

Competency Assessment Design

An effective competency framework is crucial to the operation of many HR practices.

Shown below is an example of a typical competency framework used in an assessment centre. In this case, it shows the group exercise’s competency component of the overall assessment centre competency matrix:

– Oral Communication

How clearly and confidently the individual communicates with the group.

– Planning and Organising

– Judgement and Decision-Making

How logically the individual makes their decisions and judges other participants’ input/comments.

– Analysis and Problem-Solving

How effectively the individual analyses the scenario’s issues and the solutions proposed.

– Finding Solutions

The number and effectiveness of the ideas generated.

– Teamworking

How well the individual works with and encourages the other group exercise participants.

Assessment Centre Exercise Reliability

The reliability of any exercise depends upon many factors, in particular the…

  • Quality of the competency framework
  • Use of experienced and well-briefed / well-trained assessors

Role Play Design

Role play design is based on assessing those skills, knowledge and behavioural competencies required for performing the role effectively. You need to know key facts like this as part of your role play preparation.

Role plays simulate how a candidate interacts with a customer, manager or colleague.

Typically, in a managerial assessment centre or development centre, the candidate will be placed in a simulated managerial role matching the one for which they are applying. The simulation’s company will similarly parallel the type of company and sector to which they have applied.

Simulations typically involve interactions with colleagues, peers and / or senior managers. For more specialist roles the role play design can be even more specific.

In-Tray Exercise Design

In-trays simulate the (administrative) day-to-day issues someone in the assessed role would face. The candidate must offer solutions for as many items as possible; a typical in-tray will also instruct the candidate to prioritise the full set of issues and their associated actions.


In-Tray Design – Pros and Cons

  • Tricky to develop
  • Take two to three times as long as other assessment exercises to develop
  • Require a mixture of job-relevant items to be supplied by one or two job incumbents
  • High face validity, with a cross-section of role-specific issues raised
  • Provide some timetable flexibility, since in-tray exercises can be scheduled for individual candidates to complete, or for two or more at a time
  • Can assess a range of different job-specific competencies, particularly organisational and quality-focused competencies. For example:
    • Problem analysis and solving skills
    • Time management and prioritisation
    • Delegation skills and decisiveness
    • Concern for quality and equality, and for customers
    • Responsiveness
    • Leadership style and motivational skills
    • Skill in the officer/Member interface
    • Performance management
    • Technical knowledge
    • Coalition or partnership building skills
    • Written communication
    • Planning and organising
    • Strategic thinking and breadth

Checklist for Designing the In-tray Exercise

  • Have clear assessment criteria.
  • Each item must be based on an existing work-related issue.
  • Clear and concise candidate instructions are required. Here are the key points to highlight in the in-tray instructions:
    • Candidate’s assumed name for the purpose of the exercise (the addressee on all correspondence);
    • Candidate’s assumed job title (usually the job being applied for);
    • An explanation of the contents of the in-tray exercise (as consisting of memoranda, letters, reports, e-mails and other documents);
    • The specific time and date, i.e. the in-tray exercise context.

Group Exercise Design

Group exercises typically involve a group of four to six candidates given group or individual exercise briefs.

Fact Find Exercise Design

The fact find exercise used in assessment centres for specialist roles is an excellent means of assessing analysis and decision-making skills.

Fact-find exercises aren’t as complicated as they sound. Imagine how a journalist probes a Government Minister, asking a series of questions to investigate the facts behind a story. In a fact-find exercise the candidate must similarly ask a series of questions to establish the facts about a situation.

Fact Find Exercise Example

It will be clear that information is missing from the scenario. Also, there will be no clear correct or incorrect response. In fact, it is the Resource Person who holds all the information and who is willing to provide you with answers… as long as you ask the right questions.

The task is usually broken down, with a few minutes allocated to each of several exercise phases, such as:

  1. Create a list of questions to “fill in the blanks”.
  2. Direct each of your questions in turn at the Resource Person. Whilst they will have background information on the incident scenario, they will not be able to advise you on how you should respond.
  3. Review the situation again in the light of this additional information.
  4. Consider how you would respond.
  5. Present your recommendation, along with the reasoning backing up your decisions.

Candidate preparation for taking any assessment

  1. Preparation may be seen as the responsibility of the assessment provider as well as the candidate. Assessment guides may be used to alleviate candidate anxiety, giving advice such as ensuring that they get a good night’s rest.
  2. Candidates who have not encountered verbal assessments before can practise by doing crosswords and other word puzzles. Preparation for a numerical reasoning test could also be done using mathematical puzzles.
  3. Although assessment practice can calm nerves and increase confidence, it is unlikely to significantly improve the final test scores.

The best preparation is really to have done similar types of test before. One major issue is time – learn to work while staying highly aware of how long each question takes, as this really improves test scores.

CHARACTERISTICS OF ASSESSMENT CENTRE DESIGN

The typical characteristics of an assessment centre are:

  • Matrix-based assessment
  • Assessment based on set criteria
  • Multiple assessments of individuals
  • Several assessors involved
  • Information gained is integrated

Assessment Centre Design – Multi-Method Assessment

Assessments are based on a variety of different techniques, ranging from simulation exercises and tests to interviews and self-reports. This helps to ensure full coverage in measuring the relevant attributes, behaviours and skills, and in turn increases the reliability of assessment.

Based On Set Criteria

The centre should always be based on clearly identifiable and measurable criteria or dimensions (Competencies). These characterise the behaviours that lead to successful performance in the workplace.

Multiple Assessments of Individuals

Multiple Assessors Involved

Through the use of multiple assessors, it is possible to ensure that personal bias does not interfere with the assessment of any one individual. This helps to increase the objectivity of assessment, with each individual being assessed by as many different assessors as possible.

Information Is Integrated

Decisions and recommendations resulting from centres should be based upon all of the data gained, thoroughly discussed and debated in an integration session.

THE HISTORY OF ASSESSMENT CENTRE DESIGN

In Germany and Britain, officer selection was based on the qualities required for soldiers in conjunction with tests of intelligence. 

In 1942 War Office Selection Boards (WOSB) were set up, using a mixture of psychiatrists, psychologists and military officers to assess candidates’ performance against criteria set for officers. These assessments, which included group discussions, short lectures, obstacle courses and leaderless group tasks, were the precursor of the Assessment Centre as we know it today.

British Civil Service

Following the success of the work of the WOSB, the Civil Service Selection Board (CSSB) was created. Validation of the tests proved them to be extremely good predictors of job performance, leading to their introduction in parallel with the traditional examination and interview after 1958.

The AT&T Development Centre

In 1956 the American Telephone and Telegraph Company (AT&T) set up a Management Progress Study. The study was aimed at identifying those variables which led to success at AT&T. Because the study was carried out purely for research purposes, the data was stored away and not used for its usual purpose of selection and promotion. The study lasted over 25 years and is renowned as one of the most significant validation studies carried out in the area. Results appeared to indicate a strong link between performance in the Centre and performance in the workplace over a substantial period of time. The AT&T Company used a designated building to carry out the assessments.

The Development of Assessment Centre Design

The 1970s saw rapid growth in the use of Assessment/Development Centres in the United States. Estimates of the extent of usage now vary, from figures of 300 to 2,000 companies.

COMPETENCY-BASED ASSESSMENT

COMPETENCIES – THE BUILDING BLOCKS OF ASSESSMENT

This section aims to provide an overview of competencies: how they relate to Assessment Centres and where they come from.

Competencies are a cluster of behaviours that have been identified as integral to highly effective performance in any given role.  

‘…clusters of behaviour that are specific, observable and verifiable are assessed in assessment centres’

An effective competency framework is:

  • Observable and measurable – the behaviours incorporated in the competency framework must be clearly visible and quantifiable.
  • Discrete – there should be no overlap between competencies.
  • Supported by behavioural indicators – using both positive and negative indicators of performance.
  • Business driven – the framework must be related to the business needs of the organisation in order to promote behaviours that will drive the business forward.
  • Future-proofed



METHODS OF ASSESSMENT CENTRE DESIGN

To effectively measure the competency framework developed for a role, assessment/development centres employ a wide range of exercises.  Some assessment methods are individual tasks. Others involve the interaction of the candidate/participant with either the assessors or other candidates in the centre.

Interactive Exercises

Group Exercises

  • Assigned role exercises – each participant is given a particular role to play in solving the issue presented to them.
  • Unassigned exercises – participants all use the same brief and work as a team to solve the particular task/issue presented to them.

Presentations

Role Plays

THE ORCE MODEL

Assessors must adopt a consistent and systematic approach to the assessment of the exercises. The ORCE model breaks this down into four stages:

  • Observation

  • Recording

  • Classification

  • Evaluation


STAGE ONE – OBSERVATION

As an assessor you must stay alert at all times and concentrate on actively observing the candidate(s) assigned to you.

STAGE TWO – RECORDING

Whilst it is important that you are observing the candidate’s behaviour in any simulation exercise, it is critical that you clearly record everything you see. It is this record of behaviour which will form the basis of your assessment and enable you to ‘rate’ an individual’s performance on the exercise.

It is critical that, whilst you are observing and recording information from an exercise, you withhold judgement of the behaviour seen. You must only take notes on what you have observed and attach no value judgements to what you have seen. For example, during the one-to-one exercise with the assessor you may observe that the candidate consistently talked over you; to suggest that this upset you as the assessor would be an evaluation of the situation.

STAGE THREE – CLASSIFICATION

Having observed and effectively recorded the information presented in an exercise, the next task is to classify that information appropriately against the relevant competencies.

For example, in a presentation exercise you might record:

‘Jill spoke clearly and concisely throughout the presentation and on several occasions checked the understanding of the observer by summarising and restating what had been said. Her proposal did not appear to have taken all the relevant factors into consideration, making several assumptions without checking….’

STAGE FOUR – EVALUATION

This stage is about weighing up the information gathered, enabling you to make an effective and accurate assessment.
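
Purely as an illustration of the ORCE flow above (the class names, keyword mappings and placeholder rating logic below are invented for this sketch, not part of any assessor toolkit), the behaviour record can be pictured as a simple data model: observations are recorded verbatim, classified against competencies, and only evaluated at the final stage.

```python
# Illustrative sketch only: a minimal data model for the ORCE flow.
# Class names, keywords and the rating rule are hypothetical.
from dataclasses import dataclass
from typing import Optional

@dataclass
class ObservationNote:
    text: str                         # verbatim record of what was seen/heard (Observe + Record)
    competency: Optional[str] = None  # filled in later, at the Classification stage

def classify(notes, competency_keywords):
    """Classification: tag each recorded note with the competency it evidences."""
    for note in notes:
        for competency, keywords in competency_keywords.items():
            if any(k in note.text.lower() for k in keywords):
                note.competency = competency
                break
    return notes

def evaluate(notes, competency):
    """Evaluation: only now is a judgement (rating) made, based on the classified evidence."""
    evidence = [n for n in notes if n.competency == competency]
    return min(5, 2 + len(evidence))  # placeholder rating logic, for illustration only

# Example usage with hypothetical observation notes
notes = [
    ObservationNote("Summarised the group's progress and checked understanding"),
    ObservationNote("Interrupted two participants before they had finished speaking"),
]
notes = classify(notes, {
    "Oral Communication": ["summarised", "checked understanding"],
    "Teamworking": ["interrupted", "encouraged"],
})
print(evaluate(notes, "Oral Communication"))  # 3 with the placeholder logic above
```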


USEFUL ANALOGIES

The Paint Pot Analogy

For some competencies, you will see either a presence or an absence of behaviour. It is helpful to think of an empty paint pot. The pot represents the amount of opportunity the simulation offers to show a competency such as creativity. The amount of paint in the pot represents the amount of behaviour you observed against the opportunities offered.

The Scales Analogy

Here evaluation is based on a process of weighing up the positive behaviours versus the negative behaviours to see the balance achieved.

As with the paint pot model, there is a degree of judgement involved in this approach. 

THE RATING SCALE

You should write a summary of the evidence gained for each competency using the summary forms provided, and assign a rating to each competency assessed. There is a range of different types of rating scale.

PITFALLS TO AVOID

  • Just like me: Assessors often fall into the trap of positively rating candidates who are “like” them in a subjective way. Try to assess the described competencies – not your own subjective criteria.



THE INTEGRATION PROCESS

INTEGRATION MEETING

  • Before the integration meeting, or ‘wash-up’, you will have observed and rated behaviour independently across a range of exercises.
  • The purpose of the meeting is to provide an opportunity to share these ratings and the evidence gathered with the other assessors.
  • The meeting allows the group of assessors to come to a common, agreed view of the candidate’s performance in the centre.
  • The integration sessions comprise each pair of assessors, with the centre manager on hand.

COMMON ISSUES ARISING FROM AN INTEGRATION MEETING

  • The assessors who marked the candidate on the same competency in another exercise may also have to read out their ratings and evidence.
  • You will need to listen carefully to this, discuss any discrepancies you see, and question each other with regard to the evidence provided.

“Assessment Centre Mathematics”

Remember that these numbers simply reflect behavioural evidence. It may be that one assessor gives a candidate a rating of 3 for a particular competency whilst another gives them a 4. Remember that different exercises may provide differing opportunities to display certain behavioural indicators for a competency. Always ensure that you go back over the evidence in detail to see the extent to which the candidate has managed to cover the competency definition and indicators across the different exercises.

Conflicting Data & ‘Primary/Secondary’ Evidence:

Sometimes you may find that different exercises provide you with completely conflicting evidence. When this is the case, make sure you not only check the extent to which the competency definition has been met, but also check which exercise provides the primary source of evidence. The general rule is that, when data is conflicting, the overall rating should follow that of the primary source of evidence.
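
As an illustrative sketch only (the exercise names, ratings and the idea of designating one exercise per competency as the primary source are assumptions for this example, not a prescribed scoring formula), the snippet below applies the rule described above: where ratings for a competency conflict, the rating from the primary-evidence exercise is carried forward.

```python
# Illustrative only: reconciling conflicting competency ratings using the
# "primary source of evidence" rule. All names and values are hypothetical.

def reconcile(ratings, primary_exercise):
    """ratings: dict of exercise name -> rating for one competency."""
    if len(set(ratings.values())) == 1:
        return ratings[primary_exercise], "ratings agree"
    # Conflicting data: follow the primary source of evidence
    return ratings[primary_exercise], f"conflict resolved using {primary_exercise}"

# Hypothetical example: 'Judgement and Decision-Making' rated in two exercises
ratings = {"Group Exercise": 3, "Fact Find": 4}
overall, note = reconcile(ratings, primary_exercise="Fact Find")
print(overall, "-", note)  # 4 - conflict resolved using Fact Find
```

In practice the evidence itself is still reviewed in the wash-up; the rule only determines which rating the overall judgement should lean on when the evidence genuinely conflicts.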


Rating Not Matching the Evidence

It is important that assessors are open to discussing and debating the ratings they have given to the candidates they have assessed. Final ratings should be based upon a consensus among the group of assessors.

THE INTEGRATION MATRIX

The integration matrix is similar to the assessment matrix, but is designed for just one candidate/participant.
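
For illustration only (the competencies and exercises below are taken from earlier examples in this document, and the ratings are invented for the sketch), a per-candidate integration matrix can be pictured as one row per competency with a rating cell per exercise; the agreed overall rating for each row is decided in the wash-up rather than by formula.

```python
# Illustrative only: a per-candidate integration matrix as a nested mapping.
# Competency and exercise names come from examples earlier in this document;
# the ratings themselves are invented for the sketch.
integration_matrix = {
    "Oral Communication":            {"Group Exercise": 4, "Role Play": 3, "Presentation": 4},
    "Judgement and Decision-Making": {"Group Exercise": 3, "Fact Find": 4},
    "Planning and Organising":       {"In-Tray": 4},
}

# Each row (competency) is reviewed at the wash-up, where assessors agree a final rating.
for competency, ratings_by_exercise in integration_matrix.items():
    print(f"{competency}: {ratings_by_exercise}")
```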

CENTRE MANAGEMENT

In this section, we look at the practical issues in running your assessment centres.


TIMETABLE

You will need to take account of a number of factors when deciding upon a timetable:

  • Availability of assessors and candidates – What time commitments are plausible for these groups of people?
  • Numbers – We recommend that you design centres to run with either 6 or 12 candidates, and assume a candidate-to-assessor ratio of 2:1 (see the sketch after this list).
  • Venue – What rooms are available? What is the layout of these rooms? Remember that the assessors will need a ‘bunker’ room for write-ups etc.
  • Exercises – How long are the exercises? Remember to include administration and scoring time in the timetable.

MATERIALS FOR THE CENTRE

The assessor manual should include copies of all the exercises, but can also form the course notes for assessor training. It might also include some of the areas covered within this document.

In addition, the following materials will need to be made available for the assessment centre:

  • Exercise booklets for candidates and assessors
  • Observation forms, scoring guidelines and summary sheets
  • Any test materials
  • Pens, pencils, erasers, rough paper, calculators and stop watches.
  • Candidate Folders (1 per candidate)
  • Name plates / badges

ASSESSOR POOL

The assessor pool should ideally comprise line managers from a range of functions, together with professional assessors. In addition, some assessors and centre administrators may be HR specialists.

Assessor training is crucial, and workshops typically last between 1 and 2 days. 


CANDIDATE BRIEFING

All candidates need to know what to expect from an assessment centre.  The briefing should be friendly and clear.

Essentially, the briefing should cover a number of key points:

  • What is an assessment centre?
  • Why are they used?
  • What will happen on the day?
  • Do assessment centres actually work?
  • How can I prepare?
  • What feedback will I receive and what happens to my data?

QUALITY ASSURANCE

The Centre Manager will have received instructions to monitor the quality of the assessors’ work, and will offer assistance and immediate feedback as necessary.

Time is critical and every minute must be used effectively, as there is no scope for falling behind on your assessor responsibilities for assessment, rating and summary comments.

An Associate Fellow of the British Psychological Society, Rob Williams is a Chartered Psychologist with over 25 years of experience in designing tests. He is also the author of five psychometric test design books and has worked for leading global psychometric test publishers, including SHL, Kenexa IBM, MBTI, CAPP and SOVA Assessment.
