In this section we aim to define the key characteristics of Assessment Centre design.  The one constant that appears throughout descriptions of Assessment Centres is the standardised evaluation of behaviour based upon multiple inputs to the process.


The typical contents of an assessment centre are:

Multi-method assessment

Assessment based on set criteria

Multiple assessment of individuals

Several assessors are involved

Information gained is integrated

Assessment Centre Design – Multi Method Assessment

Assessments are based on a variety of different techniques, ranging from simulation exercises and tests to interviews and self-reports.  This helps to ensure full coverage in measuring the attributes, behaviours and skills required, and in turn increases the reliability of assessment.

Based On Set Criteria

The centre should always be based upon clearly identifiable and measurable criteria or dimensions (competencies).  These characterise the behaviours that lead to successful performance in the workplace.

Multiple Assessments of Individuals

Multiple Assessors Involved

Through the use of multiple assessors it is possible to ensure that personal bias does not interfere with the assessment of any one individual.  This increases the objectivity of assessment, with each individual being assessed by as many different assessors as possible.

Information Is Integrated

Decisions and recommendations resulting from centres should be based upon the full set of data gained, thoroughly discussed and debated in an integration session.


In Germany and Britain, officer selection was based on the qualities required of soldiers, in conjunction with tests of intelligence.

In 1942, War Office Selection Boards (WOSB) were set up, using a mixture of psychiatrists, psychologists and military officers to assess candidates’ performance against criteria set for officers.  These assessments, which included group discussions, short lectures, obstacle courses and leaderless group tasks, were the precursor of the Assessment Centre as we know it today.

British Civil Service

Following the success of the work of the WOSB, the Civil Service Selection Board (CSSB) was created.  Validation of the tests proved them to be extremely good predictors of job performance, leading to their introduction in parallel with the traditional examination and interview after 1958.

The AT&T Development Centre

In 1956 the American Telephone and Telegraph Company (AT&T) set up a Management Progress Study, aimed at identifying the variables which led to success at AT&T.  The data was stored away and not used for its usual purpose of selection and promotion.  The study lasted over 25 years and is renowned as one of the most significant validation studies carried out in the area.  Results appeared to indicate a strong link between performance in the Centre and performance in the workplace over a substantial period of time.  AT&T used a designated building to carry out the assessments.

The Development of Assessment Centre Design

The 1970s saw a rapid growth in the use of Assessment/Development Centres in the United States.  Estimates of the extent of usage vary from 300 to 2,000 companies.



This section aims to provide an overview of competencies: how they relate to Assessment Centres and where they come from.

Competencies are a cluster of behaviours that have been identified as integral to highly effective performance in any given role.  

‘…clusters of behaviour that are specific, observable and verifiable are assessed in assessment centres’

Dulewicz (1989)

An effective competency framework is:

  • Observable & Measurable – the behaviours incorporated in the competency framework must be clearly visible and quantifiable.

  • Discrete – there should be no overlap between competencies.  Specific behaviours should only be linked to one competency heading, to avoid overlap, double counting and subsequent confusion in assessing an individual’s performance.

  • Supported by behavioural indicators – using both positive and negative indicators of performance.

  • Business driven – the framework must be related to the business needs of the organisation in order to promote behaviours that will drive the business forward.

  • Future proofed – to ensure a competency framework is not simply a snapshot of the current behaviours required to carry out the job, efforts should be made to ensure it reflects the likely future requirements of the job.


To effectively measure the competency framework developed for a role, assessment/development centres employ a wide range of exercises.  These methods of assessment can broadly be split into two distinct types: some are individual tasks, whilst others involve interaction between the candidate/participant and either the assessors or other candidates in the centre.

Interactive Exercises

  • Group Exercises – these involve a group of candidates/participants being given an issue or task to deal with whilst being observed.  In assigned role exercises, individuals in the group are given particular roles to play in solving the issue presented to them.  In unassigned exercises, each individual is given the same brief and they work as a team to solve the particular task/issue presented to them.
  • Presentations – On some occasions, candidates/participants are required to make presentations to the assessors.
  • Role Plays – These allow the assessor to observe how the candidate performs in a pre-determined situation that has been designed to allow them to exhibit the particular competencies being assessed.

Individual Exercises

  • In Trays – These simulate the typical in-basket of a manager, typically including memos, letters and other documentation.  It is the task of the candidate to analyse the information and decide upon the courses of action they wish to take in dealing with the issues that arise in the in tray file.
  • Aptitude/Ability Tests – These are often used in assessment/development centres to give a measure of an individual’s ability to interpret or reason with various types of information.
  • Questionnaires – Personality questionnaires are often used in centres as they can provide a lot of information, in a short period of time, about an individual’s preferred way of behaving in the workplace.  They are not concerned with an individual’s ability to carry out a task but provide an indication of how they would go about dealing with it.  These are only used as a secondary source of information in a centre.




Assessors must adopt a consistent, systematic approach to the assessment of the exercises.  This section focuses on the key steps in behavioural evaluation, which can be remembered using the acronym ORCE.  The key stages are detailed below:

  • Observation

  • Recording

  • Classification

  • Evaluation
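Purely as an illustrative sketch (the class, function and field names below are invented for this example, not part of any centre's materials), the ORCE discipline can be modelled as a small data flow in which a piece of behaviour must be recorded and classified before it can be evaluated:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class BehaviouralRecord:
    """One piece of recorded behaviour, kept free of judgement."""
    exercise: str
    note: str                          # verbatim record of what was seen or heard
    competency: Optional[str] = None   # filled in at the Classification stage
    rating: Optional[int] = None       # filled in only at the Evaluation stage

def classify(record: BehaviouralRecord, competency: str) -> BehaviouralRecord:
    # Classification: attach the observed behaviour to one competency.
    record.competency = competency
    return record

def evaluate(record: BehaviouralRecord, rating: int) -> BehaviouralRecord:
    # Evaluation must come last: refuse to rate unclassified evidence.
    if record.competency is None:
        raise ValueError("Classify the behaviour before evaluating it")
    record.rating = rating
    return record

# Observe and Record first, then Classify, then Evaluate.
rec = BehaviouralRecord("Presentation", "Summarised and restated the brief")
rec = classify(rec, "Communication")
rec = evaluate(rec, 4)
```

The point of the sketch is the ordering: note-taking is separated from judgement, just as the Recording stage below insists.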


Observation

Verbal behaviour can be described as ‘what the participant actually says’. Assessors must stay alert at all times and concentrate on actively observing the candidate(s) assigned to them.


Recording

Whilst it is important that you observe the candidates’ behaviour in any simulation exercise, it is critical that you record everything you see clearly.  It is this record of behaviour which will form the basis of your assessment and enable you to ‘rate’ an individual’s performance on the exercise.

It is critical that whilst you are observing and recording information from an exercise you withhold judgement of the behaviour seen.  You must only take notes on what you have observed and attach no value judgements to what you have seen.  For example, during a one-to-one exercise you may observe that the candidate consistently talked over you; to suggest that this upset you as the assessor would be an evaluation of the situation.


Classification

Having observed and effectively recorded the information presented in an exercise, the next task is to classify the information appropriately against the relevant competencies.

In a presentation exercise:

‘Jill spoke clearly and concisely throughout the presentation and on several occasions checked the understanding of the observer by summarising and restating what had been said.  Her proposal did not appear to have taken account of all the relevant factors, making several assumptions without checking…’


Evaluation

Having observed, recorded and classified the evidence gathered from an exercise, the final stage for the assessor is to evaluate the information and assign a rating for each of the competencies being measured.

It is important to consider the strength and quality of the information gained.  Behavioural evidence can be weighted by underlining strong evidence of a particular behaviour or bracketing () weak behavioural examples.  This will help you distinguish between the information gathered and enable you to make an effective, accurate assessment.
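The underline/bracket convention can be thought of as a simple weighting scheme.  The sketch below is illustrative only (the weights and function name are assumptions, not part of this manual): underlined (strong) examples count for more, bracketed (weak) examples for less, and positive and negative behaviours pull in opposite directions, as in the scales analogy that follows.

```python
# Illustrative weights: underlined (strong) evidence counts for more
# than plain notes, and bracketed (weak) evidence counts for less.
WEIGHTS = {"strong": 2.0, "normal": 1.0, "weak": 0.5}

def evidence_score(examples):
    """Sum weighted positive (+1) and negative (-1) behavioural examples.

    Each example is a (direction, strength) pair, e.g. (+1, "strong").
    """
    return sum(direction * WEIGHTS[strength] for direction, strength in examples)

examples = [(+1, "strong"), (+1, "normal"), (-1, "weak")]
print(evidence_score(examples))  # 2.0 + 1.0 - 0.5 = 2.5
```

The final rating still rests on assessor judgement; the arithmetic only makes explicit which pieces of evidence are being allowed to carry more weight.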


The Paint Pot Analogy

For some competencies you will see either a presence or an absence of behaviour.  It is helpful to think of an empty paint pot.  The pot represents the amount of opportunity the simulation offers to show creativity.  The amount of paint in the pot represents the amount of behaviour you observed against the opportunities offered.

The Scales Analogy

Here evaluation is based on a process of weighing up the positive behaviours versus the negative behaviours to see the balance achieved.

As with the paint pot model, there is a degree of judgement involved in this approach. 


You should write a summary of the evidence gained for each competency using the summary forms provided and assign a rating to each competency assessed.  There are a range of different types of rating scales.


  • Just like me: Assessors often fall into the trap of positively rating candidates who are “like” them in a subjective way.  Try to assess the described competencies – not your own subjective criteria.

  • First impressions and recency effect: The tendency to be influenced by the first impression the candidate gives rather than basing your judgement on all the evidence provided throughout the exercise.  The recency effect is the tendency to be influenced by the last impression you are left with from a candidate rather than assessing earlier evidence shown by the candidate.


The basic principle behind Competency Based Interviewing (CBI) is that past behaviour is the best predictor of future behaviour.


CBI is a key part of the selection process because it enables you to assess the extent to which a candidate is able to use the core competencies we know to be associated with superior performance in a role.

CAR (Context, Action, Result) is a structured approach that focuses on the candidate’s recent experience to identify situations that would have enabled them to use these core competencies. By:

  • Understanding the context that the candidate was responding to
  • Identifying the actions taken by the candidate
  • Establishing the consequences of these actions in terms of results, outputs or impact on the business

The interviewer is able to assess the effectiveness of the behaviour demonstrated by the candidate.

There are therefore three critical stages in the process: Context, Action and Result.
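To make the three stages concrete, here is a minimal sketch (the class and method names are invented for illustration) of a CAR record with the kind of completeness check an interviewer performs mentally when deciding what to probe next:

```python
from dataclasses import dataclass

@dataclass
class CAR:
    """One behavioural example: Context, Action, Result."""
    context: str = ""  # the situation the candidate was responding to
    action: str = ""   # what the candidate actually said or did
    result: str = ""   # the outcome, output or impact on the business

    def missing(self):
        """Return the elements still needed before the example is complete."""
        return [name for name in ("context", "action", "result")
                if not getattr(self, name).strip()]

    def is_complete(self):
        return not self.missing()

example = CAR(context="Key client threatened to cancel their contract",
              action="Arranged a face-to-face meeting and renegotiated delivery dates")
print(example.missing())  # ['result'] – the interviewer should probe for the outcome
```

Any element that comes back as missing is a cue for a follow-up question rather than a reason to reject the example.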


Context

The CAR process begins by asking candidates to use examples from their recent experience to respond to set questions.  Every example should contain information that lets the interviewer know “why” something took place.  “Why” can be answered by finding out:

  • The circumstances surrounding the example they are going to talk about

  • The nature of the situation that they found themselves in
  • The task or activity that they had to do in response


Actions

Actions are what people say or do to complete a task or activity.

Without further prompting, candidates frequently explain their Actions in general terms such as:

  • “I completed the project”

  • “I put together a training programme for a colleague”

  • “I always fixed the equipment when it broke down”
  • “I helped my colleagues”

All of these are vague and leave you as the interviewer not knowing what was actually said or done – their Actions.


In all those situations you as an interviewer must ask prompting or probing questions to get down to a level of detail that enables you to understand what the individual:

  • Said or
  • Did

Results

The Result is the outcome of the Actions that an individual has taken.  Knowing the Result allows the interviewer to evaluate the effectiveness of the Actions taken and the behaviours used.


The interview format is designed to guide you through the 3 stages and provide space on the script for you to record in note form the evidence or information you have collected.


To understand what a behavioural example looks like, it is just as important to be able to identify what is not a clear behavioural example.  These are incomplete CARs and provide no more than indications of what somebody:

  • Feels or thinks about a situation that they were faced with (Feelings and Opinions)
  • Believes they would do in certain circumstances (Theoretical and Future Related)
  • In general terms has done in the past (Vague and Generalised)

None of these are specific enough to help you make a sound evaluation of their past behaviour.  Some examples:

Feelings and Opinions

“I thought I was the best VDU operator and deserved more responsibility”

“I was bored with processing lots of data all day every day”

“I really enjoyed my job working with customers; I’ve always been told that I am very good at listening to people’s concerns and issues”

These responses provide no insight into an individual’s behaviour and what they did.  Their answers relate to how they felt about a situation that they were faced with.

Theoretical or Future Orientated Responses

“If I had been dealing with the problem I would have done it differently”

“I am always good at dealing with clients on the phone”

“I won’t make a lot of mistakes if I am given a lot of work to process through the system”

These responses indicate what an individual thinks they should do or would do if they were faced with that situation.

Vague and Generalised Responses

“I tend to deal with staff problems when they occur and sort them out then”

“…..and I managed to persuade the client that it was a good deal and they bought the product”

“My approach to disciplinary issues is generally quite tough although I will listen to what others have to say”

These sorts of responses are summaries or descriptions of what the candidate did.  The interviewer needs more detail to understand the specific behaviours the individual used to resolve the situation they are describing.


Follow-up questions are needed when:

  • Vague or generalised information has been given.
  • Not all elements of the CAR have been provided.
  • You do not have sufficient information to make a judgement regarding the CAR.

The structure of the questions is designed to help provide the three elements of the CAR.

The recording of the information within the interview guide is also in a CAR format:


So that:

  • information is easy to collect and the interviewer is reminded of the CAR format
  • the information collected is summarised in a standardised way
  • consistency of approach and assessment is enabled



Focusing responses more appropriately

Phrases like the following show the interviewer that a candidate is about to give a vague, non-behavioural response:

  • I am always doing….
  • In those situations, I usually…
  • Most of the time, I would….
  • Usually, I….
  • Sometimes, I….
  • Normally I….

Some questions to focus their responses more appropriately are as follows:

  • Exactly how were you involved?
  • Tell me exactly what you did.
  • Take me through the process, step by step, explaining what you did.
  • Describe the specific situation and the actions.
  • How many times in recent months have you…?
  • Describe one particular time when…
  • Think of one particular day when this happened.
  • Give me more specific detail.
  • Give me a specific example of this situation.
  • How were you personally involved?

Focusing responses more appropriately

Phrases like the following show the interviewer that a candidate is about to speculate about future behaviour or give a theoretical response:

  • Next time that happens, I’ll…..
  • I probably could….
  • I may change the way I…
  • I think now I’m able to…….

Some questions to focus their responses

  • In the past, can you recall an example of…?
  • The last time that happened, what did you…?
  • What has been …?
  • Can you give me a specific example of when this actually happened in the past?

Focusing responses more appropriately

These phrases can give the interviewer a clue that a candidate is about to express a feeling or opinion about a situation:

  • I was really good at….
  • I did more than my share
  • I thought I was….
  • If you ask me, I’d say….
  • If I had been the Manager, I would have….

Some questions to focus their responses more appropriately are as follows:

  • Give me an example of what you did as a result of feeling like this
  • How do you know that?
  • How can you measure that?
  • What actions have you taken because of your feelings?

Follow up questioning is critical because:

  • Most responses to questions will not contain all the aspects of the CAR.
  • The interviewer must determine which parts of the CAR are missing and ask questions that will pin down these missing elements.

Focusing people’s responses on what you need to know is helped greatly by using appropriate non-leading, open questions.  The following examples illustrate the difference between leading and non-leading (open) follow-up questions.


Leading vs Non-Leading (Open) Questions

Leading:  I suppose your boss was pleased with that?
Open:     What feedback did you get from your boss?

Leading:  You didn’t let her get away with breaking the rules, did you?
Open:     What did you do when you discovered she was breaking the rules?

Leading:  When you served the customer I assume you used his name?
Open:     What did you say to that customer when you served him?

Leading:  You left that job for more money?
Open:     Why did you leave that job?

More Examples

Ensure that you avoid using open theoretical questions.  The following examples illustrate the difference between theoretical follow-up questions and non-theoretical follow-up questions.

Theoretical vs Non-Theoretical Questions

Theoretical:     How should you have tackled that problem?
Non-Theoretical: How did you actually tackle that problem?

Theoretical:     With hindsight, what would your approach be next time?
Non-Theoretical: What was your specific approach on that occasion?

Theoretical:     How do you go about delegating to staff?
Non-Theoretical: Tell me how you delegated that task to that member of staff.


  • Asking follow-up questions helps the interviewer collect enough complete behavioural examples to understand the candidate’s typical past work behaviour
  • Asking follow-up questions reduces candidate “faking”

Our Assessment Centre Design


The interviewer has:

  • An Interview Guide which provides structured behavioural questions that ensure the interview will focus on relevant information
  • Follow-up questions which ensure collection of the CAR

These tools are used to collect sufficient examples of past behaviour and thereby provide an accurate picture of a candidate’s past experiences and accomplishments.

To use follow-up questioning effectively, the interviewer must first analyse a candidate’s response and categorise it as providing:

  • An incomplete CAR response
  • A partial behavioural example
  • A complete behavioural example

Once a response has been classified, the appropriate follow-up technique can be applied.  The two follow-up techniques are:

  • Follow up to redirect the candidate to a behavioural example
  • Follow up to collect the CAR

To formulate an appropriate follow-up question, the interviewer must:

  • Ask an open, non-leading question
  • Avoid theoretical questions


Interviewers should aim to get the best out of every candidate and need to:

  • Adopt an encouraging and supportive manner
  • Be discerning and challenging.

  • Demonstrate, both by listening and looking attentively, that they are interested in the candidate
  • Create an appropriate atmosphere in which candidates will relax and talk more freely, perhaps more freely than they had intended.


There are some core skills that need to be mastered to make the most of the structured competency based interview approach.  These fall into various categories, namely:

  • Interviewer’s Non-Verbal Behaviour
  • Questioning Techniques
  • Maintaining the Candidate’s Self-Confidence
  • Barriers to Effective Interviewing – being aware of these and ensuring they are not demonstrated to candidates




Distance

  • On the basis of a number of research findings, a distance of three to five feet is considered best suited to a selection interview.  Less than three feet seems to produce discomfort and uneasiness for most people, while more than five feet becomes overly formal.

  • As distance increases, perception of the participants by each other is more negative.


Posture

  • How the interviewer sits in their chair is also important for demonstrating attention and real interest.

  • The interviewer’s body needs to be directly orientated towards the candidate.

  • The interviewer should sit up and slightly lean forward in order to show energy and a concern to get on with the task in hand, efficiently and productively.

Eye Contact

  • Eye contact is essential for showing interest but also for showing ease and lack of embarrassment, a willingness to face up to what the candidate is trying to communicate.

  • Eye contact indicates when the interviewee has finished talking and interviewers who do not look sufficiently at the candidate will find it harder to control the interview comfortably.

Facial Expression

  • Facial expression should indicate interest in what the candidate is saying.

  • Often when listening our facial expression appears blank!  Try to ensure the expression conveys interest and avoid showing boredom, irritation or disbelief.

Head Movements

  • Head nodding can provide messages that as the interviewer you are keeping abreast of what is being said, that such information is useful and that you would like to hear more of it.


Gestures

  • It is necessary for the interviewer to come across as calm and confident in what they are doing, and gestures can help or hinder this.

  • Clenched hands and entwined legs can give the impression of nervousness.

  • Hands can be used to give greater emphasis to the interviewer’s questions.

Your Voice

  • Encouraging or discouraging messages can come from the way words are produced.

  • Nervousness and under-confidence in the interviewer can come across through talking too quickly or too slowly; calmness comes across through a steady, reasonably energetic pace.
  • Disinterest could be the message received if the tone of the interviewer’s voice is monotonous without different levels of pitch.
  • Criticism and judgement, which within the interview can be disabling, should not be conveyed by the interviewer’s tone of voice.

Assessment Centre Design

Verbal Behaviour

  • Restating and summarising shows the interviewer is intent on getting an accurate picture of what the candidate is saying.
  • Perceptive probing and use of follow-up questions again reinforces the interviewer’s interest.


The usual types of effective questions apply to competency based interviewing.

Open Questions

  • Open questions often begin with who? what? which? where? why? or how?
  • Open questions are often used to start off a new topic or subject and indicate to the interviewee that they are expected to do the talking.

Probing Questions

  • These sorts of questions are designed to search for information in greater depth.
  • They are vital for detail and for focussing the candidate and interview on particular areas. 


The interview script contains all the information and questions needed to conduct a fair and accurate interview.  It provides:

  • Notes on preparing for and opening the interview
  • Structured Behavioural Questions which help the interviewer collect complete information on a candidate’s past jobs and experiences
  • Notes on closing the interview and rating the candidate.

The Interview Script provides ample space for an interviewer’s notes.  These notes document the interview. 



The Interview Script will contain:

Cover Page

  • Provides space for entering candidate’s name, interviewer’s name, the date and any other necessary information

Opening Page

  • Provides a checklist for preparing for the interview
  • Provides an outline for opening the interview
  • Outlines the process and the information that you need to share with all the candidates

Structured Behavioural Question Section

  • Contains questions which direct a candidate to discuss job-related behaviour around target competencies

Guidelines for Ending the Interview

  • Reminds the interviewer to close the interview in a friendly manner


  • Lack of concentration.
  • Stereotyping: prejudging behaviour on the basis of a similar previous experience.
  • ‘Halo’/’horns’ effect: generalising across all competencies, positively or negatively, on the basis of one characteristic.
  • Primacy effect: an impression made at the beginning of the interview can overshadow conflicting information observed later.
  • Recency effect: the opposite of the primacy effect above.
  • Attribution theory: observers must take account of situational factors affecting behaviour.

Subjective stance:

We do not like to have our own ideas, prejudices and point of view overturned.  Nor do we like to have our opinions and judgement challenged.  Consequently, when a candidate says something that clashes with what we think, believe and hold firm to, then we may unconsciously stop listening and plan a counter-attack in our own minds.  Similarly a candidate might use a word that triggers an unjustifiably negative or positive reaction.

The most common mistake made in recording is to start to evaluate whilst we are interviewing. 

  • Vague notes: notes must be accurate and incisive.



  • Before the integration meeting, or ‘wash-up’, you will have observed and rated behaviour independently across a range of exercises.
  • The purpose of the meeting is to provide an opportunity to share these ratings and the evidence gathered with the other assessors.
  • The meeting allows the group of assessors to come to a common, agreed view of the candidate’s performance in the centre.
  • It is essential that the data gathered provides sufficient detail to allow you to provide evidence for the ratings you have given.
  • The integration sessions at the  comprise each pair of assessors, with the centre manager on hand.



  • Candidates will be discussed one at a time, competency by competency.
  • The ratings that feed into the particular competency being discussed will be considered in order.
  • You will be led by the PSL Centre Manager and may be asked to read out your rating for an exercise and the evidence supporting that rating for a particular candidate on a particular competency, e.g. if People Skills (Influencing) is being discussed for a particular candidate and you assessed their interview, you will be asked to provide this rating and evidence.


  • The assessors who marked the candidate on the same competency in other exercises may also have to read out their ratings and evidence.
  • You will need to listen to this and discuss any discrepancies you see.  It is important that you listen carefully during this process and question each other with regard to the evidence provided.
  • Only once this has been discussed fully can an overall rating for the particular competency be assigned.  The PSL Centre Manager will control this process and try to ensure that only exceptions, or candidates where inconsistencies have emerged, are discussed – time is of the essence here.

Some common pitfalls to avoid whilst going through the integration are discussed below:

“Assessment Centre Mathematics”

Try to avoid getting too caught up in the numbers being displayed for performance in particular competencies.  Remember that these numbers simply reflect behavioural evidence.  It may be that one assessor gives a candidate a rating of 3 for a particular competency whilst another gives them a 4.  Remember that different exercises may provide differing opportunities to display certain behavioural indicators for a competency.  Always ensure that you go back over the evidence in detail to see the extent to which the candidate has covered the competency definition and indicators across the different exercises.

Conflicting Data & ‘Primary/Secondary’ Evidence:

Sometimes you may find that different exercises provide you with completely conflicting evidence.  When this is the case, make sure you not only check the extent to which the competency definition has been met, but also check which exercise provides the primary source of evidence.  The general rule is that when data conflicts, the overall rating should follow that of the primary source of evidence.  For instance, an in tray will provide primary evidence of ‘Analysing Information’, as the behaviour is being measured directly.  You may also be able to gain information about an individual’s preferred style of analysis through a personality questionnaire.  However, as this is the candidate’s own view of themselves and not measured directly in an exercise, it would be a secondary source of evidence.  To this end, the in tray rating would carry more weight.
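As a sketch of that decision rule (the data layout and function name are invented for illustration, not taken from any centre's materials): when ratings conflict, the overall rating follows the primary source of evidence, falling back to an average only when no primary source exists.

```python
def resolve_conflict(ratings):
    """Resolve conflicting ratings for one competency.

    ratings: list of (exercise, rating, is_primary) tuples.
    """
    primary = [r for _, r, is_primary in ratings if is_primary]
    if primary:
        # Follow the primary source(s) of evidence when data conflicts.
        return round(sum(primary) / len(primary))
    # No primary source: fall back to the average of secondary evidence.
    all_ratings = [r for _, r, _ in ratings]
    return round(sum(all_ratings) / len(all_ratings))

# In tray measures 'Analysing Information' directly; the personality
# questionnaire is only the candidate's self-report, so it carries less weight.
ratings = [("In Tray", 4, True), ("Personality Questionnaire", 2, False)]
print(resolve_conflict(ratings))  # 4 – the in tray rating prevails
```

In practice the discussion in the wash-up, not a formula, settles the rating; the sketch only states the precedence the text describes.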

Rating Not Matching the Evidence

Always listen carefully to the evidence being presented and ensure that the rating given appears to reflect the behavioural evidence provided.  It is important that assessors are open to discussing and debating the ratings they have given to candidates they have assessed.  Final ratings should be based upon a consensus of agreement among the group of assessors.  Care should be taken during these discussions to always reflect upon the competency definitions provided.  In this way you can ensure that the assessments being made are fair and objective and reflect the requirements of the job.


The Integration matrix is similar to the assessment matrix but designed for just one candidate/participant.  An integration matrix needs to be completed for each candidate attending a centre in order to capture all the ratings that have been given for that particular individual.  It is useful to put the matrix on acetates for use in the integration session so everyone in the meeting can see the rating being discussed.
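One simple way to picture the integration matrix for a single candidate is as a competency-by-exercise grid of ratings.  The sketch below is purely illustrative; the competency names, exercise names and ratings are invented examples.

```python
# Illustrative integration matrix for one candidate:
# rows are competencies, columns are exercises, cells are ratings.
matrix = {
    "People Skills":         {"Group Exercise": 3, "Interview": 4},
    "Analysing Information": {"In Tray": 4, "Presentation": 3},
}

def ratings_for(matrix, competency):
    """All exercise ratings feeding into one competency, for the wash-up discussion."""
    return list(matrix[competency].values())

def print_matrix(matrix):
    # A plain-text rendering of the grid shown to the integration session.
    for competency, cells in matrix.items():
        row = ", ".join(f"{exercise}: {rating}" for exercise, rating in cells.items())
        print(f"{competency:<22} {row}")

print_matrix(matrix)
```

Taking each competency's ratings in turn (e.g. `ratings_for(matrix, "People Skills")`) mirrors the competency-by-competency order of the integration session.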


In this section, we look at the practical issues in running your assessment centres.



You will need to take account of a number of factors when deciding upon the format and timetable of your centre:

  • Availability of assessors and candidates – what time commitments are plausible for these groups of people?
  • Numbers – how many candidates, assessors, administrators etc. are likely to be involved?  We recommend that you design centres to run with either 6 or 12 candidates, and assume a candidate:assessor ratio of 2:1.
  • Venue – what rooms are available?  What is the layout of these rooms?  Remember that the assessors will need a ‘bunker’ room for write-ups etc.
  • Exercises – how long are the exercises?  Remember to build administration and scoring time into the timetable.
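The recommended numbers imply a simple head-count calculation.  This sketch assumes the 2:1 candidate-to-assessor ratio stated above (the function name is invented for illustration):

```python
import math

def assessors_needed(candidates, ratio=2):
    """Assessors required at a candidates-per-assessor ratio of `ratio`:1.

    Rounds up, since a part-loaded assessor is still a whole person.
    """
    return math.ceil(candidates / ratio)

# The recommended centre sizes of 6 or 12 candidates at a 2:1 ratio:
print(assessors_needed(6))   # 3
print(assessors_needed(12))  # 6
```

Running the same calculation for an odd intake (say 7 candidates) shows why the 6/12 recommendation is convenient: it avoids a partially loaded assessor.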


The assessor manual should include copies of all the exercises, but can also form the course notes for assessor training.  A manual might include some of the areas covered within this manual.

In addition, the following materials will need to be made available for the assessment centre:

  • Exercise booklets for candidates and assessors
  • Observation forms, scoring guidelines and summary sheets
  • Any test materials
  • Pens, pencils, erasers, rough paper, calculators and stop watches.
  • Candidate Folders (1 per candidate)
  • Name plates / badges



The assessor pool should ideally comprise line managers from a range of functions, together with professional assessors.  In addition, some assessors and centre administrators may be HR specialists.

Assessor training is crucial, and workshops typically last between 1 and 2 days.  However, this may be dictated by the time available for assessors to attend the course.


All candidates need to know what to expect from an assessment centre.  The briefing should be friendly and clear.

Essentially, the briefing should cover a number of key points:

  • What is an assessment centre and why are they used?
  • What will happen on the day?
  • How is candidate performance assessed?
  • Do assessment centres actually work?
  • How can I prepare?
  • What feedback will I receive and what happens to my data?



The Centre Manager will have received instructions to monitor the quality of the assessors’ work, and will offer assistance and immediate feedback as necessary.

The Centre Manager runs the centre and provides guidance, direction, coaching and support, as well as ensuring that the timetable is strictly adhered to.

Time is critical and every minute must be used effectively: there is no scope for falling behind on your assessor responsibilities for observation, rating and summary comments.

Psychometric Assessment Practice for Adults

Please email your specific psychometric test coaching needs to:

  • Numerical reasoning assessment tips
  • Non-verbal reasoning assessment tips
  • Literacy assessment practice
  • Abstract reasoning assessment practice


Prepare for Assessments book reviews

5.0 out of 5 stars. Very Good Quality.

The cost of this book is definitely worth the outlay when you have an interview coming up. I really liked the examples and thought they were close to the real test I took.


5.0 out of 5 stars. I got the job.

Thank you Robert Williams for helping me prepare for the ability tests. I felt calm and tried my best. Got the job and so feel the tenner for this book was well spent.

5.0 out of 5 stars. This book is for people who are about to sit interviews

Robert Williams guides the reader through the complexities of tests given by employers to prospective candidates. Lots of sensible, sound advice and examples to work through make this book a useful and readable tool.


Brilliant Passing Verbal Reasoning Test book

Everything you need to know to pass verbal reasoning tests (Brilliant Business) is available on Amazon. It has gained several 4-star and 5-star reviews.

Amazon Customer reviews for Brilliant Verbal Reasoning book.

3.7 out of 5 stars.

Passing Numerical reasoning Tests book

Buy this book to pass!

Brilliant book. Read it from cover to cover. Don’t skip the chapters. I know it can seem patronising, the way he breaks it down, but I’d recommend reading it through.

4.0 out of 5 stars.


Quantitative Reasoning Assessment Practice

Although you may not finish the test, the best strategy is to answer as many questions as you can in the time available.

  • Firstly, before deciding upon your final answer, you may be able to rule out one or two of the multiple-choice options as incorrect.
  • Secondly, read each question and review each chart very carefully, taking one chart and its associated questions at a time. Only start looking at the answer options once you have done this.
  • Ensure that you are aware of the units of measurement that each question refers to.
  • Each question is worth the same, so don’t spend too long on any single one; you may find subsequent questions easier to answer. If there is time at the end of the test you can return to any unanswered questions.
  • Work efficiently, but do not rush.
  • Remember to use only the information provided in the charts, not your own background knowledge.
  • Lastly, round up any decimals and any pence.
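The value of ruling out options (the first point above) is easy to quantify. A small sketch, assuming one correct answer per question and that your eliminations are sound:

```python
from fractions import Fraction

def guess_probability(n_options, eliminated):
    """Chance of a correct guess after ruling out `eliminated` options,
    assuming the correct answer is never among those ruled out."""
    remaining = n_options - eliminated
    if remaining < 1:
        raise ValueError("cannot eliminate every option")
    return Fraction(1, remaining)

# With four answer options per question:
for e in range(3):
    p = guess_probability(4, e)
    print(f"eliminate {e}: {p} ({float(p):.0%})")
```

With four options, eliminating just one lifts a blind guess from 25% to 33%, and eliminating two lifts it to 50%, which is why partial knowledge is still worth using.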


Passing numerical / verbal reasoning assessments

 Free online numerical reasoning test practice / free online verbal reasoning test practice

Passing numerical reasoning assessments

There aren’t any quick wins for being good at maths but some focussed practice will improve your score, as will following a few test-taking strategies.

As a timed assessment, you need to average around one minute per question. Work briskly but accurately. Each question counts the same so pick off the easy ones first and don’t waste your test time on the most difficult questions.

Numerical reasoning test practice is an excellent means of brushing up on any maths functions you haven’t used in a while.  Ensure that you are comfortable using data tables, interpreting graphs and manipulating large financial figures.

You can practise the most common numerical test types at the main test publisher websites. Practise sample questions from Kenexa-IBM, TalentQ and SHL as these sites cover most of the tests you are likely to find.

Passing verbal reasoning assessments 

Verbal reasoning assessments come in many different formats.

The traditional comprehension format is to have a short passage followed by a series of questions – asking about facts, opinions, and conclusions – based on its content, a bit like those English tests in primary school where you answered questions on a novel extract.

Regardless of the type of test, it is vital to read each question carefully. Questions often hinge on one or two key words, so take particular care to interpret these accurately. If a question asks whether something “always” applies while the passage states that it is only “sometimes” the case, then the statement is false.

Scan the passage initially and then read it in more detail. It’s easier to answer each question if you can recall roughly where to find the answer in the text.

Passing abstract reasoning assessments

These ask you to look for the changing pattern(s) in the “pictures”.

The easier questions typically appear at the start of the assessment and involve one change in colour, position, size etc. of the figures shown.

Questions become more difficult as you progress, and you must spot two or three changes in any of the features shown. Once you have worked out at least one of the feature changes, check through the answer options to discount those that do not conform.
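The discounting strategy just described is essentially a filter: keep only the options that satisfy every rule you are sure of. A minimal sketch, with entirely made-up options and rules:

```python
# Hypothetical answer options, each described by the features such tests
# typically vary (colour, number of sides, size). Illustrative only.
options = {
    "A": {"colour": "black", "sides": 4, "size": "large"},
    "B": {"colour": "white", "sides": 4, "size": "small"},
    "C": {"colour": "black", "sides": 3, "size": "small"},
    "D": {"colour": "black", "sides": 4, "size": "small"},
}

# Suppose you have identified two of the feature changes: the next figure
# must be black and must have four sides (assumed rules, for illustration).
rules = [
    lambda f: f["colour"] == "black",
    lambda f: f["sides"] == 4,
]

# Discount every option that fails any rule you are confident about.
remaining = [name for name, feats in options.items()
             if all(rule(feats) for rule in rules)]
print(remaining)
```

Even when you cannot identify every changing feature, each rule you do spot narrows the remaining options, so a partial solution still improves your odds.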

Passing personality assessments

When it comes to answering psychometric surveys that evaluate personality, the best advice is to give your “first response”.

Visualise how you would behave at work on a typical good day. Don’t second guess what is being looked for since “faking” and lying are easily picked up.

Like anything, practice makes perfect. And don’t be afraid to ask the employer which publisher’s tests they use – most will be happy to tell you.

Being familiar with the format, as well as the kinds of questions asked, will give you a clear advantage. On the day, keep calm and remember that most assessments are timed, so answer the questions as swiftly as you can.

What do psychometric assessments measure?

Ability tests can assess a specific ability (Verbal, Numerical, Spatial) or more general level of reasoning (Abstract, Non-Verbal).

Personality assessments broadly divide into those which assess personality traits (16PF) and those which assess personality types (Myers Briggs).  This difference relates back to the personality theory on which the test is based.

Free psychometric assessment practice

Free Aptitude Test Practice
