

Scoring Guidelines for On-line Skills Tests & Assessments


Field Recruitment collects data from hundreds of companies using the on-line skills tests and assessments in order to formulate general scoring benchmarks. These figures are not validated cut-off scores and should not be used as such, but you can treat them as guidelines until you have enough experience with this assessment system and your candidates to determine acceptable scores for your organisation. How candidates' results are interpreted will vary depending on the purpose of testing, so every position within a company may have a different level of acceptable scores. Should you wish to set cut-off scores for our assessments, it is your responsibility to conduct a separate, position-specific study in order to comply with EEOC guidelines.

Generally, a score of 60%-80% indicates a basic knowledge of the subject being tested, while scores above 80% indicate an advanced knowledge. A score below 60% is not necessarily unacceptable, however. It may be acceptable if the candidate is not required to have mastered all of the material tested; for example, most legal secretaries do not use the mail merge feature of WordPerfect in their job duties and tend to answer those questions incorrectly. A lower overall score resulting from incorrect answers to mail merge questions therefore does not indicate that the candidate lacks the skills the position actually requires.

In addition, many assessments are optimised for users with two or more years of experience, so a candidate with one year of experience may score well under 60%. For this reason, your assessment score reports include question-by-question results. To ensure that you are using all assessments in a valid manner, please check individual question results and compare them to your needs. For instance, if you are placing someone in a position that requires Lotus skills but not Lotus graphing skills, you should not use the graphing questions to disqualify an applicant.

You may encounter assessment results that are unusually low. If most of your candidates score between 60% and 80%, an unusually low score might be 20%. In our experience, there are several reasons that this might occur. Poor scores may result if the candidate fails to follow directions, is distracted, is tired, does not have a grasp of the material presented, or leaves the assessment before it is completed. For best results, make certain the candidate reads the questions carefully and performs all of the keystrokes that correspond exactly to the task required.


Interpreting Customer Service Mindset Survey Results
The following suggested scores are based on the number of points scored, not the percent correct:
  • 125 or better - outstanding
  • 115-124 - very good
  • 100-114 - good
  • 80-99 - fair
  • 79 or less - poor

 Algorithm:
  Weight-one answers = X
  Weight-two answers = Y
  Total Score = Z

  X + (Y*2) = Z

 Example:
  Weight-one answers: 12
  Weight-two answers: 57
  Total Score: 126

  12 + (57*2) = 126
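The weighting formula and the suggested score bands above can be sketched as a small function; the function and variable names are illustrative, not part of the assessment system itself.

```python
# Sketch of the Customer Service Mindset scoring described above.
# Names are illustrative, not the vendor's API.

def mindset_score(weight_one_answers: int, weight_two_answers: int) -> int:
    """Total score Z = X + (Y * 2)."""
    return weight_one_answers + weight_two_answers * 2

def rating(score: int) -> str:
    """Map a point total to the suggested interpretation bands."""
    if score >= 125:
        return "outstanding"
    if score >= 115:
        return "very good"
    if score >= 100:
        return "good"
    if score >= 80:
        return "fair"
    return "poor"

# Worked example from the text: 12 weight-one and 57 weight-two answers.
z = mindset_score(12, 57)   # 12 + (57 * 2) = 126
print(z, rating(z))         # 126 outstanding
```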

Interpreting Corrective Proofreading Results
These assessments are designed to measure the ability to recognise and correct errors within assessment passages.

Assessment takers are evaluated on the following items:
  • Number of errors successfully identified in relation to the number of possible errors
  • Number of errors not identified
  • Number of identified errors that are successfully corrected
  • Number of identified errors that are not successfully corrected
  • Number of correct items that are made incorrect by the assessment taker

 Algorithm:
  Identified Errors = A out of B
  Raw Score = C
  Misjudged Errors = D
  Adjusted Score = F

  (A/B) x 100 = C
  A - D = Total Adjusted Mistakes (TAM)
  (TAM/B) x 100 = F

 Example:
  Identified Errors = 15 out of 15
  Raw Score = 100
  Misjudged Errors = 2
  Adjusted Score = 87%

  (15/15) x 100 = 100
  15 - 2 = 13 (Total Adjusted Mistakes, TAM)
  (13/15) x 100 = 87%
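The raw and adjusted proofreading scores above can be computed as follows; the function name is illustrative, and the worked example suggests results are rounded to whole percentages.

```python
# Sketch of the corrective-proofreading scoring described above.

def proofreading_scores(identified: int, possible: int, misjudged: int):
    """Return (raw score C, adjusted score F), each as a rounded percentage.

    identified -- errors the assessment taker found (A out of B possible)
    misjudged  -- correct items the assessment taker made incorrect (D)
    """
    raw = identified / possible * 100        # C = (A / B) x 100
    tam = identified - misjudged             # Total Adjusted Mistakes
    adjusted = tam / possible * 100          # F = (TAM / B) x 100
    return round(raw), round(adjusted)

# Worked example from the text: 15 of 15 errors found, 2 items misjudged.
print(proofreading_scores(15, 15, 2))  # (100, 87)
```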

Interpreting Data Entry Results

The data entry assessments measure the speed and accuracy of data entry. Mistakes that are counted include:

  • Added words and phrases
  • Capitalisation
  • Consecutive words that have no spacing between them
  • Duplicate words and phrases
  • Misspellings
  • More than 1 space between two words
  • Punctuation/Specific Formatting errors
  • Skipped words or phrases

We suggest utilising both Field Accuracy and Adjusted Keystrokes per Hour as a basis for comparison. If two people each typed 6500 adjusted keystrokes per hour but one has a 78% field accuracy and the other 92%, the difference between them is clear. Conversely, if two people each had 98% field accuracy but one typed 9000 adjusted keystrokes per hour and the other 7500, the difference is equally measurable.

It has been our experience that customers who use the Data Entry Assessment purely as an accuracy measurement, without regard to speed, rely on the Field Accuracy % rather than the Keystroke Accuracy %. The assessment contains so many characters that a few character errors still leave the Keystroke Accuracy % high, because they are weighed against hundreds of correct characters. With Field Accuracy, by contrast, a single mistake has an immediately noticeable impact. We include Keystroke Accuracy mainly because it feeds the Adjusted Keystrokes per Hour score, which takes both accuracy and speed into consideration.

  • Elapsed Time: The amount of time spent on the assessments; the elapsed time excludes any extra time required by someone utilising a slow internet connection, so that each assessment taker is measured equally.
  • Field Accuracy Percentage: If a field contains even one error, the whole field is marked as incorrect; the field accuracy percentage divides the number of fields entered correctly by the number of possible fields.
  • KPH: Keystrokes per Hour is the number of characters typed by the assessment taker, converted to keystrokes per hour using the time the assessment taker took; this takes only speed into consideration.
  • Keystrokes Accuracy: Keystroke Accuracy Percentage is the number of characters typed correctly compared to the possible number of typed characters.
  • Adjusted KPH: The Adjusted Keystrokes per Hour modifies the keystrokes per hour by the keystroke accuracy percentage; this takes both accuracy and speed into consideration.
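The two accuracy measures defined above can be illustrated with a small sketch; the function names and example figures are assumptions for illustration, not taken from the assessment system.

```python
# Illustrative calculation of Field Accuracy and Keystroke Accuracy
# percentages as defined above.

def field_accuracy(fields_correct: int, fields_possible: int) -> float:
    # A field with even one error counts as entirely incorrect.
    return fields_correct / fields_possible * 100

def keystroke_accuracy(chars_correct: int, chars_possible: int) -> float:
    return chars_correct / chars_possible * 100

# One wrong character in 1 of 50 fields: keystroke accuracy barely moves,
# but field accuracy drops a full two percentage points.
print(field_accuracy(49, 50))          # 98.0
print(keystroke_accuracy(2999, 3000))  # roughly 99.97
```

This is why a single mistake shows up far more clearly in the Field Accuracy figure than in the Keystroke Accuracy figure.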

The Adjusted Keystrokes per Hour score is calculated using the following:
 Algorithm:
  Keystrokes per Hour = KPH
  Keystrokes Accuracy Percentage = KAP
  Adjusted Keystrokes per Hour = AKPH

  AKPH = (KPH * KAP) / 100

 Example:
  Keystrokes per Hour = 9972
  Keystrokes Accuracy Percentage = 98%
  Adjusted Keystrokes per Hour = 9772

  9772 = (9972 * 98) / 100
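As a sketch, the formula above can be written as follows; the worked example (9972 × 98 / 100 = 9772.56, reported as 9772) suggests the result is truncated to a whole number, which is an assumption here.

```python
# Sketch of the Adjusted Keystrokes per Hour formula described above.

def adjusted_kph(kph: int, keystroke_accuracy_pct: float) -> int:
    """AKPH = (KPH * KAP) / 100, truncated to a whole number."""
    return int(kph * keystroke_accuracy_pct / 100)

# Worked example from the text.
print(adjusted_kph(9972, 98))  # 9772
```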

Interpreting Typing Results

The typing assessment measures the speed and accuracy of typing skills. Mistakes that are counted include:

  • Added words or phrases
  • Capitalisation
  • Consecutive words that have no spacing between them
  • Duplicate words or phrases
  • Misspellings
  • Punctuation errors
  • Skipped words and phrases

Note that the typing assessment accepts either one or two spaces after a period; this is not counted as a mistake.

If the assessment taker does not follow the passage closely enough for an accurate score to be determined, the assessment will be scored with all zeroes and the phrase "The user did not follow the sample closely enough for an accurate score to be determined" will precede what the assessment taker typed on the results.

The Adjusted Words per Minute score is calculated using the following function:
Adjusted Words per Minute = Words per Minute - Number of Errors per Minute
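The Adjusted Words per Minute calculation is simple enough to sketch directly; the function name and example figures are illustrative.

```python
# Sketch of the Adjusted Words per Minute formula described above.

def adjusted_wpm(words_per_minute: float, errors_per_minute: float) -> float:
    """Adjusted WPM = WPM - number of errors per minute."""
    return words_per_minute - errors_per_minute

# e.g. a taker typing 65 WPM who makes 3 errors per minute
print(adjusted_wpm(65, 3))  # 62
```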

Interpreting Writing Sample Scenarios

Because gauging writing skills is a highly subjective endeavour, we leave it to you to determine the level of writing effectiveness represented by the candidate's sample. We do believe, though, that providing a controlled, immediate writing environment helps to assess the skills of the writer in context; it is for this reason that we provide this environment and encourage you to use it, free of charge. Review the passage and see whether it meets the criteria provided in the scenario:

  • Has the Assessment Taker stated their purpose and included an appropriate response to the requested scenario criteria?
  • Has the Assessment Taker presented their ideas clearly and logically?
  • Has the Assessment Taker made errors in spelling, punctuation and/or capitalisation?
  • Does the overall style of the passage suit the type of letter requested?

A scored, multiple choice, Business Writing assessment is also available that focuses on such skills as brainstorming, grammar, organisation, and approaches to writing.



Copyright 2018, Field Recruitment Limited (Registered as a UK company no. 564 5374)