A good place to start is Bloom's Taxonomy, a system used to categorise learning objectives by complexity. Bloom's considers learning at six levels: Remembering, Understanding, Applying, Analysing, Evaluating and Creating. Depending on the content of the certification, the assessment needs to be aligned to the appropriate level of Bloom's.

Most professional certifications adopt a multiple-choice exam, usually comprising a stem (the question) with three or more options, of which only one is correct. This approach is popular because the questions are easy to answer, easy to mark, and, a lot of people think, easy to write.

Within APMG we have spent thousands of hours refining this process, and we think we have a pretty good recipe for getting it right. Our approach is as follows:

1. Agree the scope of assessment. 

  • This is often quite hard. If the scope is too narrow, the assessment becomes very niche and not many people are interested. If it is too high level, aimed at experienced people, they ask why they need it, so again the market is niche. And if it is too broad, it takes too long to learn and very few people will commit the time and effort.
  • As with many projects and programmes, scoping is critical. Get it right and it can lead to success.

2. Determine or write the definitive text to be examined.

  • If you are going to be examined on something, you need to know what constitutes the right answer: you need a definitive text. Clients have come to us with text they would like to base an examination on, in which each section is written by a different author. Multiple authors sometimes answer the same question differently, which makes for a more interesting read but is very hard to examine against.
  • It needs to cover the full scope of what you want to assess, which is why many organisations write a definitive Body of Knowledge on which a certification can be based.

3. Develop a syllabus.

  • The syllabus defines the examinable material and, in particular, places emphasis on different parts of that material, so that different weightings can be applied in the exam to ensure it tests fundamental knowledge to an appropriate depth and breadth.
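The weighting idea can be sketched in code. Below is a minimal Python illustration (the syllabus areas and percentages are hypothetical, not from any real scheme) that splits a paper's question count across topics in proportion to their syllabus weightings, using largest-remainder rounding so the counts always sum to the paper length:

```python
from math import floor

def allocate_questions(weightings, total_questions):
    """Split a paper's question count across syllabus areas in
    proportion to their weightings. Largest-remainder rounding
    guarantees the counts sum exactly to total_questions."""
    total_weight = sum(weightings.values())
    raw = {area: w / total_weight * total_questions
           for area, w in weightings.items()}
    counts = {area: floor(x) for area, x in raw.items()}
    # Hand the leftover questions to the areas with the largest remainders.
    leftover = total_questions - sum(counts.values())
    for area in sorted(raw, key=lambda a: raw[a] - counts[a], reverse=True):
        if leftover == 0:
            break
        counts[area] += 1
        leftover -= 1
    return counts

# Hypothetical weightings (as percentages) for a 50-question paper.
weightings = {"Principles": 30, "Processes": 40, "Themes": 20, "Tailoring": 10}
print(allocate_questions(weightings, 50))
# → {'Principles': 15, 'Processes': 20, 'Themes': 10, 'Tailoring': 5}
```

The same function works for weightings that do not divide evenly; the remainder rounding simply gives the extra questions to the most heavily weighted fractional areas.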

4. Design the exam paper.

  • There is no definitive answer for the optimum length of an examination paper, though ideally 80% of the syllabus should be tested within each paper. We believe most topics can be fairly assessed within an hour, with the number of questions ranging between 40 and 75. Our experience over the last 25 years shows that this is appropriate for a wide range of subjects testing at Bloom's levels 1 and 2.
  • Setting the pass mark is always difficult. A lot of organisations opt for a 50% pass mark, which effectively means you are happy for a successful candidate to know the answers to only half the exam paper. This may or may not be appropriate. One thing we have found is that once you set the pass mark, it is very difficult to change, so it's worth testing your approach before going live.
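The pass-mark question can be made concrete with a little probability. Here is a short Python sketch (the figures are illustrative, not a statement of APMG policy) computing the chance of reaching a given pass mark by pure guessing on a multiple-choice paper, using the binomial tail probability:

```python
from math import comb, ceil

def p_pass_by_guessing(n_questions, n_options, pass_pct):
    """Chance of passing by random guessing alone: the binomial
    tail probability of getting at least the pass-mark number of
    questions correct, with p = 1/n_options per question."""
    p = 1 / n_options
    need = ceil(pass_pct / 100 * n_questions)
    return sum(comb(n_questions, k) * p**k * (1 - p)**(n_questions - k)
               for k in range(need, n_questions + 1))

# A 40-question, 4-option paper: guessing averages 25% of the marks,
# yet the chance of actually reaching a 50% pass mark by luck is tiny.
print(f"{p_pass_by_guessing(40, 4, 50):.6f}")
```

This is one reason a 50% pass mark on a reasonably long paper is more defensible than it first sounds: guessing scores cluster around 25%, and the odds of luck alone carrying a candidate to 50% are well under one in a thousand.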

5. Write the questions.

  • It looks simple, but over the years, through collected information, commissioned research and our own experience, we have learned how to do it properly. There are some simple guidelines, e.g. avoid complex language and double negatives, and make sure there is consistency between the stem and the options.
  • We also grade our questions so we can test at different levels of knowledge, rather than just the one. 

6. Validate the questions.

  • It is only possible to validate the questions by having the target audience try them. Asking experienced people to check an examination question is pointless, as they already know the right answer. We tend to launch with pilot papers, where individuals willingly take a training course and an examination before it goes live. This enables us to make sure the paper is at the right level, which is very important: in a paper of 40 questions with four options each, random guessing gets one in four right on average, so someone could score around 25% without even reading the questions.
  • When we introduce new questions, we sometimes include them in live examination papers as unscored items (they are not taken into account in determining the candidate's mark), so no candidate is disadvantaged. Again, this allows us to test the questions with real candidates, ensuring the appropriateness of all live questions in the examination.

7. Validate live questions.

  • As we gather data from candidates who have taken the exam, we can analyse how they answered each question: whether particular wrong answers appear consistently, and which questions warrant a review of the training approach or source text, or simply a re-wording to ensure the question is not misleading.
  • Statistical analysis enables us to ensure the papers remain fair, reliable and consistent.
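One common form this statistical analysis takes, shown here as a general illustration rather than APMG's actual tooling, is per-question facility (proportion of candidates answering correctly) and a simple discrimination index (how much better the top-scoring candidates do on a question than the bottom-scoring ones). A minimal Python sketch with made-up candidate marks:

```python
def item_stats(responses):
    """Per-question facility (proportion correct) and an upper/lower
    discrimination index: the difference in facility between the top
    and bottom scoring thirds of candidates. `responses` is a list of
    per-candidate lists of 0/1 marks, one entry per question."""
    n_items = len(responses[0])
    scores = [sum(r) for r in responses]
    order = sorted(range(len(responses)), key=lambda i: scores[i])
    third = max(1, len(responses) // 3)
    lower, upper = order[:third], order[-third:]
    stats = []
    for q in range(n_items):
        facility = sum(r[q] for r in responses) / len(responses)
        disc = (sum(responses[i][q] for i in upper) -
                sum(responses[i][q] for i in lower)) / third
        stats.append({"facility": round(facility, 2),
                      "discrimination": round(disc, 2)})
    return stats

# Hypothetical marks for 6 candidates on 3 questions (1 = correct).
marks = [[1, 1, 0], [1, 0, 0], [1, 1, 1], [0, 0, 1], [1, 1, 1], [0, 0, 0]]
for q, s in enumerate(item_stats(marks), start=1):
    print(f"Q{q}: facility={s['facility']}, discrimination={s['discrimination']}")
```

A question with very low facility may be mis-keyed or misleadingly worded; one with low or negative discrimination is failing to separate stronger candidates from weaker ones, and either way the source text, training or wording warrants review.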

8. Refresh and update. 

  • The feedback from this analysis enables us to continuously review questions and, when appropriate, refresh or update them to ensure the examination remains fair and consistent.

As Steve Jobs once said, “Simple can be harder than complex. You have to work hard to get your thinking clean to make it simple, but it’s worth it in the end because once you get there you can move mountains.”

We have certainly put in thousands of hours of effort to make our approach simple, rather than complex. If you are struggling with the same issue, perhaps I could help? Please send me your contact details and an outline of what your goal is and I'll try and help.

The author

Nicola Kelly


Product Innovation Director

Nikki is responsible for developing and expanding the APMG Product Portfolio, bringing on new schemes and considering how APMG can utilise innovations within the examinations and assessment market to maintain APMG's position as the leading global Accreditation and Certification Body. Nikki has led teams in Operations, Quality, Accreditation and Product development, giving her a rounded view of the certification and training marketplace.