The New Guide to SOA Exams

Get answers to your most pressing questions

By Stuart Klugman


Have you ever wondered how Society of Actuaries (SOA) exams are constructed? Or the difference between a question that asks for a recommendation vs. one that asks for an analysis? Or how pilot questions are used on multiple-choice exams? Or how pass marks are set? Or what goes into a model solution?

The answers to these questions and more are in the new Guide to SOA Exams, now available from many of the education webpages at SOA.org. While you will need to read the full Guide to get all of the answers, this article explains why it was produced and highlights one change we have made.

For years, the SOA education webpages contained numerous articles covering various aspects of the candidate experience, from exam development and grading to tips for success and disciplinary procedures. An initial attempt to consolidate this information resulted in the previously published Guide to Written-Answer Examinations. That document focused exclusively on fellowship written-answer exams and was later expanded to cover the written-answer portion of what is now the Long-Term Actuarial Mathematics (LTAM) exam.

In 2019, we decided to add other relevant information to the Guide, in particular coverage of multiple-choice exams and e-Learning assessments. For candidates, prospective candidates, and those who work with candidates, the new Guide provides a single point of access to this information. While the Guide is designed to be comprehensive, some documents must remain separate (though they are described in the Guide). Examples are the Code of Conduct for Candidates and the Confidentiality and Discipline Procedures for Computer-Based Testing for Candidates.

With the publication of the Guide, we are announcing a new way of reporting results for candidates who are unsuccessful on written-answer examinations. Prior to 2020, failing candidates received question-by-question feedback on a 0–10 scale that was meant to be interpreted in the same way as the 0–10 score given for the entire exam. Such a scale requires a passing score for each question. Because the exam committees do not set question-level pass marks, we had to infer one for each question, which led to misinterpretation and confusion among candidates.

Beginning with the 2020 written-answer exams (both fellowship and the associateship LTAM and Predictive Analytics exams), a new form of reporting will be used. For each question, candidates will be provided the percentile rank of their score. This will make it immediately obvious how a candidate performed relative to others on each question. It is important to note that a weighted average (by exam points) of the question percentiles will not yield the overall percentile rank.
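To see why, consider a rough sketch in Python using hypothetical scores for five candidates on a two-question exam. The percentile-rank convention used here (the share of candidates scoring at or below a given score) and all of the numbers are illustrative assumptions, not the SOA's actual method; the point is only that the point-weighted average of question percentiles can differ from the percentile rank of the total score.

    # Illustrative sketch only: hypothetical scores for five candidates on a
    # two-question exam (question 1 worth 6 points, question 2 worth 4 points).
    scores_q1 = [5, 4, 3, 2, 1]   # hypothetical question 1 scores (out of 6)
    scores_q2 = [1, 2, 4, 3, 2]   # hypothetical question 2 scores (out of 4)
    points = [6, 4]               # question weights in exam points

    def percentile_rank(value, all_values):
        """Percent of candidates scoring at or below the given value (assumed convention)."""
        return 100 * sum(v <= value for v in all_values) / len(all_values)

    # Focus on the first candidate.
    p1 = percentile_rank(scores_q1[0], scores_q1)   # 100.0 (top score on question 1)
    p2 = percentile_rank(scores_q2[0], scores_q2)   # 20.0  (lowest score on question 2)

    # Point-weighted average of the per-question percentiles.
    weighted_avg = (points[0] * p1 + points[1] * p2) / sum(points)   # 68.0

    # Percentile rank of the candidate's total score.
    totals = [a + b for a, b in zip(scores_q1, scores_q2)]   # [6, 6, 7, 5, 3]
    overall = percentile_rank(totals[0], totals)              # 80.0

    print(f"Weighted average of question percentiles: {weighted_avg:.0f}")
    print(f"Percentile rank of the total score:       {overall:.0f}")

In this made-up example, the candidate's point-weighted average of question percentiles is 68, while the percentile rank of the total score is 80, so the two figures should not be expected to agree.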

We want this document to be as informative and useful as possible. Please feel free to email me with any questions or comments about the Guide.

Stuart Klugman, FSA, CERA, is senior staff fellow, Education, at the Society of Actuaries.

Copyright © 2020 by the Society of Actuaries, Chicago, Illinois.