Enhanced Assessment Methods Coming to a Test Center Near You

A new two-step process to ensure predictive analytics is properly learned and assessed

STUART KLUGMAN AND MARTHA SIKARAS

One experience that binds Society of Actuaries (SOA) members together is surviving the exam process. As changes in learning methods (e.g., online e-Learning modules) and testing methods (e.g., computer-based tests with immediate results) occur, each generation’s pathway is unique.

Throughout, the SOA has been committed to continuous improvement. The next step is to upgrade our testing environment. To be properly assessed, candidates should have the same tools in front of them that they use at their job. That means computer tools (such as spreadsheets and programs for statistical analysis) and the ability to write reports using a word processor. In turn, this will allow us to ask more complex questions that get closer to real-work situations. In December, the first phase will be introduced—assessment of predictive analytics via a proctored project with access to software. In this article, we explain the need for the change, provide details regarding the assessment and offer some thoughts as to what might come next as we expand this platform to other exams.

In 2016, the SOA Board of Directors approved a proposal to make significant changes to our associateship education. Along with achieving a better balance in coverage of long-term and short-term insurance, there was demand from employers to significantly upgrade education in predictive analytics. This presented a challenge: working with big data involves more than reciting the details of the major models and techniques. It is about visualizing complex data, performing dimension reduction and feature selection, understanding the bias/variance trade-off, and communicating results. Because model building requires a sequence of complex and interrelated decisions, multiple-choice or short written-answer questions answered with the aid of a scientific calculator are clearly insufficient.

A Two-Step Process

To ensure that predictive analytics is properly learned and assessed, the Board approved a two-step process. Candidates must first pass (or have waiver credit via the transition rules) the new Statistics for Risk Modeling (SRM) Exam. This is a multiple-choice exam to be first offered in September 2018 and every four months thereafter. It is offered in the same computer-based environment as other preliminary exams. Multiple-choice questions work here because the goal of this exam is to ensure that candidates are familiar with the basic concepts of the major analytics techniques (the generalized linear model, regression-based time series, decision trees, principal components analysis and clustering) as well as model selection and assessment techniques such as cross-validation.
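To make these topics concrete, here is a minimal R sketch, not taken from any exam or official SOA material, that fits a generalized linear model and assesses it with five-fold cross-validation, two of the concepts listed above. The simulated claim counts and variable names are purely illustrative.

    # Minimal illustrative sketch (not exam material): fit a Poisson GLM and
    # assess it with 5-fold cross-validation on simulated data.
    set.seed(2018)
    n  <- 500
    x1 <- rnorm(n)
    x2 <- rnorm(n)
    # Simulated claim counts driven by two rating variables
    y   <- rpois(n, lambda = exp(0.3 + 0.5 * x1 - 0.2 * x2))
    dat <- data.frame(y, x1, x2)

    # Fit a Poisson GLM with a log link
    fit <- glm(y ~ x1 + x2, family = poisson(link = "log"), data = dat)
    summary(fit)

    # 5-fold cross-validation of out-of-sample Poisson deviance
    k     <- 5
    folds <- sample(rep(1:k, length.out = n))
    cv_dev <- sapply(1:k, function(i) {
      train <- dat[folds != i, ]
      test  <- dat[folds == i, ]
      m  <- glm(y ~ x1 + x2, family = poisson(link = "log"), data = train)
      mu <- predict(m, newdata = test, type = "response")
      # Poisson deviance on the held-out fold (0 * log(0) treated as 0)
      2 * sum(ifelse(test$y == 0, 0, test$y * log(test$y / mu)) - (test$y - mu))
    })
    mean(cv_dev)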

The second step is the Predictive Analytics (PA) Exam. This exam will be first offered on Dec. 13, 2018. At the time of this writing (May 2018), this is what we know:

  • The project will be a realistic analytics assignment in which candidates are presented with a data set and a business problem. Candidates must prepare a report that presents and supports their solution. Enough direction will be provided to ensure that the five-hour time limit is reasonable. For example, a problem might be amenable to solution by a generalized linear model or by a regression tree. Candidates might be asked to investigate only a regression tree solution and then write about why a generalized linear model may or may not be a better approach (a brief sketch of this kind of analysis follows this list).
  • Candidates will take the exam at a Prometric test center, the same network of centers used for our computer-based preliminary examinations. Candidates will have the same experience with regard to registration, check-in and security as they have for their earlier exams. Prometric's security protocols ensure that we have met the Board's desire that this assessment be proctored to the highest standards of supervision.
  • Candidates will work in a Windows-like environment in which they will have access to Microsoft Word and Excel, a PDF reader (to provide access to those documents the exam committee deems useful for candidates) and RStudio for performing their analyses. Their deliverables will be a Word file with their report and a file with their R code. Candidates will have the option to upload additional files in support of their work.
  • The R statistical computing environment is open source and is extended by a large and growing collection of add-on libraries, known as packages, each bundling functions for particular tasks. Because more than one package can perform a particular task, we will ensure that candidates know in advance which packages will be installed on the Prometric computers for use in completing their analysis.
  • Candidates may bring calculators of the same approved models used for other exams.
  • Candidates will also be provided scratch paper as well as a hard copy of the project statement to use while at the test center.
  • Candidates will have five hours to complete their project. It is anticipated that the timer may be stopped for up to 15 minutes for restroom/snack breaks. As with any SOA exam, additional time may be used for these purposes, but the timer will be running.
  • Candidates will have e-Learning support designed to enhance their Exam SRM studies, to set expectations for what their predictive analytics project will be like and to clarify the SOA's expectations for a successful submission.
  • Grading of the assessment will follow protocols used for fellowship exams. Graders will have a guide that clearly lays out expectations and how points will be assigned. Papers near the proposed pass mark will be independently graded by a second grader, and the two graders will reconcile any differences. More information can be found in the Guide to SOA Written Exams.
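As an illustration of the regression tree example in the list above, the following minimal R sketch, purely hypothetical and not drawn from any exam, fits a regression tree and a gamma GLM to simulated severity data and compares them, the kind of analysis a candidate would then discuss in the report. The rpart package is named only as an example of a package that could be preinstalled; the data, variable names and model choices are assumptions made for illustration.

    # Hypothetical sketch (not an actual exam task): investigate a regression
    # tree and compare it informally with a GLM on simulated severity data.
    library(rpart)   # recursive partitioning, used here to fit a regression tree

    set.seed(1213)
    n      <- 1000
    age    <- runif(n, 18, 80)
    region <- factor(sample(c("urban", "rural"), n, replace = TRUE))
    # Simulated claim severities whose mean depends on age and region
    sev <- rgamma(n, shape = 2,
                  rate = 2 / exp(5 + 0.02 * age + 0.3 * (region == "urban")))
    dat <- data.frame(sev, age, region)

    # Candidate model 1: a regression tree, as the project statement might direct
    tree_fit <- rpart(sev ~ age + region, data = dat, method = "anova")
    printcp(tree_fit)    # complexity table used to decide how far to prune

    # Candidate model 2: a gamma GLM with a log link, for the written comparison
    glm_fit <- glm(sev ~ age + region, family = Gamma(link = "log"), data = dat)

    # Compare in-sample fit; a real report would also weigh holdout performance
    # and interpretability before recommending one approach over the other
    pred_tree <- predict(tree_fit, newdata = dat)
    pred_glm  <- predict(glm_fit, newdata = dat, type = "response")
    c(tree_rmse = sqrt(mean((dat$sev - pred_tree)^2)),
      glm_rmse  = sqrt(mean((dat$sev - pred_glm)^2)))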

By taking this approach, the SOA will be better able to properly assess which candidates have mastered the fundamental skills related to using predictive analytics to solve real (or, given the time constraint, realistic) problems.

Other Applications for the New Approach

While this platform was designed specifically for the PA Exam, it will open up opportunities to improve existing exams in the future. For example, the Long-term Actuarial Mathematics Exam (the fall 2018 successor to Models for Life Contingencies) is currently a mixture of multiple choice and written answer administered by paper and pencil. If that exam were moved to the new platform we are using for predictive analytics, candidates could type rather than handwrite their answers, ensuring that their responses read as they intend. They would also have access to Excel as a calculation tool. Some fellowship exams could be delivered in a similar manner. The profession and employers benefit because we can ask questions that are more relevant to actuarial practice, with solutions based on tools representative of those candidates use in their jobs. Volunteers benefit because they won't need to strain to decipher handwriting. Everyone benefits from a more secure method of delivering papers from the test center to the SOA, and then to graders.

We are enthusiastic about this new approach to assessing candidates and look forward to the improvements that will flow from it in the future.

Stuart Klugman, FSA, CERA, is senior staff fellow, Education, at the Society of Actuaries.
Martha Sikaras is director, International Education Programs, at the Society of Actuaries.