AI in Actuarial Science

The CEO of Lemonade, Daniel Schreiber, stated in a recent talk that:

“The future of insurance will be staffed by bots rather than brokers and AI in favor of actuaries.”

Despite sounding rather impressive, the YouTube video of the rest of Dan's talk doesn't go into much detail explaining his thinking. That being said, this statement seems to be predicated on the view that actuarial science won't evolve to embrace AI and bring the tools of modern statistical and machine learning into the everyday practice of actuaries. I personally think this view is inaccurate, and that actuaries practicing in "the future" will turn to a machine learning algorithm just as easily as they will to the traditional tools of the trade. Combined with the domain-specific knowledge of insurance that actuarial training brings, I think this will be a powerful combination that will serve the insurance industry well.

One reason I feel comfortable making this prediction is that the actuarial literature is starting to examine AI and machine learning, and how they can be applied to traditional actuarial problems. Many of the best examples I have seen are from Professor Mario Wüthrich (who, together with his colleagues, is credited with the thinking and formula behind the Solvency II reserving risk formula). His work includes applying deep neural networks to telematics data and machine learning approaches to the problem of IBNR reserving. Other recent papers include one applying deep auto-encoders to analyse population mortality in a Lee-Carter setup and another examining gradient boosted Tweedie models for pricing.

I plan to examine some of these new ideas in actuarial science on this blog in the next couple of posts, and also provide code in R and Python on my GitHub account that will allow anyone who is interested to see how to apply these ideas practically. First up will be auto-encoders, which are a form of dimensionality reduction used to summarize high-dimensional data in a low-dimensional form. As an example of high-dimensional data, think of a life table that has an entry for the mortality rate at each age: a life table with rates up to age 110 can be thought of as a 110-dimensional vector. Although summarizing a life table with only a few parameters is not a new idea (the mortality laws, the Lee-Carter model using SVD, and Brass' logit transform spring to mind), recent work has shown that neural networks can estimate these summaries more accurately than traditional methods.
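To make the dimensionality-reduction idea concrete, here is a minimal sketch of the SVD-based compression that the Lee-Carter model performs, applied to a synthetic Gompertz-style mortality surface (the data, parameter values, and variable names are all illustrative assumptions, not a real life table or the method from any of the papers mentioned). Each year's 110-dimensional curve of log mortality rates is summarized by a single number, which is exactly the kind of low-dimensional summary an auto-encoder learns with a non-linear mapping instead of a linear one.

```python
import numpy as np

# Synthetic log-mortality surface: Gompertz-like rates for 20 years of data.
# All parameters here are illustrative, not calibrated to any real population.
ages = np.arange(0, 110)            # ages 0..109 -> a 110-dimensional vector per year
years = np.arange(2000, 2020)
rng = np.random.default_rng(0)

# log m(x, t) = level + slope * age + improvement trend over time + noise
log_m = (-9.0 + 0.085 * ages[None, :]
         - 0.01 * (years[:, None] - 2000)
         + rng.normal(0.0, 0.02, size=(len(years), len(ages))))

# Lee-Carter-style compression: centre by the average age profile a_x,
# then take a rank-1 SVD of the residuals.  Each 110-dimensional yearly
# curve is summarized by one scalar k_t, with a shared age loading b_x.
a_x = log_m.mean(axis=0)
resid = log_m - a_x
U, s, Vt = np.linalg.svd(resid, full_matrices=False)
k_t = U[:, 0] * s[0]                # one number per year (the "code")
b_x = Vt[0]                         # one loading per age (the "decoder")

# Reconstruct the full surface from the 1-dimensional summary.
recon = a_x + np.outer(k_t, b_x)
rmse = np.sqrt(np.mean((recon - log_m) ** 2))
print(f"reconstruction RMSE on log rates: {rmse:.4f}")
```

The reconstruction error is close to the noise level of the synthetic data, showing how little is lost by compressing each yearly curve to a single parameter. An auto-encoder replaces the linear maps (`k_t`, `b_x`) with neural networks, which is what allows it to fit the curves more accurately.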

I also plan to build a section on my website that acts as a guide to this emerging field of actuarial science that I hope will be useful to other actuaries (and professionals) who want to understand how AI and machine learning can be applied to actuarial problems. As a start, some of the excellent material from Prof Wüthrich is available on his website.

Insurance seems to have become an exciting industry to work in these days and I am equally excited about the opportunities that lie ahead for actuaries and other insurance professionals.
