Smoothness and monotonicity constraints for neural networks using ICEnet

I am pleased to share a new paper on adding smoothness and monotonicity constraints to neural networks. This is joint work with Mario Wüthrich.

In this paper, we propose a novel method for enforcing smoothness and monotonicity constraints within deep learning models used for actuarial tasks, such as pricing. The method is called ICEnet, which stands for Individual Conditional Expectation network. It’s based on augmenting the original data with pseudo-data that reflect the structure of the variables that need to be constrained. We show how to design and train the ICEnet using a compound loss function that balances accuracy and constraints, and we provide example applications using real-world datasets. The structure of the ICEnet is shown in the following figure.
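To give a flavour of the idea, here is a minimal sketch of how constraint penalties on pseudo-data predictions might look. It assumes the model has already been evaluated along a grid of values of one constrained feature for each observation (an individual conditional expectation profile); monotonicity is penalised via a hinge on first differences and smoothness via squared second differences. The function name, penalty weights, and exact penalty forms are illustrative assumptions, not the precise definitions from the paper.

```python
import numpy as np

def constraint_penalties(pred_grid, lam_mono=1.0, lam_smooth=1.0):
    """Illustrative constraint penalties on pseudo-data predictions.

    pred_grid: array of shape (n_obs, n_grid) holding model predictions for
    each observation, evaluated along a grid of values of one constrained
    feature (an individual conditional expectation profile).
    Note: hypothetical helper; the paper's compound loss may differ in detail.
    """
    # Monotonicity: penalise decreases along the grid (hinge on first differences)
    d1 = np.diff(pred_grid, axis=1)
    mono = np.mean(np.maximum(-d1, 0.0))
    # Smoothness: penalise curvature via squared second differences
    d2 = np.diff(pred_grid, n=2, axis=1)
    smooth = np.mean(d2 ** 2)
    return lam_mono * mono + lam_smooth * smooth
```

In training, a term like this would be added to the usual accuracy loss (for example, the Poisson deviance for claims frequency), so that a single compound objective trades off fit against the constraints.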

Applying the model produces predictions that are smooth and vary with risk factors in line with intuition. Below is an example where applying constraints forces a neural network to produce predictions of claims frequency that increase with population density and vehicle power.

You can read the full paper at https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4449030 and we welcome any feedback.
