Uncertainty quantification with compound density networks (workshop contribution), 2019
Research Hub C: Secure Systems
RC 9: Intelligent Security Systems
Despite the huge success of deep neural networks (NNs), finding good mechanisms for quantifying their prediction uncertainty is still an open problem. Bayesian neural networks are one of the most popular approaches to uncertainty quantification. On the other hand, it was recently shown that ensembles of NNs, which belong to the class of mixture models, can also be used to quantify prediction uncertainty. In this paper, we enhance the flexibility of mixture models by replacing the fixed mixing weights with an adaptive, input-dependent distribution represented by NNs, and by considering uncountably many mixture components. The resulting class of models can be seen as the continuous counterpart to mixture density networks and is therefore referred to as compound density networks (CDNs). We employ likelihood maximization to train CDNs, and empirically show that they yield better uncertainty estimates on out-of-distribution data and are more robust to adversarial examples than previous approaches.
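The core idea of input-dependent mixing weights can be illustrated with a toy sketch. The following NumPy snippet (all names and parameters are hypothetical, not taken from the paper) shows a finite Gaussian mixture whose mixing weights are produced by a softmax over the input, as in mixture density networks; CDNs generalize this to uncountably many components. The predictive variance decomposes via the law of total variance into within-component and between-component terms.

```python
import numpy as np

rng = np.random.default_rng(0)

K = 3  # number of mixture components (finite here for illustration only)
D = 2  # input dimensionality

# Randomly initialized linear "networks" (illustrative, not a trained model)
W_pi = rng.normal(size=(K, D))   # produces input-dependent mixing logits
W_mu = rng.normal(size=(K, D))   # per-component linear mean functions
sigma = np.full(K, 0.5)          # fixed per-component standard deviations

def softmax(z):
    z = z - z.max()              # stabilize before exponentiation
    e = np.exp(z)
    return e / e.sum()

def predict(x):
    """Mean and variance of p(y|x) = sum_k pi_k(x) * N(y; mu_k(x), sigma_k^2)."""
    pi = softmax(W_pi @ x)       # adaptive, input-dependent mixing weights
    mu = W_mu @ x                # component means
    mean = pi @ mu
    # Law of total variance: E[Var] + Var[E] of the component means
    var = pi @ (sigma**2 + mu**2) - mean**2
    return mean, var

x = np.array([1.0, -0.5])
mean, var = predict(x)
```

The between-component spread of the means is what lets such a mixture express higher uncertainty away from the training data, which is the behavior the abstract reports for out-of-distribution inputs.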