Volume 55, pp. 142-168, 2022.

Sparse mixture models inspired by ANOVA decompositions

Johannes Hertrich, Fatima Antarou Ba, and Gabriele Steidl

Abstract

Inspired by the analysis of variance (ANOVA) decomposition of functions, we propose a Gaussian-uniform mixture model on the high-dimensional torus which relies on the assumption that the function to be approximated can be well explained by interactions among a limited number of variables. We consider three model approaches, namely wrapped Gaussians, diagonal wrapped Gaussians, and products of von Mises distributions. The sparsity of the mixture model is ensured by the fact that its summands are products of Gaussian-like density functions acting on low-dimensional spaces and uniform probability densities defined on the remaining directions. To learn such a sparse mixture model from given samples, we propose an objective function consisting of the negative log-likelihood function of the mixture model and a regularizer that penalizes the number of its summands. To minimize this functional, we combine the expectation-maximization (EM) algorithm with a proximal step that takes the regularizer into account. To decide which summands of the mixture model are important, we apply a Kolmogorov-Smirnov test. Numerical examples demonstrate the performance of our approach.
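To make the EM-with-proximal-step idea concrete, the following is a minimal, illustrative Python sketch and not the authors' actual algorithm: it fits a one-dimensional wrapped Gaussian mixture on the circle (rather than products of low-dimensional densities and uniform densities on a high-dimensional torus) and replaces the paper's regularizer with a simple hard thresholding of small mixture weights as a toy stand-in for penalizing the number of summands. All function and variable names (wrapped_gaussian_pdf, em_step_with_proximal, threshold) are assumptions introduced here for illustration.

```python
import numpy as np

def wrapped_gaussian_pdf(x, mu, sigma, terms=10):
    """Density of a wrapped Gaussian on [0, 2*pi), approximated by
    truncating the wrapping sum over shifts of 2*pi*k."""
    ks = np.arange(-terms, terms + 1)
    diffs = x[..., None] - mu + 2.0 * np.pi * ks
    return np.sum(
        np.exp(-0.5 * (diffs / sigma) ** 2) / (np.sqrt(2.0 * np.pi) * sigma),
        axis=-1,
    )

def em_step_with_proximal(x, weights, mus, sigmas, threshold=1e-2):
    """One EM iteration for a 1D wrapped Gaussian mixture, followed by a
    proximal-style hard thresholding of small mixture weights (a toy
    analogue of the sparsity regularizer on the number of summands)."""
    # E-step: posterior responsibilities of each component for each sample.
    dens = np.stack(
        [w * wrapped_gaussian_pdf(x, m, s) for w, m, s in zip(weights, mus, sigmas)],
        axis=1,
    )
    resp = dens / dens.sum(axis=1, keepdims=True)

    # M-step: circular means and dispersions per component (approximate
    # updates; the exact wrapped-Gaussian M-step has no simple closed form).
    nk = resp.sum(axis=0)
    weights = nk / nk.sum()
    sin_m = resp.T @ np.sin(x)
    cos_m = resp.T @ np.cos(x)
    mus = np.mod(np.arctan2(sin_m, cos_m), 2.0 * np.pi)
    r = np.sqrt(sin_m**2 + cos_m**2) / nk
    sigmas = np.sqrt(np.maximum(-2.0 * np.log(np.clip(r, 1e-12, 1.0)), 1e-6))

    # Proximal-style step: drop components whose weight falls below the
    # threshold and renormalize, enforcing sparsity in the number of summands.
    keep = weights > threshold
    weights, mus, sigmas = weights[keep], mus[keep], sigmas[keep]
    weights /= weights.sum()
    return weights, mus, sigmas

# Example usage on synthetic circular data (illustrative only):
rng = np.random.default_rng(0)
x = np.mod(np.concatenate([rng.normal(1.0, 0.3, 500),
                           rng.normal(4.0, 0.5, 500)]), 2.0 * np.pi)
weights = np.full(5, 0.2)
mus = rng.uniform(0.0, 2.0 * np.pi, 5)
sigmas = np.full(5, 1.0)
for _ in range(50):
    weights, mus, sigmas = em_step_with_proximal(x, weights, mus, sigmas)
```

Starting from an over-parameterized mixture and pruning negligible weights after each EM sweep mirrors, in spirit, how the paper's proximal step encourages a small number of active summands; the paper itself additionally uses a Kolmogorov-Smirnov test to judge which surviving summands are important.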


Key words

sparse mixture models, ANOVA decomposition, wrapped Gaussian distribution, von Mises distribution, approximation of high-dimensional probability density functions, Kolmogorov-Smirnov test

AMS subject classifications

62H30, 62H12, 65D15, 65C60, 62H10
