Weights estimation in mixture models with entropic optimal transport.
Abstract
While optimal transport makes it possible to compare probability measures with different structures, its computational cost has limited its use in practice. To reduce the algorithmic cost, M. Cuturi proposed in 2013 to regularize the optimal transport problem by adding an entropic penalty on the transport plan. The solution of this regularized minimization problem, called entropic optimal transport, defines an alternative way of comparing probability measures. In this talk, through the study of a mixture model parameterized by its weights, I will discuss the statistical effect of regularizing the transport plan. For a chosen regularization parameter, the estimator we consider is defined as the minimizer of a loss function involving the entropic optimal transport cost. In this framework, we obtain a collection of estimators, each weights estimator corresponding to a fixed regularization parameter. Of particular interest to us is the impact of this regularization parameter on the performance of the weights estimators. We derive upper bounds on the expected excess risks of the estimators considered. From these results we derive an automatic choice of the regularization parameter. As a by-product of our analysis, we also propose a way to automatically choose the number of iterations for the algorithm used to approximate the entropic optimal transport.
This talk is based on the preprint https://arxiv.org/abs/2210.06934, which is joint work with Jérémie Bigot, Boris Hejblum and Arthur Leclaire.
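For readers unfamiliar with the entropic regularization mentioned in the abstract: in the discrete case it is typically approximated by Sinkhorn's iterations, which alternately rescale the rows and columns of a Gibbs kernel built from the cost matrix. The sketch below is only an illustration under standard conventions (measures `a`, `b`, cost matrix `C`, regularization parameter `eps`, iteration count `n_iter` are generic names), not the implementation used in the preprint.

```python
import numpy as np

def sinkhorn(a, b, C, eps, n_iter=500):
    """Approximate the entropic optimal transport plan between discrete
    measures a and b, for cost matrix C and regularization eps.

    Returns the transport plan P and the (unregularized) transport cost
    <P, C> evaluated at the entropic plan.
    """
    K = np.exp(-C / eps)              # Gibbs kernel
    u = np.ones_like(a)
    for _ in range(n_iter):           # alternate row/column rescalings
        v = b / (K.T @ u)
        u = a / (K @ v)
    P = u[:, None] * K * v[None, :]   # transport plan with marginals a, b
    return P, np.sum(P * C)

# Tiny usage example on two uniform measures over two points.
a = np.array([0.5, 0.5])
b = np.array([0.5, 0.5])
C = np.array([[0.0, 1.0],
              [1.0, 0.0]])
P, cost = sinkhorn(a, b, C, eps=0.1)
```

As `eps` decreases, the plan concentrates on the unregularized optimal coupling but more iterations are needed, which is the trade-off behind the automatic choices of `eps` and of the iteration count discussed in the talk.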