
Maren Raus, Yara Elshiaty, Stefania Petra, Accelerated Bregman divergence optimization with SMART: An information geometric point of view

DOI: 10.23952/jano.6.2024.1.01
Volume 6, Issue 1, 1 April 2024, Pages 1-40


Abstract. We investigate the problem of minimizing the Kullback-Leibler divergence between a linear model Ax and a positive vector b over different convex domains (the positive orthant, an n-dimensional box, and the probability simplex). Our focus is on the SMART method (simultaneous multiplicative algebraic reconstruction technique), which employs efficient multiplicative updates. We explore the exponentiated gradient method, which can be viewed both as a Bregman proximal gradient method and as Riemannian gradient descent on the parameter manifold of a corresponding distribution from the exponential family. This dual interpretation enables us to establish connections and to derive accelerated SMART iterates while smoothly incorporating constraints. The performance of the proposed acceleration schemes is demonstrated on large-scale numerical examples.
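To make the multiplicative updates concrete, here is a minimal sketch (not taken from the paper; the function names, step size, and the column-normalization assumption are illustrative) of the exponentiated-gradient form of a SMART iteration for min_x KL(Ax, b) over the positive orthant:

```python
import numpy as np

def kl(p, q):
    """Generalized Kullback-Leibler divergence KL(p, q) for positive vectors."""
    return np.sum(p * np.log(p / q) - p + q)

def smart(A, b, x0, tau=1.0, iters=500):
    """Sketch of SMART as exponentiated gradient / multiplicative updates for
    min_x KL(Ax, b) over x > 0. Iterates stay positive automatically.
    tau = 1.0 is a standard choice when the columns of A sum to at most 1
    (an illustrative assumption, not the paper's step-size rule)."""
    x = x0.copy()
    for _ in range(iters):
        grad = A.T @ np.log(A @ x / b)   # gradient of x -> KL(Ax, b)
        x = x * np.exp(-tau * grad)      # multiplicative (exponentiated-gradient) step
    return x

# Small usage example on a consistent system b = A x_true, so min KL = 0.
rng = np.random.default_rng(0)
A = rng.random((50, 20))
A /= A.sum(axis=0)                       # normalize column sums to 1
x_true = rng.random(20) + 0.1            # strictly positive ground truth
b = A @ x_true
x = smart(A, b, x0=np.ones(20))
print(kl(A @ x, b))                      # should be close to 0
```

Since the update multiplies x entrywise by positive factors, positivity is enforced for free, consistent with the abstract's emphasis on smoothly incorporating constraints.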


How to Cite this Article:
M. Raus, Y. Elshiaty, S. Petra, Accelerated Bregman divergence optimization with SMART: An information geometric point of view, J. Appl. Numer. Optim. 6 (2024), 1-40.