
Ziyue Hu, Qiao-Li Dong, Xiaolong Qin, The boosted proximal difference-of-convex algorithm

DOI: 10.23952/jano.5.2023.2.05
Volume 5, Issue 2, 1 August 2023, Pages 255-269


Abstract. In this paper, we propose a boosted proximal difference-of-convex algorithm for solving a minimization problem whose objective is the sum of a smooth convex function and a continuously differentiable convex function, minus a continuous and strongly convex function. By adding an additional line search step, the convergence of the proximal difference-of-convex algorithm is accelerated. We prove that any limit point of the iterative sequence is a critical point of the generalized difference-of-convex program, and that the corresponding objective values decrease monotonically and converge. Assuming that the objective function satisfies the strong Kurdyka-Łojasiewicz inequality, we prove the convergence of the whole sequence generated by the proposed algorithm and establish its convergence rate. The strong convexity assumed for the convex part of the minimization problem in [F.J. Aragón Artacho, P.T. Vuong, The boosted difference of convex functions algorithm for nonsmooth functions, SIAM J. Optim. 30 (2020), 980-1006] is replaced by Lipschitz continuity of the gradient of one of the convex functions. Numerical experiments demonstrate the performance of the proposed algorithm in comparison with the proximal difference-of-convex algorithm.
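For orientation, the following is a minimal sketch of the kind of iteration described above: a proximal difference-of-convex step followed by a BDCA-style backtracking line search along the resulting direction, in the spirit of Aragón Artacho and Vuong. The DC decomposition φ = f + g − h and the symbols λ, ρ, and β̄ are illustrative assumptions and need not match the paper's exact notation or hypotheses.

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Illustrative sketch of one boosted proximal DC iteration.
% Assumptions (not the paper's exact setup): \varphi = f + g - h is the
% DC objective, \lambda > 0 a step size, \rho > 0 and \bar\beta > 0
% line-search parameters.
\begin{align*}
  \xi^k &\in \partial h(x^k)
    && \text{(subgradient of the subtracted convex part)}\\
  y^k &= \operatorname{prox}_{\lambda f}\!\bigl(x^k - \lambda(\nabla g(x^k) - \xi^k)\bigr)
    && \text{(proximal DC step)}\\
  d^k &= y^k - x^k
    && \text{(boosting direction)}\\
  x^{k+1} &= y^k + \beta_k d^k, \quad \beta_k \in [0,\bar\beta] \text{ largest with}\\
  &\quad \varphi(y^k + \beta_k d^k) \le \varphi(y^k) - \rho\,\beta_k^2 \lVert d^k \rVert^2
    && \text{(backtracking line search)}
\end{align*}
\end{document}
```

When β_k = 0 the scheme reduces to the plain proximal DC step; the extra move along d^k, accepted only under sufficient decrease, is the "boost" responsible for the acceleration.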


How to Cite this Article:
Z. Hu, Q.L. Dong, X. Qin, The boosted proximal difference-of-convex algorithm, J. Appl. Numer. Optim. 5 (2023), 255-269.