Ignace Loris
On a Fixed-Point Continuation Method for a Convex Optimization Problem
We consider a variation of the classical proximal-gradient algorithm for the iterative minimization of a cost function consisting of a sum of two terms, one smooth and the other prox-simple, and whose relative weight is determined by a pen…
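The combination described here — a proximal-gradient iteration whose penalty weight is driven along a continuation schedule — can be illustrated with a minimal sketch. This is the classical ISTA iteration with soft-thresholding, warm-started over a decreasing sequence of penalties; it is not the paper's specific variation, and the starting penalty and geometric schedule below are illustrative choices.

```python
import numpy as np

def soft_threshold(v, t):
    # proximal operator of t*||.||_1: shrink every entry toward zero by t
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_grad_continuation(A, b, lam_final, n_stages=5, iters=200):
    # Proximal-gradient (ISTA) iterations for
    #   min_x 0.5*||A x - b||^2 + lam*||x||_1,
    # warm-started along a decreasing sequence of penalty weights
    # ("continuation"): each stage starts from the previous solution.
    L = np.linalg.norm(A, 2) ** 2            # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for lam in np.geomspace(10 * lam_final, lam_final, n_stages):
        for _ in range(iters):
            grad = A.T @ (A @ x - b)         # gradient of the smooth term
            x = soft_threshold(x - grad / L, lam / L)
    return x
```

Each stage's iteration is a fixed-point map whose fixed points are the minimizers for the current penalty, which is why warm-starting across penalties is natural here.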
Convergence analysis of a primal-dual optimization-by-continuation algorithm
We present a numerical iterative optimization algorithm for the minimization of a cost function consisting of a linear combination of three convex terms, one of which is differentiable, a second one is prox-simple and the third one is the …
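For the three-term structure mentioned here (one differentiable term, one prox-simple term), a standard point of reference is the Condat–Vũ primal-dual scheme, assuming — for illustration only, since the abstract is truncated — that the third term is a prox-simple function composed with a linear operator. The sketch below is that classical scheme, not the paper's continuation algorithm; the denoising problem, the difference operator `D`, and the stepsizes are illustrative.

```python
import numpy as np

def condat_vu(b, lam, iters=500):
    # Condat-Vu primal-dual iteration for the model problem
    #   min_{x >= 0} 0.5*||x - b||^2 + lam*||D x||_1,
    # with f(x) = 0.5*||x - b||^2 (differentiable), g = indicator of x >= 0
    # (prox-simple), and h(Dx) = lam*||Dx||_1 composed with a linear operator.
    n = b.size
    D = np.diff(np.eye(n), axis=0)          # (n-1) x n first-difference operator
    tau, sigma = 0.25, 0.5                  # satisfy 1/tau - sigma*||D||^2 >= L_f/2
    x, y = np.zeros(n), np.zeros(n - 1)
    for _ in range(iters):
        # primal step: prox of g is projection onto the nonnegative orthant
        x_new = np.maximum(x - tau * ((x - b) + D.T @ y), 0.0)
        # dual step: prox of sigma*h* is projection onto the inf-ball of radius lam
        y = np.clip(y + sigma * D @ (2 * x_new - x), -lam, lam)
        x = x_new
    return x
```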
Primal-dual splitting scheme with backtracking for handling with epigraphic constraint and sparse analysis regularization
in Proceedings of iTWIST'20, Paper-ID: 12, Nantes, France, December 2-4, 2020
The convergence of many proximal algorithms involving a gradient descent relies on its Lipschitz constant. To avoid computing it, backtracking rules can be used. While such a rule has already been designed for the forward-backward algorith…
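The backtracking rule already designed for the forward-backward algorithm, which this abstract takes as its starting point, can be sketched as follows. This is the classical Beck–Teboulle-style rule — shrink the step until a quadratic upper bound holds at the prox-gradient point — not the paper's primal-dual extension; the lasso test problem in the usage note is an illustrative choice.

```python
import numpy as np

def fb_backtracking(f, grad_f, prox_g, x0, t0=1.0, beta=0.5, iters=100):
    # Forward-backward splitting without knowing the Lipschitz constant:
    # the stepsize t is halved until the quadratic upper bound
    #   f(z) <= f(x) + <grad f(x), z - x> + ||z - x||^2 / (2t)
    # holds at the prox-gradient point z.
    x, t = x0.astype(float).copy(), t0
    for _ in range(iters):
        g = grad_f(x)
        while True:
            z = prox_g(x - t * g, t)
            d = z - x
            if f(z) <= f(x) + g @ d + (d @ d) / (2 * t):
                break
            t *= beta                       # backtrack: shrink the step
        x = z
    return x
```

Because the accepted step always satisfies the upper bound, the composite objective is non-increasing along the iterates, which is the property the convergence analysis rests on.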
On starting and stopping criteria for nested primal-dual iterations
The importance of an adequate inner loop starting point (as opposed to a sufficient inner loop stopping rule) is discussed in the context of a numerical optimization algorithm consisting of nested primal-dual proximal-gradient iterations. …
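The warm-starting idea discussed here — carrying the inner loop's variable over from one outer iteration to the next instead of restarting it — can be illustrated with a small sketch. This is a generic nested scheme (an outer proximal-gradient loop whose analysis-l1 prox is computed by inner dual projected-gradient iterations), not the paper's specific algorithm or its starting/stopping criteria; the problem and operator `D` are illustrative.

```python
import numpy as np

def prox_analysis_l1(v, w, D, y, inner_iters=20, s=0.25):
    # Inner loop: approximate prox of w*||D.||_1 at v by projected gradient
    # ascent on the dual (box-constrained) problem, warm-started at y.
    for _ in range(inner_iters):
        y = np.clip(y + s * (D @ (v - D.T @ y)), -w, w)
    return v - D.T @ y, y

def nested_prox_grad(A, b, lam, outer_iters=100, inner_iters=20):
    # Outer proximal-gradient loop for 0.5*||Ax - b||^2 + lam*||Dx||_1;
    # each inner loop STARTS from the dual variable of the previous
    # outer iteration (the warm start the abstract refers to).
    n = A.shape[1]
    D = np.diff(np.eye(n), axis=0)
    t = 1.0 / np.linalg.norm(A, 2) ** 2
    x, y = np.zeros(n), np.zeros(n - 1)
    for _ in range(outer_iters):
        v = x - t * (A.T @ (A @ x - b))
        x, y = prox_analysis_l1(v, t * lam, D, y, inner_iters)
    return x
```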
On the convergence of a linesearch based proximal-gradient method for nonconvex optimization
We consider a variable metric line-search based proximal gradient method for the minimization of the sum of a smooth, possibly nonconvex function plus a convex, possibly nonsmooth term. We prove convergence of this iterative algorithm to a…
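A line-search based proximal-gradient step of the kind described — Armijo backtracking along the proximal direction, with a possibly nonconvex smooth term — can be sketched as follows. This is a plain Euclidean version (no variable metric) and not the paper's exact method; the Cauchy-loss test problem in the verification is an illustrative nonconvex choice.

```python
import numpy as np

def prox_linesearch(f, grad_f, g, prox_g, x0, alpha=1.0, delta=0.5,
                    sigma=1e-4, iters=200):
    # One proximal-gradient step d = prox(x - alpha*grad f(x)) - x, then an
    # Armijo-type line search along d; f may be nonconvex, g is convex.
    x = x0.astype(float).copy()
    for _ in range(iters):
        d = prox_g(x - alpha * grad_f(x), alpha) - x
        # predicted decrease; nonpositive by the prox optimality condition
        Delta = grad_f(x) @ d + g(x + d) - g(x)
        lam = 1.0
        while (f(x + lam * d) + g(x + lam * d) >
               f(x) + g(x) + sigma * lam * Delta) and lam > 1e-10:
            lam *= delta                    # backtrack along the direction d
        x = x + lam * d
    return x
```

The search is on the step *along* the fixed-stepsize proximal direction, which is what distinguishes this family from backtracking rules that shrink the prox stepsize itself.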
On the constrained minimization of smooth Kurdyka-Łojasiewicz functions with the scaled gradient projection method
The scaled gradient projection (SGP) method is a first-order optimization method applicable to the constrained minimization of smooth functions and exploiting a scaling matrix multiplying the gradient and a variable steplength parameter to…
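The SGP ingredients named in this abstract — a scaling matrix multiplying the gradient and a variable steplength — can be sketched for the simplest case of a diagonal scaling and box constraints, where the scaled projection reduces to componentwise clipping. This is a generic SGP-style sketch, not the paper's analysis; the Barzilai–Borwein steplength and Armijo parameters are illustrative choices.

```python
import numpy as np

def sgp(f, grad_f, lo, hi, x0, s=None, iters=200):
    # Scaled gradient projection sketch for min f(x) over the box [lo, hi]:
    # diagonal scaling s, variable (Barzilai-Borwein) steplength alpha, and
    # Armijo backtracking along the feasible direction d = P(x - alpha*s*g) - x.
    x = np.clip(x0.astype(float), lo, hi)
    s = np.ones_like(x) if s is None else s   # problem-dependent diagonal scaling
    alpha, g = 1.0, grad_f(x)
    for _ in range(iters):
        y = np.clip(x - alpha * s * g, lo, hi)  # projection onto the box
        d = y - x
        lam = 1.0
        while f(x + lam * d) > f(x) + 1e-4 * lam * (g @ d) and lam > 1e-10:
            lam *= 0.5
        x_new = x + lam * d
        g_new = grad_f(x_new)
        dx, dg = x_new - x, g_new - g
        if dx @ dg > 0:                       # safeguarded BB1 steplength
            alpha = min(max((dx @ dx) / (dx @ dg), 1e-5), 1e5)
        x, g = x_new, g_new
    return x
```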
Proceedings of the third "international Traveling Workshop on Interactions between Sparse models and Technology" (iTWIST'16)
The third edition of the "international Traveling Workshop on Interactions between Sparse models and Technology" (iTWIST) took place in Aalborg, the 4th largest city in Denmark situated beautifully in the northern part of the country, fr…
On the convergence of variable metric line-search based proximal-gradient method under the Kurdyka-Lojasiewicz inequality
We consider a variable metric line-search based proximal gradient method for the minimization of the sum of a smooth, possibly nonconvex function plus a convex, possibly nonsmooth term. The general convergence result on this method is the …
Variable Metric Inexact Line-Search-Based Methods for Nonsmooth Optimization
We develop a new proximal-gradient method for minimizing the sum of a differentiable, possibly nonconvex, function plus a convex, possibly nondifferentiable, function. The key features of the proposed method are the definition of a suitabl…