Sadok Jerad
On Global Rates for Regularization Methods based on Secant Derivative Approximations
An inexact framework for high-order adaptive regularization methods is presented, in which approximations may be used for the $p$th-order tensor, based on lower-order derivatives. Between each recalculation of the $p$th-order derivative ap…
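For orientation, a $p$th-order adaptive regularization method typically computes its step by (approximately) minimizing a regularized Taylor model of the form

$$m_k(s) = f(x_k) + \sum_{j=1}^{p} \frac{1}{j!}\,\nabla^j f(x_k)[s]^j + \frac{\sigma_k}{(p+1)!}\,\|s\|^{p+1},$$

where $\sigma_k>0$ is the adaptive regularization parameter. This is only the standard model, shown as a sketch; in a secant-type variant such as the one described above, the exact tensor $\nabla^p f(x_k)$ would be replaced between recomputations by an approximation built from lower-order derivatives.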
A Stochastic Objective-Function-Free Adaptive Regularization Method with Optimal Complexity
A fully stochastic second-order adaptive-regularization method for unconstrained nonconvex optimization is presented which never computes the objective-function value, yet achieves the optimal $\mathcal{O}(\epsilon^{-3/2})$ complexity bound f…
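To illustrate the objective-function-free idea, here is a minimal sketch under simplifying assumptions (the naive gradient-descent inner solver and the Adagrad-like $\sigma$ update rule are hypothetical choices for illustration, not the paper's algorithm): a second-order step is computed from a cubic-regularized model whose regularization weight is driven by accumulated gradient norms instead of a function-value ratio test.

import numpy as np

def ar2_step(g, H, sigma, inner_iters=200, lr=1e-2):
    # Approximately minimize the cubic-regularized model
    #   m(s) = g^T s + 0.5 s^T H s + (sigma / 3) * ||s||^3
    # with plain gradient descent (a deliberately naive inner solver).
    s = np.zeros_like(g)
    for _ in range(inner_iters):
        s = s - lr * (g + H @ s + sigma * np.linalg.norm(s) * s)
    return s

def offo_ar2(grad_f, hess_f, x0, outer_iters=100, sigma0=1.0):
    # Outer loop never evaluates f: the regularization weight sigma grows
    # with accumulated gradient norms (a hypothetical Adagrad-like rule).
    x, acc = np.asarray(x0, dtype=float).copy(), 0.0
    for _ in range(outer_iters):
        g, H = grad_f(x), hess_f(x)
        acc += float(np.linalg.norm(g)) ** 2
        sigma = sigma0 * np.sqrt(1.0 + acc)
        x = x + ar2_step(g, H, sigma)
    return x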
Complexity of Adagrad and other first-order methods for nonconvex optimization problems with bounds constraints
A parametric class of trust-region algorithms for constrained nonconvex optimization is analyzed, where the objective function is never computed. By defining appropriate first-order stationarity criteria, we are able to extend the Adagrad …
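A minimal sketch of what an Adagrad-type method for bound constraints can look like (illustrative only: the project-after-step scheme and parameter values are assumptions, not the algorithm analyzed in the paper):

import numpy as np

def projected_adagrad(grad_f, x0, lower, upper, lr=0.5, eps=1e-8, iters=500):
    # Per-coordinate Adagrad scaling followed by projection onto the
    # box [lower, upper]; no objective-function values are used.
    x = np.clip(np.asarray(x0, dtype=float), lower, upper)
    acc = np.zeros_like(x)  # accumulated squared gradients
    for _ in range(iters):
        g = grad_f(x)
        acc += g ** 2
        x = np.clip(x - lr * g / (np.sqrt(acc) + eps), lower, upper)
    return x

# Example: minimize ||x - 2||^2 over the box [0, 1]^3 (minimizer: all ones).
x_star = projected_adagrad(lambda x: 2.0 * (x - 2.0), np.zeros(3), 0.0, 1.0)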
An adaptive regularization method in Banach spaces
Yet another fast variant of Newton's method for nonconvex optimization
A class of second-order algorithms for minimizing smooth nonconvex functions is proposed that alternates between regularized Newton and negative curvature steps in an iteration-dependent subspace. In most cases, the Hessian matrix is regul…
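The alternation described above can be illustrated with the following sketch (assumptions: a dense eigendecomposition, a fixed shift, and a full-space step are used here for clarity; the paper's iteration-dependent subspaces and regularization rules are not reproduced):

import numpy as np

def second_order_step(g, H, reg=1e-3, curv_tol=-1e-8):
    # Take a negative-curvature step when the Hessian has a sufficiently
    # negative eigenvalue, otherwise a shifted (regularized) Newton step.
    lam, V = np.linalg.eigh(H)
    if lam[0] < curv_tol:
        d = V[:, 0] if g @ V[:, 0] <= 0.0 else -V[:, 0]
        return abs(lam[0]) * d  # descent along the negative-curvature direction
    return -np.linalg.solve(H + reg * np.eye(len(g)), g)  # regularized Newton step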
Hölder Gradient Descent and Adaptive Regularization Methods in Banach Spaces for First-Order Points
This paper considers optimization of smooth nonconvex functionals in smooth infinite-dimensional spaces. A Hölder gradient descent algorithm is first proposed for finding approximate first-order points of regularized polynomial functionals…
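For orientation, the standard Hölder gradient step in the finite-dimensional Euclidean case (a sketch only, not the paper's infinite-dimensional algorithm) reads as follows: if $\|\nabla f(x)-\nabla f(y)\| \le L\,\|x-y\|^{\beta}$ with $\beta\in(0,1]$, minimizing the Hölder descent-lemma bound over the step length gives

$$x_{k+1} = x_k - \left(\frac{\|\nabla f(x_k)\|^{\,1-\beta}}{L}\right)^{1/\beta} \nabla f(x_k),$$

which recovers the usual $1/L$ gradient step when $\beta = 1$.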