Andrea Cristofari
Probabilistic iterative hard thresholding for sparse learning
For statistical modeling wherein the data regime is unfavorable in terms of dimensionality relative to the sample size, finding hidden sparsity in the relationship structure between variables can be critical in formulating an accurate stat…
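The abstract above is truncated, but its starting point, classical iterative hard thresholding (IHT), is standard: a gradient step followed by projection onto the set of k-sparse vectors. A minimal sketch for a least-squares objective (the setup and step size here are illustrative textbook choices, not the paper's probabilistic variant):

```python
import numpy as np

def hard_threshold(x, k):
    """Keep the k largest-magnitude entries of x, zero out the rest."""
    z = np.zeros_like(x)
    idx = np.argsort(np.abs(x))[-k:]
    z[idx] = x[idx]
    return z

def iht_least_squares(A, b, k, step=None, iters=200):
    """Classical IHT for min ||Ax - b||^2 subject to ||x||_0 <= k."""
    if step is None:
        step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1/L for the least-squares gradient
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)
        x = hard_threshold(x - step * grad, k)
    return x
```

Each iterate is exactly k-sparse by construction, which is what makes the method attractive when a sparse ground truth is assumed.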
Full Convergence of Regularized Methods for Unconstrained Optimization
Typically, the sequence of points generated by an optimization algorithm may have multiple limit points. Under convexity assumptions, however, (sub)gradient methods are known to generate a convergent sequence of points. In this paper, we e…
Worst-Case Complexity of High-Order Algorithms for Pareto-Front Reconstruction
In this paper, we are concerned with a worst-case complexity analysis of a-posteriori algorithms for unconstrained multiobjective optimization. Specifically, we propose an algorithmic framework that generates sets of points by means of $p$…
Block cubic Newton with greedy selection
A second-order block coordinate descent method is proposed for the unconstrained minimization of an objective function with a Lipschitz continuous Hessian. At each iteration, a block of variables is selected by means of a greedy (Gauss-Sou…
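The Gauss-Southwell (greedy) selection rule named in the abstract can be illustrated with a simpler first-order analogue. The sketch below is plain greedy coordinate descent with a diagonal Newton step, not the paper's cubic-regularized block method; `grad` and `hess_diag` are hypothetical callables used only for illustration:

```python
import numpy as np

def greedy_coordinate_descent(grad, hess_diag, x0, iters=50):
    """Gauss-Southwell rule: at each iteration update only the coordinate
    whose gradient entry is largest in magnitude, using a diagonal
    Newton step g_i / H_ii."""
    x = np.array(x0, dtype=float)
    for _ in range(iters):
        g = grad(x)
        i = int(np.argmax(np.abs(g)))     # greedy selection
        x[i] -= g[i] / hess_diag(x)[i]    # one-coordinate Newton step
    return x
```

On a separable quadratic this solves one coordinate exactly per iteration, which is the intuition behind greedy selection: spend each update where the model predicts the largest decrease.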
On Necessary Optimality Conditions for Sets of Points in Multiobjective Optimization
Complexity results and active-set identification of a derivative-free method for bound-constrained problems
In this paper, we analyze a derivative-free line search method designed for bound-constrained problems. Our analysis demonstrates that this method exhibits a worst-case complexity comparable to other derivative-free methods for unconstrain…
Learning the Right Layers: a Data-Driven Layer-Aggregation Strategy for Semi-Supervised Learning on Multilayer Graphs
Clustering (or community detection) on multilayer graphs poses several additional complications with respect to standard graphs as different layers may be characterized by different structures and types of information. One of the major cha…
Machine Learning-Based Classification to Disentangle EEG Responses to TMS and Auditory Input
The combination of transcranial magnetic stimulation (TMS) and electroencephalography (EEG) offers an unparalleled opportunity to study cortical physiology by characterizing brain electrical responses to external perturbation, called trans…
Laplacian-based Semi-Supervised Learning in Multilayer Hypergraphs by Coordinate Descent
Graph semi-supervised learning is an important data analysis tool: given a graph and a set of labeled nodes, the aim is to infer the labels of the remaining unlabeled nodes. In this paper, we start by considering an optimization-base…
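For context, the classical single-graph version of this task can be sketched with textbook label propagation: repeatedly diffuse label scores along edges while re-clamping the labeled nodes. This is the standard diffusion scheme, not the paper's multilayer-hypergraph coordinate-descent method:

```python
import numpy as np

def label_propagation(A, labels, n_iter=200):
    """Semi-supervised labeling on a graph with adjacency matrix A.
    labels[i] is a class index for labeled nodes, -1 for unlabeled.
    Iterates F <- D^{-1} A F, clamping labeled rows each sweep."""
    classes = np.unique(labels[labels >= 0])
    F = np.zeros((A.shape[0], len(classes)))
    for j, c in enumerate(classes):
        F[labels == c, j] = 1.0
    d_inv = 1.0 / np.maximum(A.sum(axis=1), 1e-12)
    for _ in range(n_iter):
        F = d_inv[:, None] * (A @ F)       # diffuse scores to neighbors
        for j, c in enumerate(classes):
            F[labels == c] = 0.0
            F[labels == c, j] = 1.0        # re-clamp known labels
    return classes[np.argmax(F, axis=1)]
```

On a graph made of two triangles joined by a single bridge edge, labeling one node per triangle is enough for the diffusion to assign each triangle to its seed's class.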
Minimization over the $\ell_1$-ball using an active-set non-monotone projected gradient
The $\ell_1$-ball is a nicely structured feasible set that is widely used in many fields (e.g., machine learning, statistics and signal analysis) to enforce some sparsity in the model solutions. In this paper, we devise an active-set …
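The key primitive behind any projected-gradient method on this set is the Euclidean projection onto the $\ell_1$-ball. A sketch of the well-known sort-based O(n log n) scheme (a standard building block, independent of the paper's active-set strategy):

```python
import numpy as np

def project_l1_ball(v, radius=1.0):
    """Euclidean projection of v onto {x : ||x||_1 <= radius}.
    Sorts |v|, finds the soft-thresholding level theta, and shrinks."""
    if np.abs(v).sum() <= radius:
        return v.copy()                     # already feasible
    u = np.sort(np.abs(v))[::-1]            # sorted magnitudes, descending
    css = np.cumsum(u)
    rho = np.nonzero(u * np.arange(1, len(v) + 1) > css - radius)[0][-1]
    theta = (css[rho] - radius) / (rho + 1)
    return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)
```

The projection soft-thresholds the entries, so many coordinates land exactly at zero; that induced sparsity is what active-set strategies exploit.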
Active-Set Identification with Complexity Guarantees of an Almost Cyclic 2-Coordinate Descent Method with Armijo Line Search
This paper establishes finite active-set identification of an almost cyclic 2-coordinate descent method for problems with one linear coupling constraint and simple bounds. First, general active-set identification results are stated for non…
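The Armijo line search referenced in the title is the standard sufficient-decrease backtracking rule. A minimal sketch (the parameter names `gamma` and `delta` are generic textbook choices, not the paper's):

```python
import numpy as np

def armijo_line_search(f, grad_f, x, d, alpha0=1.0, gamma=1e-4, delta=0.5):
    """Backtracking Armijo rule: return the largest alpha = alpha0 * delta^j
    satisfying f(x + alpha d) <= f(x) + gamma * alpha * grad_f(x)^T d."""
    fx = f(x)
    slope = grad_f(x).dot(d)
    assert slope < 0, "d must be a descent direction"
    alpha = alpha0
    while f(x + alpha * d) > fx + gamma * alpha * slope:
        alpha *= delta
    return alpha
```

For a well-scaled step the full trial step alpha0 is accepted immediately; backtracking only kicks in when the quadratic growth of f along d outweighs the predicted linear decrease.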
A decomposition method for lasso problems with zero-sum constraint
In this paper, we consider lasso problems with zero-sum constraint, commonly required for the analysis of compositional data in high-dimensional spaces. A novel algorithm is proposed to solve these problems, combining a tailored active-set…
Louvain-like Methods for Community Detection in Multi-Layer Networks
In many complex systems, entities interact with each other through complicated patterns that embed different relationships, thus generating networks with multiple levels and/or multiple types of edges. When trying to improve our understand…
A Variance-aware Multiobjective Louvain-like Method for Community Detection in Multiplex Networks
In this paper, we focus on the community detection problem in multiplex networks, i.e., networks with multiple layers having the same node set and no inter-layer connections. In particular, we look for groups of nodes that can be recognized a…
An augmented Lagrangian method exploiting an active-set strategy and second-order information
In this paper, we consider nonlinear optimization problems with nonlinear equality constraints and bound constraints on the variables. For the solution of such problems, many augmented Lagrangian methods have been defined in the literature…
A Derivative-Free Method for Structured Optimization Problems
Structured optimization problems are ubiquitous in fields like data science and engineering. The goal in structured optimization is to use a prescribed set of points, called atoms, to build up a solution that minimizes or maximizes a given …
An almost cyclic 2-coordinate descent method for singly linearly constrained problems
Data and performance of an active-set truncated Newton method with non-monotone line search for bound-constrained optimization
In this data article, we report data and experiments related to the research article entitled "A Two-Stage Active-Set Algorithm for Bound-Constrained Optimization", by Cristofari et al. (2017). The method proposed in Cristofari et al. (201…
On local non-global minimizers of quadratic functions with cubic regularization
In this paper, we analyze some theoretical properties of the problem of minimizing a quadratic function with a cubic regularization term, arising in many methods for unconstrained and constrained optimization that have been proposed in the…
New Active-Set Frank-Wolfe Variants for Minimization over the Simplex and the $\ell_1$-Ball
In this paper, we describe a new active-set algorithmic framework for minimizing a function over the simplex. The method is quite general and encompasses different active-set Frank-Wolfe variants. In particular, we analyze convergence (whe…
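For reference, the basic Frank-Wolfe iteration over the unit simplex, which these active-set variants build on, can be sketched as follows. The quadratic objective in the test and the classical step size 2/(t+2) are textbook defaults, not the paper's variants:

```python
import numpy as np

def frank_wolfe_simplex(grad, x0, iters=2000):
    """Classical Frank-Wolfe on the unit simplex. The linear minimization
    oracle over the simplex is simply the vertex e_i whose gradient entry
    is smallest; the step size is the standard gamma = 2/(t+2)."""
    x = np.array(x0, dtype=float)
    for t in range(iters):
        g = grad(x)
        s = np.zeros_like(x)
        s[int(np.argmin(g))] = 1.0          # LMO: best simplex vertex
        gamma = 2.0 / (t + 2.0)
        x = (1.0 - gamma) * x + gamma * s   # convex combination stays feasible
    return x
```

Because every iterate is a convex combination of simplex vertices, feasibility is maintained without any projection; this is the cheap-oracle property that Frank-Wolfe variants exploit.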
Large-scale optimization: new active-set methods and application in unsupervised learning
In this thesis, new methods for large-scale non-linear optimization are presented. In particular, an active-set algorithm for bound-constrained optimization is first proposed, characterized by the use of a suitable active-set estimate that…