Spyridon Pougkakiotis
Data-Driven Two-Stage IRS-Aided Sumrate Maximization with Inexact Precoding
We propose iZoSGA, a data-driven learning algorithm for joint passive long-term intelligent reflective surface (IRS)-aided beamforming and active short-term precoding in wireless networks. iZoSGA is based on a zeroth-order stochastic quasi…
An Efficient Active-Set Method With Applications to Sparse Approximations and Risk Minimization
In this paper we present an efficient active-set method for the solution of convex quadratic programming problems with general piecewise-linear terms in the objective, with applications to sparse approximations and risk-minimization. The a…
Data-Driven Learning of Two-Stage Beamformers in Passive IRS-Assisted Systems With Inexact Oracles
The purpose of this work is the development of an efficient data-driven and model-free unsupervised learning framework for achieving fully passive intelligent reflective surface (IRS)-assisted optimal joint short/long-term beamforming in w…
Model-Free Learning of Two-Stage Beamformers for Passive IRS-Aided Network Design
Electronically tunable metasurfaces, or Intelligent Reflecting Surfaces (IRSs), are a popular technology for achieving high spectral efficiency in modern wireless systems by shaping channels using a multitude of tunable passive reflecting …
Strong Duality Relations in Nonconvex Risk-Constrained Learning
We establish strong duality relations for functional two-step compositional risk-constrained learning problems with multiple nonconvex loss functions and/or learning constraints, regardless of nonconvexity and under a minimal set of techni…
A Zeroth-Order Proximal Stochastic Gradient Method for Weakly Convex Stochastic Optimization
In this paper we analyze a zeroth-order proximal stochastic gradient method suitable for the minimization of weakly convex stochastic optimization problems. We consider nonsmooth and nonlinear stochastic composite problems, for which (sub)…
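The update this paper analyzes can be sketched in a few lines: a two-point zeroth-order (function-value-only) gradient estimate followed by a proximal step. The test problem, step sizes, and Gaussian smoothing below are our illustrative choices, not the paper's setup.

```python
import numpy as np

def soft_threshold(z, tau):
    # proximal operator of tau * ||.||_1 (shrinkage)
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

def zo_prox_sg(f, x0, lam, step=5e-3, mu=1e-5, iters=4000, seed=0):
    """Zeroth-order proximal stochastic gradient sketch for
    min_x f(x) + lam * ||x||_1, querying only values of f."""
    rng = np.random.default_rng(seed)
    x = x0.copy()
    for _ in range(iters):
        u = rng.standard_normal(x.size)               # Gaussian smoothing direction
        g = (f(x + mu * u) - f(x)) / mu * u           # two-point gradient estimate
        x = soft_threshold(x - step * g, step * lam)  # proximal step
    return x

# usage: sparse least squares where the smooth part is treated as a black box
rng = np.random.default_rng(1)
A = rng.standard_normal((20, 5)) / np.sqrt(20)
x_true = np.array([1.0, 0.0, 0.0, -2.0, 0.0])
b = A @ x_true
f = lambda x: 0.5 * np.sum((A @ x - b) ** 2)
x_hat = zo_prox_sg(f, np.zeros(5), lam=0.05)
```

The step size must be conservative here: the variance of the two-point estimator grows with the dimension, which is exactly the regime the paper's analysis addresses.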
Model-Free Learning of Optimal Beamformers for Passive IRS-Assisted Sumrate Maximization
Although Intelligent Reflective Surfaces (IRSs) are a cost-effective technology promising high spectral efficiency in future wireless networks, obtaining optimal IRS beamformers is a challenging problem with several practical limitations. …
An active-set method for sparse approximations. Part II: General piecewise-linear terms
In this paper we present an efficient active-set method for the solution of convex quadratic programming problems with general piecewise-linear terms in the objective, with applications to sparse approximations and risk-minimization. The m…
General-purpose preconditioning for regularized interior point methods
In this paper we present general-purpose preconditioners for regularized augmented systems, and their corresponding normal equations, arising from optimization problems. We discuss positive definite preconditioners, suitable for CG and MIN…
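The setting can be illustrated concretely: regularized interior point methods produce symmetric indefinite augmented (saddle-point) systems, which a positive definite preconditioner makes amenable to MINRES. The block sizes, regularization values, and the simple block-diagonal preconditioner below are our own illustrative choices, not the preconditioners proposed in the paper.

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import minres

rng = np.random.default_rng(0)
n, m = 60, 25
rho, delta = 1e-2, 1e-2                     # primal/dual regularization

B = sp.random(n, n, density=0.1, random_state=rng)
Q = (B.T @ B + sp.identity(n)).tocsc()      # SPD Hessian block
A = sp.random(m, n, density=0.2, random_state=rng).tocsc()

# regularized augmented system: symmetric but indefinite
K = sp.bmat([[Q + rho * sp.identity(n), A.T],
             [A, -delta * sp.identity(m)]], format="csc")
b = rng.standard_normal(n + m)

# positive definite block-diagonal preconditioner (inverse applied as M):
# diag(Q) + rho on the (1,1) block, delta + diag(A D^{-1} A^T) on the (2,2) block
d1 = Q.diagonal() + rho
d2 = delta + np.asarray((A @ sp.diags(1.0 / d1) @ A.T).diagonal())
M = sp.diags(np.concatenate([1.0 / d1, 1.0 / d2]))

x, info = minres(K, b, M=M)                 # info == 0 signals convergence
```

MINRES requires a symmetric positive definite preconditioner, which is why the sketch flips the sign of the (2,2) block's contribution; CG would additionally require K itself to be definite.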
Sparse Approximations with Interior Point Methods
Large-scale optimization problems that seek sparse solutions have become ubiquitous. They are routinely solved with various specialized first-order methods. Although such methods are often fast, they usually struggle with not-so-well-condi…
Strong Duality in Risk-Constrained Nonconvex Functional Programming
We show that a wide class of risk-constrained nonconvex functional optimization problems exhibit strong duality, regardless of nonconvexity. We develop two novel results under distinct sets of assumptions, establishing strong duality over …
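In generic notation (ours, not the paper's), the strong duality claim has the following shape:

```latex
% Primal risk-constrained functional program:
P^\star \;=\; \inf_{f \in \mathcal{F}} \; \ell_0(f)
\quad \text{s.t.} \quad \ell_i(f) \le c_i, \quad i = 1, \dots, k,
% Lagrangian dual:
D^\star \;=\; \sup_{\lambda \ge 0} \; \inf_{f \in \mathcal{F}}
\Big( \ell_0(f) + \sum_{i=1}^{k} \lambda_i \big( \ell_i(f) - c_i \big) \Big).
% Strong duality: zero duality gap, P^\star = D^\star,
% even when the losses \ell_i are nonconvex.
```

Weak duality, \(D^\star \le P^\star\), always holds; the content of such results is the reverse inequality despite nonconvexity.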
An active-set method for sparse approximations. Part I: Separable $\ell_1$ terms
In this paper we present an active-set method for the solution of $\ell_1$-regularized convex quadratic optimization problems. It is derived by combining a proximal method of multipliers (PMM) strategy with a standard semismooth Newton met…
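For context, the optimality condition such methods exploit can be written as a fixed-point equation in the $\ell_1$ proximal operator. The sketch below solves it with a plain proximal-gradient iteration (not the paper's PMM/semismooth-Newton scheme, and with our own toy data) and recovers the zero pattern, i.e. the active set, exactly.

```python
import numpy as np

def soft_threshold(z, tau):
    # proximal operator of tau * ||.||_1
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

def l1_qp(Q, c, tau, iters=500):
    """min 0.5 x'Qx + c'x + tau*||x||_1 via the fixed point
    x = prox_{beta*tau*||.||_1}(x - beta*(Qx + c)), beta <= 1/lambda_max(Q).
    The zero pattern of x is the set that active-set methods identify."""
    beta = 1.0 / np.linalg.norm(Q, 2)       # safe proximal-gradient step
    x = np.zeros(len(c))
    for _ in range(iters):
        x = soft_threshold(x - beta * (Q @ x + c), beta * tau)
    return x

Q = np.array([[2.0, 0.5], [0.5, 1.0]])
c = np.array([-3.0, -0.5])
x = l1_qp(Q, c, tau=0.5)    # x[1] is driven to exactly zero (inactive variable)
```

A semismooth Newton method instead solves this piecewise-linear equation with Newton steps on the active/inactive partition, which is what gives the paper's method its fast local convergence.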
An Interior Point-Proximal Method of Multipliers for Linear Positive Semi-Definite Programming
In this paper we generalize the Interior Point-Proximal Method of Multipliers (IP-PMM) presented in Pougkakiotis and Gondzio (Comput Optim Appl 78:307–351, 2021, doi:10.1007/s10589-020-00240-9) for the solution of linear positive Semi-Definit…
A new preconditioning approach for an interior point-proximal method of multipliers for linear and convex quadratic programming
In this article, we address the efficient numerical solution of linear and quadratic programming problems, often of large scale. With this aim, we devise an infeasible interior point method, blended with the proximal method of multipliers,…
Fast Solution Methods for Convex Quadratic Optimization of Fractional Differential Equations
In this paper, we present numerical methods suitable for solving convex quadratic Fractional Differential Equation (FDE) constrained optimization problems, with box constraints on the state and/or control variables. We develop an Altern…
Fast Solution Methods for Convex Fractional Differential Equation Optimization
In this paper, we present numerical methods suitable for solving convex Fractional Differential Equation (FDE) optimization problems, with potential box constraints on the state and control variables. First we derive powerful multilevel ci…
Efficient KLMS and KRLS algorithms: A random Fourier feature perspective
We present a new framework for online least-squares algorithms for nonlinear modeling in reproducing kernel Hilbert spaces (RKHS). Instead of implicitly mapping the data to an RKHS (e.g., via the kernel trick), we map the data to a finite-dimensional Euclidean space, usi…
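The core idea fits in a few lines: sample random Fourier features so that an inner product of finite-dimensional feature vectors approximates the kernel, after which ordinary linear LMS/RLS machinery applies at O(D) cost per sample. The Gaussian kernel, bandwidth, and feature count below are our illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
d, D, sigma = 3, 2000, 1.0

# spectral samples for the Gaussian kernel k(x, y) = exp(-||x - y||^2 / (2 sigma^2))
W = rng.standard_normal((D, d)) / sigma
bias = rng.uniform(0.0, 2.0 * np.pi, size=D)

def rff(x):
    # random Fourier feature map: z(x).z(y) approximates k(x, y)
    return np.sqrt(2.0 / D) * np.cos(W @ x + bias)

x, y = rng.standard_normal(d), rng.standard_normal(d)
exact = np.exp(-np.linalg.norm(x - y) ** 2 / (2.0 * sigma ** 2))
approx = rff(x) @ rff(y)
# a linear LMS or RLS filter run on rff(x) now serves as an efficient
# fixed-budget stand-in for KLMS/KRLS, whose dictionaries grow with the data
```

The approximation error decays like O(1/sqrt(D)), so the feature count D trades accuracy against per-sample cost.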