Clarice Poon
On the non-convexity issue in the radial Calderón problem
A classical approach to the Calderón problem is to estimate the unknown conductivity by solving a nonlinear least-squares problem. It leads to a nonconvex optimization problem which is generally believed to be riddled with bad local minimu…
Learning from Samples: Inverse Problems over measures via Sharpened Fenchel-Young Losses
Estimating parameters from samples of an optimal probability distribution is essential in applications ranging from socio-economic modeling to biological system analysis. In these settings, the probability distribution arises as the soluti…
Hadamard Langevin dynamics for sampling the l1-prior
Priors with non-smooth log-densities, such as the l1-prior, are widely used in Bayesian inverse problems for their sparsity-inducing properties. Existing Langevin-based sampling methods typically rely on proximal mappings or smooth approxi…
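For context, the proximal/smoothing baselines this abstract contrasts with can be sketched in a few lines: an unadjusted Langevin sampler where the non-smooth l1 potential is replaced by its Moreau envelope, whose gradient is computed from the soft-thresholding proximal map (this is the MYULA-style baseline, not the paper's Hadamard method; all problem sizes and parameters below are illustrative).

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy posterior: p(x) ∝ exp(-0.5*||A x - y||^2 - lam*||x||_1)
A = rng.standard_normal((20, 5))
x_true = np.array([1.5, 0.0, -2.0, 0.0, 0.0])
y = A @ x_true + 0.1 * rng.standard_normal(20)
lam, gamma, step, n_iter = 1.0, 0.1, 1e-3, 20000

def soft(v, t):
    # proximal map of t*||.||_1 (soft-thresholding)
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

x = np.zeros(5)
samples = []
for k in range(n_iter):
    grad_lik = A.T @ (A @ x - y)
    # gradient of the Moreau envelope of lam*||.||_1 (smooth surrogate prior)
    grad_prior = (x - soft(x, gamma * lam)) / gamma
    # unadjusted Langevin step
    x = x - step * (grad_lik + grad_prior) + np.sqrt(2 * step) * rng.standard_normal(5)
    if k > n_iter // 2:          # discard burn-in
        samples.append(x.copy())

post_mean = np.mean(samples, axis=0)
```

The smoothing parameter `gamma` trades off bias (large `gamma`) against stiffness of the dynamics (small `gamma`); removing this bias without proximal maps is exactly the kind of issue the paper addresses.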
Super-resolved Lasso
Super-resolution of pointwise sources is of utmost importance in various areas of imaging sciences. Specific instances of this problem arise in single molecule fluorescence, spike sorting in neuroscience, astrophysical imaging, radar imagi…
Sparsistency for Inverse Optimal Transport
Optimal Transport is a useful metric to compare probability distributions and to compute a pairing given a ground cost. Its entropic regularization variant (eOT) is crucial to have fast algorithms and reflect fuzzy/noisy matchings. This wo…
Compressed online Sinkhorn
The use of optimal transport (OT) distances, and in particular entropic-regularised OT distances, is an increasingly popular evaluation metric in many areas of machine learning and data science. Their use has largely been driven by the ava…
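The entropic-regularised OT solver this work builds on is the Sinkhorn algorithm; a minimal dense (offline, uncompressed) version on a toy one-dimensional problem looks as follows — the measures, cost, and regularisation strength are all illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy discrete measures on the line with a squared-distance ground cost
xs, ys = rng.uniform(0, 1, 6), rng.uniform(0, 1, 7)
a, b = np.full(6, 1 / 6), np.full(7, 1 / 7)   # marginal weights
C = (xs[:, None] - ys[None, :]) ** 2
eps = 0.5                                      # entropic regularisation

K = np.exp(-C / eps)                           # Gibbs kernel
u, v = np.ones(6), np.ones(7)
for _ in range(200):                           # alternate marginal scalings
    u = a / (K @ v)
    v = b / (K.T @ u)

P = u[:, None] * K * v[None, :]                # entropic transport plan
ot_cost = np.sum(P * C)
```

Each iteration only rescales rows and columns of `K` so that the plan's marginals match `a` and `b`; the online and compressed variants studied in the paper replace the exact matrix-vector products with cheaper stochastic or low-rank surrogates.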
Variable Screening for Sparse Online Regression
Sparsity-promoting regularizers are widely used to impose low-complexity structure (e.g., l1-norm for sparsity) on the regression coefficients of supervised learning. In the realm of deterministic optimization, the sequence generated by it…
An off-the-grid approach to multi-compartment magnetic resonance fingerprinting
We propose a novel numerical approach to separate multiple tissue compartments in image voxels and to estimate quantitatively their nuclear magnetic resonance (NMR) properties and mixture fractions, given magnetic resonance fingerprinting …
Smooth over-parameterized solvers for non-smooth structured optimization
Non-smooth optimization is a core ingredient of many imaging or machine learning pipelines. Non-smoothness encodes structural constraints on the solutions, such as sparsity, group sparsity, low-rank and sharp edges. It is also the basis fo…
Smooth Bilevel Programming for Sparse Regularization
Iteratively reweighted least squares (IRLS) is a popular approach to solve sparsity-enforcing regression problems in machine learning. State-of-the-art approaches are more efficient but typically rely on specific coordinate pruning schemes.…
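For reference, the classical IRLS scheme the abstract starts from, applied to the Lasso: each iteration majorises lam*|x_i| by a quadratic weighted by the previous iterate and solves the resulting linear system (the data and all parameters below are synthetic, for illustration only):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((30, 8))
x_true = np.zeros(8)
x_true[[1, 4]] = [2.0, -1.5]
y = A @ x_true + 0.05 * rng.standard_normal(30)
lam, delta = 0.5, 1e-8     # delta guards the weights against division by zero

# IRLS for min_x 0.5*||A x - y||^2 + lam*||x||_1:
# replace lam*|x_i| by lam*x_i^2 / (2*|x_i^k|) and solve the weighted system
x = np.ones(8)
for _ in range(200):
    W = np.diag(lam / (np.abs(x) + delta))
    x = np.linalg.solve(A.T @ A + W, A.T @ y)
```

The weights blow up on coordinates approaching zero, which is what forces sparsity; the bilevel reformulation in the paper is a way to get this behaviour from a smooth, unconstrained problem instead.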
Screening for Sparse Online Learning
Sparsity-promoting regularizers are widely used to impose low-complexity structure (e.g. l1-norm for sparsity) on the regression coefficients of supervised learning. In the realm of deterministic optimization, the sequence generated by ite…
On instabilities of deep learning in image reconstruction and the potential costs of AI
Deep learning, due to its unprecedented success in tasks such as image classification, has emerged as a new tool in image reconstruction with potential to change the field. In this paper, we demonstrate a crucial phenomenon: Deep learning …
Geometry of First-Order Methods and Adaptive Acceleration
First-order operator splitting methods are ubiquitous across many fields of science and engineering, such as inverse problems, signal/image processing, statistics, data science and machine learning, to name a few. In this paper, we stu…
Degrees of freedom for off-the-grid sparse estimation
A central question in modern machine learning and imaging sciences is to quantify the number of effective parameters of vastly over-parameterized models. The degrees of freedom is a mathematically convenient way to define this number of pa…
Trajectory of Alternating Direction Method of Multipliers and Adaptive Acceleration
The alternating direction method of multipliers (ADMM) is one of the most widely used first-order optimisation methods in the literature owing to its simplicity, flexibility and efficiency. Over the years, numerous efforts have been made to impr…
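A minimal instance of the ADMM iteration discussed above, applied to the Lasso as a concrete example (splitting the quadratic and l1 terms via an auxiliary variable `z`); the problem data and parameters are synthetic:

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((40, 10))
x_true = np.zeros(10)
x_true[[0, 3, 7]] = [1.0, -2.0, 0.5]
y = A @ x_true + 0.05 * rng.standard_normal(40)
lam, rho = 0.2, 1.0

def soft(v, t):
    # proximal map of t*||.||_1
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

# ADMM for min 0.5*||A x - y||^2 + lam*||z||_1  s.t.  x = z
M = A.T @ A + rho * np.eye(10)          # x-update system, fixed across iterations
x = z = u = np.zeros(10)
for _ in range(300):
    x = np.linalg.solve(M, A.T @ y + rho * (z - u))   # quadratic step
    z = soft(x + u, lam / rho)                         # l1 proximal step
    u = u + x - z                                      # scaled dual update on x = z
```

The trajectory of the iterates `(x, z, u)` eventually settles into a regular (often spiralling or linear) regime, which is exactly the structure that trajectory-based adaptive acceleration schemes exploit.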
MultiDimensional Sparse Super-Resolution
This paper studies sparse super-resolution in arbitrary dimensions. More precisely, it develops a theoretical analysis of support recovery for the so-called BLASSO method, which is an off-the-grid generalisation of l1 regularization (also …
Support Localization and the Fisher Metric for off-the-grid Sparse Regularization
Sparse regularization is a central technique for both machine learning (to achieve supervised features selection or unsupervised mixture learning) and imaging sciences (to achieve super-resolution). Existing performance guarantees assume a…
On the total variation Wasserstein gradient flow and the TV-JKO scheme
We study the JKO scheme for the total variation, characterize the optimizers, prove some of their qualitative properties (in particular a form of maximum principle and in some cases, a minimum principle as well). Finally, we establish a co…
The geometry of off-the-grid compressed sensing
This paper presents a sharp geometric analysis of the recovery performance of sparse regularization. More specifically, we analyze the BLASSO method which estimates a sparse measure (sum of Dirac masses) from randomized sub-sampled measure…
A Dual Certificates Analysis of Compressive Off-the-Grid Recovery
Many problems in machine learning and imaging can be framed as an infinite dimensional Lasso problem to estimate a sparse measure. This includes for instance regression using a continuously parameterized dictionary, mixture model estimatio…
Local Convergence Properties of SAGA/Prox-SVRG and Acceleration
Over the past ten years, driven by large scale optimisation problems arising from machine learning, the development of stochastic optimisation methods has witnessed tremendous growth. However, despite their popularity, the theoretical u…
Sampling the Fourier Transform Along Radial Lines
This article considers the use of total variation minimization for the recovery of a superposition of point sources from samples of its Fourier transform along radial lines. We present a numerical algorithm for the computation of solutions…
Breaking the Coherence Barrier: A New Theory for Compressed Sensing
This paper presents a framework for compressed sensing that bridges a gap between existing theory and the current use of compressed sensing in many real-world applications. In doing so, it also introduces a new sampling method that yields …
Geometric properties of solutions to the total variation denoising problem
This article studies the denoising performance of total variation (TV) image regularization. More precisely, we study geometrical properties of the solution to the so-called Rudin-Osher-Fatemi total variation denoising method. The first co…
Total Variation Denoising and Support Localization of the Gradient
This paper describes the geometrical properties of the solutions to the total variation denoising method. A folklore statement is that this method is able to restore sharp edges, but at the same time, might introduce some staircasing (i.e.…
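The edge-preserving and staircasing behaviour described above is easy to observe numerically. A minimal sketch of 1D total variation (ROF) denoising, solved by projected gradient on the dual problem (signal, noise level, and `lam` are illustrative choices):

```python
import numpy as np

# Piecewise-constant signal plus noise
rng = np.random.default_rng(4)
y = np.concatenate([np.zeros(30), 2 * np.ones(30), 0.5 * np.ones(30)])
y = y + 0.1 * rng.standard_normal(90)
lam, step = 1.0, 0.25        # step <= 1/||D D^T|| = 1/4 for forward differences

def D(x):                    # forward differences
    return x[1:] - x[:-1]

def Dt(z):                   # adjoint of D (negative discrete divergence)
    return np.concatenate([[-z[0]], z[:-1] - z[1:], [z[-1]]])

# Projected gradient on the dual of the ROF problem:
#   min_{|z_i| <= lam} 0.5 * ||y - Dt(z)||^2,   with   x = y - Dt(z)
z = np.zeros(89)
for _ in range(2000):
    z = np.clip(z - step * D(Dt(z) - y), -lam, lam)
x = y - Dt(z)
```

The recovered `x` is piecewise constant: noise inside each plateau is flattened (the staircasing mechanism) while the two large jumps survive, slightly shrunk, illustrating the support localization of the gradient that the paper quantifies.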
Spectral study of the Laplace–Beltrami operator arising in the problem of acoustic wave scattering by a quarter-plane
The Laplace-Beltrami operator (LBO) on a sphere with a cut arises when considering the problem of wave scattering by a quarter-plane. Recent methods developed for sound-soft (Dirichlet) and sound-hard (Neumann) quarter-planes rely on an a …
On Cartesian line sampling with anisotropic total variation regularization
This paper considers the use of the anisotropic total variation seminorm to recover a two-dimensional vector $x\in \mathbb{C}^{N\times N}$ from its partial Fourier coefficients, sampled along Cartesian lines. We prove that if $(x_{k,j} - x…