Oscar Leong
Optimal Regularization Under Uncertainty: Distributional Robustness and Convexity Constraints
Regularization is a central tool for addressing ill-posedness in inverse problems and statistical estimation, with the choice of a suitable penalty often determining the reliability and interpretability of downstream solutions. While recen…
Learning Regularization Functionals for Inverse Problems: A Comparative Study
In recent years, a variety of learned regularization frameworks for solving inverse problems in imaging have emerged. These offer flexible modeling together with mathematical insights. The proposed methods differ in their architectural des…
A Recovery Theory for Diffusion Priors: Deterministic Analysis of the Implicit Prior Algorithm
Recovering high-dimensional signals from corrupted measurements is a central challenge in inverse problems. Recent advances in generative diffusion models have shown remarkable empirical success in providing strong data-driven priors, but …
Restoration Score Distillation: From Corrupted Diffusion Pretraining to One-Step High-Quality Generation
Learning generative models from corrupted data is a fundamental yet persistently challenging task across scientific disciplines, particularly when access to clean data is limited or expensive. Denoising Score Distillation (DSD) \cite{chen2…
Denoising Score Distillation: From Noisy Diffusion Pretraining to One-Step High-Quality Generation
Diffusion models have achieved remarkable success in generating high-resolution, realistic images across diverse natural distributions. However, their performance heavily relies on high-quality training data, making it challenging to learn…
Learning Difference-of-Convex Regularizers for Inverse Problems: A Flexible Framework with Theoretical Guarantees
Learning effective regularization is crucial for solving ill-posed inverse problems, which arise in a wide range of scientific and engineering applications. While data-driven methods that parameterize regularizers using deep neural network…
The Star Geometry of Critic-Based Regularizer Learning
Variational regularization is a classical technique to solve statistical inference tasks and inverse problems, with modern data-driven approaches parameterizing regularizers via deep neural networks showcasing impressive empirical performa…
Flow Priors for Linear Inverse Problems via Iterative Corrupted Trajectory Matching
Generative models based on flow matching have attracted significant attention for their simplicity and superior performance in high-resolution image synthesis. By leveraging the instantaneous change-of-variables formula, one can directly c…
Score-based Diffusion Models for Photoacoustic Tomography Image Reconstruction
Photoacoustic tomography (PAT) is a rapidly-evolving medical imaging modality that combines optical absorption contrast with ultrasound imaging depth. One challenge in PAT is image reconstruction with inadequate acoustic signals due to …
Discovering Structure From Corruption for Unsupervised Image Reconstruction
We consider solving ill-posed imaging inverse problems without access to an image prior or ground-truth examples. An overarching challenge in these inverse problems is that an infinite number of images, including many that are implausible,…
Image Reconstruction without Explicit Priors
We consider solving ill-posed imaging inverse problems without access to an explicit image prior or ground-truth examples. An overarching challenge in inverse problems is that there are many undesired images that fit to the observed measur…
Optimal Regularization for a Data Source
In optimization-based approaches to inverse problems and to statistical estimation, it is common to augment criteria that enforce data fidelity with a regularizer that promotes desired structural properties in the solution. The choice of a…
Alternating Phase Langevin Sampling with Implicit Denoiser Priors for Phase Retrieval
Phase retrieval is the nonlinear inverse problem of recovering a true signal from its Fourier magnitude measurements. It arises in many applications, such as astronomical imaging, X-ray crystallography, and microscopy. The problem is …
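The measurement model in the abstract above can be illustrated with a minimal NumPy sketch (hypothetical data, not the paper's algorithm): only the magnitudes of the signal's Fourier transform are observed, so a global sign flip of the signal produces identical measurements, which is one source of the problem's nonlinearity and ill-posedness.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 64
x = rng.standard_normal(n)  # hypothetical true signal

# Forward model: observe only the Fourier magnitudes |F x|
y = np.abs(np.fft.fft(x))

# A global sign flip (a phase ambiguity) yields the same measurements
y_flipped = np.abs(np.fft.fft(-x))
assert np.allclose(y, y_flipped)
```

Any global phase shift `exp(i*theta) * x` leaves `y` unchanged in the same way, so recovery is only possible up to such ambiguities.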
Optimal Sample Complexity of Gradient Descent for Amplitude Flow via Non-Lipschitz Matrix Concentration
We consider the problem of recovering a real-valued $n$-dimensional signal from $m$ phaseless, linear measurements and analyze the amplitude-based non-smooth least squares objective. We establish local convergence of gradient descent with …
Optimal Sample Complexity of Subgradient Descent for Amplitude Flow via Non-Lipschitz Matrix Concentration
We consider the problem of recovering a real-valued $n$-dimensional signal from $m$ phaseless, linear measurements and analyze the amplitude-based non-smooth least squares objective. We establish local convergence of subgradient descent wi…
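The amplitude-based non-smooth least squares objective described in the two entries above can be sketched as follows (a minimal NumPy illustration with hypothetical Gaussian data, not the papers' analysis): minimize the squared mismatch between `|a_i^T z|` and the observed amplitudes, taking subgradient steps since the absolute value makes the objective non-smooth.

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 20, 200                   # signal dimension, number of measurements
A = rng.standard_normal((m, n))  # hypothetical Gaussian sensing vectors as rows
x = rng.standard_normal(n)
y = np.abs(A @ x)                # phaseless measurements |a_i^T x|

def amplitude_loss(z):
    # Non-smooth least squares on the amplitudes
    return 0.5 / m * np.sum((np.abs(A @ z) - y) ** 2)

def subgradient(z):
    # sign(r) is a valid subgradient choice for |r|, including at r = 0
    r = A @ z
    return A.T @ (np.sign(r) * (np.abs(r) - y)) / m

# Subgradient descent from a point near x (the local convergence regime)
z0 = x + 0.1 * rng.standard_normal(n)
z = z0.copy()
for _ in range(200):
    z = z - 0.2 * subgradient(z)

assert amplitude_loss(z) < amplitude_loss(z0)
```

The step size and initialization radius here are illustrative choices; the papers' results concern how many measurements `m` suffice for such local convergence guarantees.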
Compressive Phase Retrieval: Optimal Sample Complexity with Deep Generative Priors
Advances in compressive sensing have provided reconstruction algorithms for sparse signals from linear measurements with optimal sample complexity, but natural extensions of this methodology to nonlinear inverse problems have been met with poten…
Low Shot Learning with Untrained Neural Networks for Imaging Inverse Problems
Employing deep neural networks as natural image priors to solve inverse problems either requires large amounts of data to sufficiently train expressive generative models or can succeed with no data via untrained neural networks. However, v…
Invertible generative models for inverse problems: mitigating representation error and dataset bias
Trained generative models have shown remarkable performance as priors for inverse problems in imaging -- for example, Generative Adversarial Network priors permit recovery of test images from 5-10x fewer measurements than sparsity priors. …
Phase Retrieval Under a Generative Prior
The phase retrieval problem seeks to recover a natural signal $y_0 \in \mathbb{R}^n$ from $m$ quadratic observations, where $m$ is to be minimized. As is common in many imaging problems, natural signals are considered sparse with respect to…
Proving Tucker's Lemma with a Volume Argument
Sperner's lemma is a statement about labeled triangulations of a simplex. McLennan and Tourky (2007) provided a novel proof of Sperner's Lemma by examining volumes of simplices in a triangulation under time-linear simplex-linear deformatio…