Jonas Latz
Partial differential equations in data science
The advent of artificial intelligence and machine learning has led to significant technological and scientific progress, but also to new challenges. Partial differential equations, usually used to model systems in the sciences, have shown …
Sparse Techniques for Regression in Deep Gaussian Processes
Gaussian processes (GPs) have gained popularity as flexible machine learning models for regression and function approximation with an in-built method for uncertainty quantification. However, GPs suffer when the amount of training data is l…
Deep Gaussian process priors for Bayesian image reconstruction
In image reconstruction, an accurate quantification of uncertainty is of great importance for informed decision making. Here, the Bayesian approach to inverse problems can be used: the image is represented through a random function that in…
Discrete-to-continuum limits of semilinear stochastic evolution equations in Banach spaces
We study the convergence of semilinear parabolic stochastic evolution equations, posed on a sequence of Banach spaces approximating a limiting space and driven by additive white noise projected onto the former spaces. Under appropriate uni…
How to beat a Bayesian adversary
Deep neural networks and other modern machine learning models are often susceptible to adversarial attacks. Indeed, an adversary may often be able to change a model’s prediction through a small, directed perturbation of the model’s input –…
A Learnable Prior Improves Inverse Tumor Growth Modeling
Biophysical modeling, particularly involving partial differential equations (PDEs), offers significant potential for tailoring disease treatment protocols to individual patients. However, the inverse problem-solving aspect of these models …
A parameterization of anisotropic Gaussian fields with penalized complexity priors
Gaussian random fields (GFs) are fundamental tools in spatial modeling and can be represented flexibly and efficiently as solutions to stochastic partial differential equations (SPDEs). The SPDEs depend on specific parameters, which enforc…
The random timestep Euler method and its continuous dynamics
ODE solvers with randomly sampled timestep sizes appear in the context of chaotic dynamical systems, differential equations with low regularity, and, implicitly, in stochastic optimisation. In this work, we propose and study the stochastic…
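The idea of an Euler scheme with randomly sampled timestep sizes can be illustrated in a few lines. The sketch below is not the paper's implementation; the choice of drawing steps uniformly from (0, 2·h_mean) is an illustrative assumption, and the function name is hypothetical.

```python
import numpy as np

def random_timestep_euler(f, x0, t_end, h_mean, rng=None):
    """Explicit Euler for x' = f(x) with i.i.d. random step sizes.

    Steps are drawn uniformly from (0, 2*h_mean), so their mean is h_mean.
    (Illustrative choice; other step-size distributions are possible.)
    """
    rng = np.random.default_rng() if rng is None else rng
    t, x = 0.0, np.asarray(x0, dtype=float)
    ts, xs = [t], [x.copy()]
    while t < t_end:
        h = rng.uniform(0.0, 2.0 * h_mean)
        if h >= t_end - t:          # clip the final step so we land on t_end
            h = t_end - t
            t_next = t_end
        else:
            t_next = t + h
        x = x + h * f(x)            # standard explicit Euler update
        t = t_next
        ts.append(t)
        xs.append(x.copy())
    return np.array(ts), np.array(xs)
```

For instance, applying this to x' = -x with x(0) = 1 recovers exp(-t) up to the usual first-order Euler error, despite the randomly varying steps.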
Correction to: Analysis of stochastic gradient descent in continuous time
A correction regarding [Latz 2021, Stat. Comput. 31, 39].
Data-driven approximation of Koopman operators and generators: Convergence rates and error bounds
Global information about dynamical systems can be extracted by analysing associated infinite-dimensional transfer operators, such as Perron-Frobenius and Koopman operators as well as their infinitesimal generators. In practice, these opera…
Adaptive stepsize algorithms for Langevin dynamics
We discuss the design of a time-rescaled, invariant-measure-preserving transformed dynamics for the numerical treatment of Langevin dynamics, with the goal of sampling from the invariant measure. Given an appropriate monitor fu…
Can physics-informed neural networks beat the finite element method?
Partial differential equations (PDEs) play a fundamental role in the mathematical modelling of many processes and systems in physical, biological and other sciences. To simulate such processes and systems, the solutions of PDEs often need …
Nested Sampling for Uncertainty Quantification and Rare Event Estimation
Nested Sampling is a method for computing the Bayesian evidence, also called the marginal likelihood, which is the integral of the likelihood with respect to the prior. More generally, it is a numerical probabilistic quadrature rule. The m…
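The nested sampling recursion described above can be sketched compactly. This is a toy version under simplifying assumptions (constrained prior sampling done by naive rejection, which is only viable for low-dimensional problems; the deterministic volume shrinkage X_k ≈ exp(-k/n_live) is used); function names are illustrative.

```python
import numpy as np

def nested_sampling(log_like, sample_prior, n_live, n_iter, rng=None):
    """Minimal nested sampling sketch.

    Maintains n_live points drawn from the prior; at each step the worst
    point is replaced by a fresh prior draw constrained to higher
    likelihood. The evidence Z = integral of L w.r.t. the prior is
    accumulated from the shrinking prior volume X_k = exp(-k / n_live).
    """
    rng = np.random.default_rng() if rng is None else rng
    live = [sample_prior(rng) for _ in range(n_live)]
    logL = np.array([log_like(x) for x in live])
    Z, X_prev = 0.0, 1.0
    for k in range(1, n_iter + 1):
        i = np.argmin(logL)                   # worst live point
        X = np.exp(-k / n_live)               # expected remaining prior volume
        Z += np.exp(logL[i]) * (X_prev - X)   # quadrature weight
        X_prev = X
        while True:                           # rejection: resample above L_worst
            x = sample_prior(rng)
            if log_like(x) > logL[i]:
                break
        live[i] = x
        logL[i] = log_like(x)
    Z += np.exp(logL).mean() * X_prev         # credit the remaining live points
    return Z
```

As a sanity check, with a uniform prior on [0, 1] and likelihood L(x) = exp(-x), the evidence is 1 - exp(-1) ≈ 0.632, which the sketch reproduces to within a few percent.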
Subsampling in ensemble Kalman inversion
We consider the ensemble Kalman inversion (EKI) which has been recently introduced as an efficient, gradient-free optimisation method to estimate unknown parameters in an inverse setting. In the case of large data sets, the EKI becomes com…
Subsampling Error in Stochastic Gradient Langevin Diffusions
The Stochastic Gradient Langevin Dynamics (SGLD) are popularly used to approximate Bayesian posterior distributions in statistical learning procedures with large-scale data. As opposed to many usual Markov chain Monte Carlo (MCMC) algorith…
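The SGLD update that this work analyses replaces the full-data gradient with a rescaled minibatch estimate and injects Gaussian noise of variance 2·step, as in Welling and Teh (2011). A minimal constant-stepsize sketch, with illustrative function names (not the paper's code):

```python
import numpy as np

def sgld(grad_log_post_batch, theta0, data, batch_size, step, n_iter, rng=None):
    """Stochastic Gradient Langevin Dynamics (minimal sketch).

    grad_log_post_batch(theta, batch) returns the gradient of the
    log-posterior contribution of a minibatch; it is rescaled by
    n / batch_size to give an unbiased full-data gradient estimate.
    """
    rng = np.random.default_rng() if rng is None else rng
    theta = np.asarray(theta0, dtype=float)
    n = len(data)
    samples = []
    for _ in range(n_iter):
        idx = rng.choice(n, size=batch_size, replace=False)
        g = grad_log_post_batch(theta, data[idx]) * (n / batch_size)
        # Euler-Maruyama step of the Langevin diffusion with noisy drift
        theta = theta + step * g + np.sqrt(2.0 * step) * rng.standard_normal(theta.shape)
        samples.append(theta.copy())
    return np.array(samples)
```

For a Gaussian mean model with a flat prior, the chain's long-run average recovers the posterior mean (the sample mean of the data), with the subsampling introducing the extra error that the paper quantifies.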
A Continuous-time Stochastic Gradient Descent Method for Continuous Data
Optimization problems with continuous data appear in, e.g., robust machine learning, functional data analysis, and variational inference. Here, the target function is given as an integral over a family of (continuously) indexed target func…
Gradient flows and randomised thresholding: sparse inversion and classification
Sparse inversion and classification problems are ubiquitous in modern data science and imaging. They are often formulated as non-smooth minimisation problems. In sparse inversion, we minimise, e.g., the sum of a data fidelity term and an L…
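The sparse-inversion objective mentioned above (data fidelity plus an L1 penalty) is classically minimised by iterative soft thresholding. The sketch below shows the standard deterministic ISTA baseline for that objective; the paper studies randomised thresholding along gradient flows instead, so this is context, not the authors' method.

```python
import numpy as np

def soft_threshold(v, tau):
    """Proximal map of tau * ||.||_1 (componentwise soft thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def ista(A, y, lam, n_iter=2000):
    """ISTA for min_x 0.5*||A x - y||^2 + lam*||x||_1.

    Alternates a gradient step on the smooth fidelity term with the
    soft-thresholding proximal step on the L1 penalty.
    """
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)
        x = soft_threshold(x - grad / L, lam / L)
    return x
```

On noiseless data generated from a sparse vector, and with a small regularisation weight, the iteration recovers the sparse solution up to the usual L1 shrinkage bias.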
Losing momentum in continuous-time stochastic optimisation
The training of modern machine learning models often consists in solving high-dimensional non-convex optimisation problems that are subject to large-scale data. In this context, momentum-based stochastic optimisation algorithms have become…
Joint reconstruction-segmentation on graphs
Practical image segmentation tasks concern images which must be reconstructed from noisy, distorted, and/or incomplete observations. A recent approach for solving such tasks is to perform this reconstruction jointly with the segmentation, …
Gaussian random fields on non-separable Banach spaces
We study Gaussian random fields on certain Banach spaces and investigate conditions for their existence. Our results apply inter alia to spaces of Radon measures and Hölder functions. In the former case, we are able to define Gaussian whit…
Image segmentation code
Code for graph-based image segmentation used in "Classification and image processing with a semi-discrete scheme for fidelity forced Allen–Cahn on graphs" https://doi.org/10.1002/gamm.202100004
Generalized parallel tempering on Bayesian inverse problems
In the current work we present two generalizations of the Parallel Tempering algorithm in the context of discrete-time Markov chain Monte Carlo methods for Bayesian inverse problems. These generalizations use state-dependent swapping rates…
Bayesian inference with subset simulation in varying dimensions applied to the Karhunen–Loève expansion
Uncertainties associated with spatially varying parameters are modeled through random fields discretized into a finite number of random variables. Standard discretization methods, such as the Karhunen–Loève expansion, use series representa…
Analysis of stochastic gradient descent in continuous time