Ingmar Schuster
Machine Learning-Supported Enzyme Engineering toward Improved CO<sub>2</sub>-Fixation of Glycolyl-CoA Carboxylase
Glycolyl-CoA carboxylase (GCC) is a new-to-nature enzyme that catalyzes the key reaction in the tartronyl-CoA (TaCo) pathway, a synthetic photorespiration bypass that was recently designed to improve photosynthetic CO2 fixation. GCC was cr…
Multivariate Probabilistic Time Series Forecasting via Conditioned Normalizing Flows
Time series forecasting is often fundamental to scientific and engineering problems and enables decision making. With ever increasing data set sizes, a trivial solution to scale up predictions is to assume independence between interacting …
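A minimal sketch of the core idea of a conditioned normalizing flow — an invertible transform whose scale and shift depend on a context vector. All function names are illustrative, and the linear `conditioner` stands in for the neural network (and RNN-summarized context) the actual model would use:

```python
import numpy as np

def conditioner(context):
    # Toy conditioner: maps a context vector to per-dimension
    # log-scale and shift. A real model would use a neural net.
    log_s = 0.1 * context
    t = 0.5 * context
    return log_s, t

def flow_forward(z, context):
    # Affine flow z -> x conditioned on context; invertible by design.
    log_s, t = conditioner(context)
    x = z * np.exp(log_s) + t
    log_det = np.sum(log_s)       # log |det dx/dz|, needed for the likelihood
    return x, log_det

def flow_inverse(x, context):
    log_s, t = conditioner(context)
    return (x - t) * np.exp(-log_s)

rng = np.random.default_rng(0)
context = rng.normal(size=3)      # e.g. a summary of past observations
z = rng.normal(size=3)            # base sample from N(0, I)
x, log_det = flow_forward(z, context)
z_rec = flow_inverse(x, context)  # inverting recovers the base sample
```

The exact log-determinant is what makes flows usable for maximum-likelihood training; conditioning on the context is what turns the density model into a forecaster.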
Autoregressive Denoising Diffusion Models for Multivariate Probabilistic Time Series Forecasting
In this work, we propose TimeGrad, an autoregressive model for multivariate probabilistic time series forecasting which samples from the data distribution at each time step by estimating its gradient. To this end, we use diffusion…
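A toy illustration of sampling by estimating the gradient (score) of a distribution. TimeGrad learns the score with a denoising diffusion network; here, as an assumption for the sketch, the target is a known 1-D Gaussian whose score is analytic, and plain Langevin dynamics replaces the diffusion sampler:

```python
import numpy as np

def score(x, mu=2.0, sigma=0.5):
    # Analytic score (gradient of the log-density) of N(mu, sigma^2).
    # The actual model learns this quantity with a network.
    return -(x - mu) / sigma**2

rng = np.random.default_rng(1)
x = rng.normal(size=10000)        # start from noise
eps = 1e-2                        # Langevin step size
for _ in range(2000):
    # Langevin update: drift along the score plus injected noise.
    x = x + eps * score(x) + np.sqrt(2 * eps) * rng.normal(size=x.shape)
# After mixing, the samples approximately follow N(2.0, 0.25).
```

Knowing only the gradient of the log-density (not the density itself) suffices to draw samples, which is the property diffusion-based forecasters exploit.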
Markov Chain Importance Sampling – a highly efficient estimator for MCMC
Markov chain (MC) algorithms are ubiquitous in machine learning and statistics and many other disciplines. Typically, these algorithms can be formulated as acceptance-rejection methods. In this work we present a novel estimator applicable …
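A simplified sketch of the underlying idea: rather than discarding rejected Metropolis-Hastings proposals, reweight every proposal with importance weights. This toy uses an independence proposal, so it reduces to self-normalized importance sampling; the paper's estimator is more general:

```python
import numpy as np

rng = np.random.default_rng(2)

def log_target(x):
    # Unnormalized target: standard normal.
    return -0.5 * x**2

# Independence proposals, as a standard MH sampler would draw them.
prop_sd = 3.0
n = 50000
proposals = prop_sd * rng.normal(size=n)
log_q = -0.5 * (proposals / prop_sd)**2 - np.log(prop_sd)

# Plain MH would accept or reject each proposal; the importance-sampling
# view instead reweights *all* of them by target/proposal density.
log_w = log_target(proposals) - log_q
w = np.exp(log_w - log_w.max())
w /= w.sum()

est_mean = np.sum(w * proposals)          # estimates the target mean (0)
est_second = np.sum(w * proposals**2)     # estimates the second moment (1)
```

Because no proposal is thrown away, the weighted estimator typically has lower variance than the acceptance-rejection chain of the same length.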
A Rigorous Theory of Conditional Mean Embeddings
Conditional mean embeddings (CMEs) have proven themselves to be a powerful tool in many machine learning applications. They allow the efficient conditioning of probability distributions within the corresponding reproducing kernel Hilber…
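A minimal numerical sketch of an empirical CME: the standard kernel-ridge-style estimator evaluated on the identity feature of Y, giving an estimate of E[Y | X = x]. Bandwidth and regularization values are illustrative assumptions:

```python
import numpy as np

def gauss_kernel(A, B, bw=0.5):
    # Gaussian RBF kernel matrix between 1-D sample arrays A and B.
    d2 = (A[:, None] - B[None, :])**2
    return np.exp(-d2 / (2 * bw**2))

rng = np.random.default_rng(3)
X = rng.uniform(-3, 3, size=400)
Y = np.sin(X) + 0.1 * rng.normal(size=400)

# Empirical CME coefficients: (K + n*lam*I)^{-1} applied to the outputs.
lam = 1e-3
K = gauss_kernel(X, X)
alpha = np.linalg.solve(K + 400 * lam * np.eye(400), Y)

def cond_mean(x):
    # CME evaluated on the identity feature of Y: estimates E[Y | X = x].
    return gauss_kernel(np.atleast_1d(x), X) @ alpha
```

Evaluating `cond_mean` near 1.5 should approximately recover sin(1.5); the same coefficients embed the full conditional distribution, not just its mean, when paired with other feature functions of Y.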
Set Flow: A Permutation Invariant Normalizing Flow
We present a generative model that is defined on finite sets of exchangeable, potentially high dimensional, data. As the architecture is an extension of RealNVPs, it inherits all its favorable properties, such as being invertible and allow…
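A toy sketch of the key structural device in permutation-invariant flows: a coupling that conditions each set element on a pooled (permutation-invariant) summary of the whole set, so the transform is equivariant and invertible. The specific shift used here is an illustrative assumption, not the model's actual coupling:

```python
import numpy as np

def coupling_forward(X):
    # Shift every set element by the set mean; the mean is invariant
    # under row permutations, so the map is permutation-equivariant.
    return X + 0.5 * X.mean(axis=0, keepdims=True)

def coupling_inverse(Y):
    # mean(Y) = 1.5 * mean(X), so the shift can be undone exactly.
    return Y - (0.5 / 1.5) * Y.mean(axis=0, keepdims=True)

rng = np.random.default_rng(6)
X = rng.normal(size=(4, 3))               # a set of 4 elements in R^3
perm = rng.permutation(4)

Y = coupling_forward(X)
# Equivariance: permuting inputs permutes outputs identically.
equivariant = np.allclose(coupling_forward(X[perm]), Y[perm])
# Invertibility: the inverse recovers the original set.
invertible = np.allclose(coupling_inverse(Y), X)
```

Equivariance of the transform plus an exchangeable base density is what yields a permutation-invariant density over the set.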
Kernel Conditional Density Operators
We introduce a novel conditional density estimation model termed the conditional density operator (CDO). It naturally captures multivariate, multimodal output densities and shows performance that is competitive with recent neural condition…
Analyzing high-dimensional time-series data using kernel transfer operator eigenfunctions
Kernel transfer operators, which can be regarded as approximations of transfer operators such as the Perron-Frobenius or Koopman operator in reproducing kernel Hilbert spaces, are defined in terms of covariance and cross-covariance operato…
Exact active subspace Metropolis-Hastings, with applications to the Lorenz-96 system
We consider the application of active subspaces to inform a Metropolis-Hastings algorithm, thereby aggressively reducing the computational dimension of the sampling problem. We show that the original formulation, as proposed by Constantine…
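A minimal sketch of how an active subspace is identified before it can inform a Metropolis-Hastings proposal: estimate the matrix C = E[∇log π ∇log π<sup>T</sup>] from gradient samples and take its dominant eigenvectors. The 5-D toy log-density below, which varies along a single direction, is an illustrative assumption:

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy log-density that varies only along one direction w:
# a 1-D active subspace inside a 5-D sampling problem.
w = np.array([3.0, 1.0, 0.0, 0.0, 0.0])
w /= np.linalg.norm(w)

def grad_log_pi(x):
    # Gradient of log pi(x) = -0.5 * (w @ x)**2 (up to a constant).
    return -(w @ x) * w

# Estimate C = E[grad grad^T] over samples and eigendecompose it;
# the dominant eigenvectors span the active subspace along which a
# low-dimensional Metropolis-Hastings proposal can then operate.
samples = rng.normal(size=(1000, 5))
grads = np.array([grad_log_pi(s) for s in samples])
C = grads.T @ grads / len(samples)
eigvals, eigvecs = np.linalg.eigh(C)
active_dir = eigvecs[:, -1]       # top eigenvector, aligned with +/- w
```

Restricting the proposal to the recovered direction reduces the effective dimension of the sampler from 5 to 1 in this toy case.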
Eigendecompositions of Transfer Operators in Reproducing Kernel Hilbert Spaces
Transfer operators such as the Perron--Frobenius or Koopman operator play an important role in the global analysis of complex dynamical systems. The eigenfunctions of these operators can be used to detect metastable sets, to project the dy…
Probabilistic models of natural language semantics
This thesis tackles the problem of modeling the semantics of natural language. Neural network models are reviewed and a new Bayesian approach is developed and evaluated. As the performance of standard Monte Carlo algorithms proved to be u…
Kernel Adaptive Sequential Monte Carlo
We develop and study the Kernel Adaptive SMC (Sequential Monte Carlo) Sampler - KASS. KASS builds on the adaptive Sequential Monte Carlo (ASMC) sampler by Fearnhead (2013), marrying it with the kernel-based MCMC rejuvenation step based on S…
Gradient Importance Sampling
Adaptive Monte Carlo schemes developed over the last years usually seek to ensure ergodicity of the sampling process in line with MCMC tradition. This poses constraints on what is possible in terms of adaptation. In the general case ergodi…
Consistency of Importance Sampling estimates based on dependent sample sets and an application to models with factorizing likelihoods
In this paper, I prove that Importance Sampling estimates based on dependent sample sets are consistent under certain conditions. This can be used to reduce variance in Bayesian models with factorizing likelihoods, using sample sets that a…