Junmei Yang
Diffusion Secant Alignment for Score-Based Density Ratio Estimation
Estimating density ratios has become increasingly important with the recent rise of score-based and diffusion-inspired methods. However, current tangent-based approaches rely on a high-variance learning objective, which leads to unstable t…
Dequantified Diffusion-Schrödinger Bridge for Density Ratio Estimation
Density ratio estimation is fundamental to tasks involving $f$-divergences, yet existing methods often fail under significantly different distributions or inadequately overlapping supports -- the density-chasm and the support-chasm problem…
Variational Learning of Gaussian Process Latent Variable Models through Stochastic Gradient Annealed Importance Sampling
Gaussian Process Latent Variable Models (GPLVMs) have become increasingly popular for unsupervised tasks such as dimensionality reduction and missing data recovery due to their flexibility and non-linear nature. An importance-weighted vers…
Fully Bayesian Differential Gaussian Processes through Stochastic Differential Equations
Deep Gaussian process models typically employ discrete hierarchies, but recent advancements in differential Gaussian processes (DiffGPs) have extended these models to infinite depths. However, existing DiffGP approaches often overlook the …
Flexible Bayesian Last Layer Models Using Implicit Priors and Diffusion Posterior Sampling
Bayesian Last Layer (BLL) models focus solely on uncertainty in the output layer of neural networks, demonstrating comparable performance to more complex Bayesian models. However, the use of Gaussian priors for last layer weights in Bayesi…
Neural Operator Variational Inference based on Regularized Stein Discrepancy for Deep Gaussian Processes
Deep Gaussian Process (DGP) models offer a powerful nonparametric approach for Bayesian inference, but exact inference is typically intractable, motivating the use of various approximations. However, existing approaches, such as mean-field…
Bayesian Gaussian Process ODEs via Double Normalizing Flows
Recently, Gaussian processes have been used to model the vector field of continuous dynamical systems, referred to as GPODEs, which are characterized by a probabilistic ODE equation. Bayesian inference for these models has been extensively…
Speech Dereverberation Based on Improved Wasserstein Generative Adversarial Networks
In reality, the sound we hear is disturbed not only by noise but also by reverberation, whose effects are rarely taken into account. Recently, deep learning has shown great advantages in speech signal processing. But among the existing der…