Chen Zeno
When Diffusion Models Memorize: Inductive Biases in Probability Flow of Minimum-Norm Shallow Neural Nets
While diffusion models generate high-quality images via probability flow, the theoretical understanding of this process remains incomplete. A key question is when probability flow converges to training samples or more general points on the…
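The convergence of probability flow to training samples can be illustrated with a toy sketch. Below is a minimal example, assuming a variance-exploding noise schedule with sigma(t) = t and a uniform empirical distribution over a handful of hypothetical training points; under those assumptions the exact MMSE denoiser has a closed form, and integrating the flow to small t lands on a training sample (memorization). This is a generic illustration, not the paper's construction.

```python
import numpy as np

# Sketch: probability-flow ODE driven by the *empirical* score of a tiny
# training set (variance-exploding noise, sigma(t) = t). For the empirical
# distribution the exact MMSE denoiser has a closed form, and integrating
# the flow down to t ~ 0 lands on a training sample, i.e. memorization.
# Toy illustration only; the points and schedule are made up for this example.

rng = np.random.default_rng(1)
train = np.array([[0.0, 0.0], [3.0, 0.0], [0.0, 3.0]])  # toy training set

def denoise(x, t):
    # E[x0 | x_t] for p_0 uniform over training points:
    # softmax(-||x - x_i||^2 / (2 t^2))-weighted average of the x_i.
    d2 = ((train - x) ** 2).sum(axis=1)
    w = np.exp(-(d2 - d2.min()) / (2.0 * t * t))
    return (w / w.sum()) @ train

# Euler steps for dx/dt = (x - denoise(x, t)) / t, from t_max down to t_min.
ts = np.geomspace(10.0, 1e-3, 200)
x = rng.normal(scale=ts[0], size=2)          # sample from the wide prior
for t, t_next in zip(ts[:-1], ts[1:]):
    x = x + (t_next - t) * (x - denoise(x, t)) / t

print(np.round(x, 3))  # x sits numerically on one of the training points
```

With the exact empirical score, every prior sample is attracted to some training point; the paper's question is how this picture changes when the score is replaced by a minimum-norm shallow network.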
How do Minimum-Norm Shallow Denoisers Look in Function Space?
Neural network (NN) denoisers are an essential building block in many common tasks, ranging from image reconstruction to image generation. However, the success of these models is not well understood from a theoretical perspective. In this …
Task-Agnostic Continual Learning Using Online Variational Bayes with Fixed-Point Updates
Catastrophic forgetting is the notorious vulnerability of neural networks to changes in the data distribution during learning. This phenomenon has long been considered a major obstacle to using learning agents in realistic continual l…
Task Agnostic Continual Learning Using Online Variational Bayes
Catastrophic forgetting is the notorious vulnerability of neural networks to changes in the data distribution during learning. This phenomenon has long been considered a major obstacle to using learning agents in realistic…
Bayesian Gradient Descent: Online Variational Bayes Learning with Increased Robustness to Catastrophic Forgetting and Weight Pruning.
We suggest a novel approach for estimating the posterior distribution of the weights of a neural network, using an online version of the variational Bayes method. Having a confidence measure of the weights allows us to combat several s…
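The idea of maintaining a posterior over weights with online variational Bayes can be sketched as follows. This is a minimal, generic illustration on a linear model, assuming a diagonal Gaussian posterior and reparameterized Monte Carlo gradients; it is not the paper's exact Bayesian Gradient Descent update rule, and it omits the KL term of the full variational objective for brevity.

```python
import numpy as np

# Minimal sketch of online variational Bayes over model weights, assuming a
# diagonal Gaussian posterior q(w) = N(mu, diag(sigma^2)) and reparameterized
# Monte Carlo gradients. Generic illustration on a linear model; NOT the
# paper's exact Bayesian Gradient Descent (BGD) update rule, and the KL term
# of the full variational objective is omitted for brevity.

rng = np.random.default_rng(0)
w_true = np.array([2.0, -1.0])   # ground-truth weights of the streaming task

mu = np.zeros(2)                 # posterior mean
rho = np.zeros(2)                # sigma = softplus(rho) keeps sigma positive
lr, n_mc = 0.05, 8               # learning rate, MC samples per step

softplus = lambda z: np.log1p(np.exp(z))
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

for step in range(2000):         # one (x, y) observation per step: online
    x = rng.normal(size=2)
    y = x @ w_true + 0.1 * rng.normal()

    sigma = softplus(rho)
    eps = rng.normal(size=(n_mc, 2))
    w = mu + sigma * eps                      # reparameterized weight samples
    err = w @ x - y                           # residual per MC sample
    grad_w = err[:, None] * x[None, :]        # dL/dw for squared loss

    mu -= lr * grad_w.mean(axis=0)                          # d w / d mu = 1
    rho -= lr * (grad_w * eps).mean(axis=0) * sigmoid(rho)  # chain rule via sigma

print(np.round(mu, 1))  # posterior mean approaches w_true; sigma shrinks
```

The per-weight sigma plays the role of the confidence measure mentioned in the abstract: weights whose posterior stays wide contribute little and are natural candidates for pruning, while narrow posteriors mark weights that should change slowly to avoid forgetting.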