S. Lam
Convergence Analysis of Real-time Recurrent Learning (RTRL) for a class of Recurrent Neural Networks
Recurrent neural networks (RNNs) are commonly trained with the truncated backpropagation-through-time (TBPTT) algorithm. For computational tractability, TBPTT truncates the chain rule and calculates the gradie…
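The truncation the abstract refers to can be made concrete with a minimal sketch. The toy model below (a scalar linear RNN h_t = w·h_{t-1} + x_t with loss L = h_T, chosen here purely for illustration; the paper's model class is more general) shows how TBPTT keeps only the K most recent terms of the chain-rule sum instead of backpropagating to t = 0:

```python
# Sketch of gradient truncation in TBPTT, assuming a scalar linear RNN
# h_t = w*h_{t-1} + x_t with loss L = h_T (a toy model, not the paper's).

def forward(w, xs, h0=0.0):
    """Run the scalar RNN and return the hidden states h_1..h_T."""
    hs, h = [], h0
    for x in xs:
        h = w * h + x
        hs.append(h)
    return hs

def tbptt_grad(w, xs, K, h0=0.0):
    """Gradient of L = h_T w.r.t. w, truncated to the last K time steps.

    Full BPTT expands dL/dw = sum_{k=0}^{T-1} w**k * h_{T-1-k};
    TBPTT keeps only the first K terms of that sum.
    """
    hs = [h0] + forward(w, xs, h0)  # hs[t] = h_t for t = 0..T
    T = len(xs)
    return sum(w**k * hs[T - 1 - k] for k in range(min(K, T)))

w, xs = 0.5, [1.0, 1.0, 1.0]
g_full = tbptt_grad(w, xs, K=len(xs))  # untruncated gradient (full BPTT)
g_trunc = tbptt_grad(w, xs, K=1)       # keep only the most recent step
```

The truncated gradient `g_trunc` is biased relative to `g_full`; the bias shrinks as K grows, which is the trade-off the convergence analysis is concerned with.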
Weak Convergence Analysis of Online Neural Actor-Critic Algorithms
We prove that a single-layer neural network trained with the online actor-critic algorithm converges in distribution to a random ordinary differential equation (ODE) as the number of hidden units and the number of training steps $\rightarr…
Deep Neural Network Initialization with Sparsity Inducing Activations
Inducing and leveraging sparse activations during training and inference is a promising avenue for improving the computational efficiency of deep networks, which is increasingly important as network sizes continue to grow and their applica…
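A small sketch of the idea behind sparse activations at initialization. The shifted ReLU below, phi(z) = max(z − tau, 0), is one illustrative choice of sparsity-inducing activation (an assumption for this example; the paper's exact activations and initialization analysis may differ). Applied to Gaussian pre-activations at initialization, it zeroes every unit whose pre-activation falls below the threshold:

```python
# Sketch: measuring activation sparsity at initialization, assuming a
# shifted-ReLU activation phi(z) = max(z - tau, 0) and standard Gaussian
# pre-activations (illustrative choices, not necessarily the paper's).

import random

def shifted_relu(z, tau=1.0):
    """Zeroes every pre-activation below the threshold tau."""
    return max(z - tau, 0.0)

random.seed(0)
pre = [random.gauss(0.0, 1.0) for _ in range(10_000)]  # init-time pre-activations
post = [shifted_relu(z) for z in pre]

# Fraction of units that are exactly zero after the activation; for tau = 1
# this is roughly P(Z < 1) for a standard Gaussian, i.e. about 0.84.
sparsity = sum(p == 0.0 for p in post) / len(post)
```

Larger tau yields sparser activations, so fewer units participate in each forward pass, which is the source of the computational savings the abstract points to.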
Kernel Limit of Recurrent Neural Networks Trained on Ergodic Data Sequences
Mathematical methods are developed to characterize the asymptotics of recurrent neural networks (RNNs) as the number of hidden units, data samples in the sequence, hidden state updates, and training steps simultaneously grow to infinity. In…