Eric Shea-Brown
Temporal Deconvolution of Mesoscale Recordings
Mesoscale calcium imaging techniques, such as wide-field imaging, enable high temporal resolution recordings of extensive neuronal activity across one or more brain regions. However, since the recordings capture light emission generated by…
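The inverse problem here can be sketched generically: fluorescence is modeled as underlying activity convolved with an exponentially decaying indicator kernel plus noise, and deconvolution inverts that map. A minimal numpy sketch, with kernel shape, noise level, and ridge penalty as illustrative assumptions rather than the paper's actual method:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate underlying activity and an exponentially decaying indicator
# kernel (a common generic model for calcium fluorescence).
T, tau, dt = 500, 0.5, 0.05          # timepoints, indicator decay (s), bin (s)
activity = rng.binomial(1, 0.05, T).astype(float)
kernel = np.exp(-np.arange(0, 5 * tau, dt) / tau)
fluor = np.convolve(activity, kernel)[:T] + 0.1 * rng.standard_normal(T)

# Write the convolution as a matrix and invert it with ridge
# regularization: a_hat = argmin ||K a - f||^2 + lam ||a||^2.
K = np.zeros((T, T))
for i in range(T):
    j = np.arange(i, min(T, i + len(kernel)))
    K[j, i] = kernel[: len(j)]
lam = 1.0
a_hat = np.linalg.solve(K.T @ K + lam * np.eye(T), K.T @ fluor)

print("correlation with ground truth:", np.corrcoef(a_hat, activity)[0, 1])
```

The ridge penalty stands in for whatever prior an actual method would use; sparsity-promoting penalties are a common alternative for spike-like activity.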
Global and local origins of trial-to-trial spike count variability in visual cortex
Sensory neuron spiking responses vary across repeated presentations of the same stimuli, but whether this trial-to-trial variability represents noise versus unidentified signals remains unresolved. Some of the variability can be attributed…
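Two standard quantities make the question concrete: the Fano factor (trial-to-trial count variance over mean, per neuron) and the fraction of covariance explained by a leading shared mode. A toy sketch assuming a simple global-gain generative model, not the paper's analysis:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy spike counts: n_trials x n_neurons, with a shared "global" gain
# fluctuation plus private Poisson noise.
n_trials, n_neurons, mean_rate = 200, 50, 5.0
gain = 1.0 + 0.3 * rng.standard_normal(n_trials)      # global modulator
rates = np.outer(np.clip(gain, 0.1, None), np.full(n_neurons, mean_rate))
counts = rng.poisson(rates)

# Fano factor: trial-to-trial variance over mean, per neuron.
fano = counts.var(axis=0) / counts.mean(axis=0)

# Fraction of covariance captured by the leading eigenvector is a crude
# proxy for globally shared variability.
resid = counts - counts.mean(axis=0)
evals = np.linalg.eigvalsh(np.cov(resid.T))
shared_frac = evals[-1] / evals.sum()

print(f"mean Fano factor: {fano.mean():.2f}, shared fraction: {shared_frac:.2f}")
```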
KPFlow: An Operator Perspective on Dynamic Collapse Under Gradient Descent Training of Recurrent Networks
Gradient Descent (GD) and its variants are the primary tool for enabling efficient training of recurrent dynamical systems such as Recurrent Neural Networks (RNNs), Neural ODEs, and Gated Recurrent Units (GRUs). The dynamics that are formed…
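As context for what GD training of a recurrent system looks like, here is a minimal vanilla RNN trained by manual backpropagation through time on a toy memory task (report the first input after T steps). This is a generic sketch of the training setup the paper analyzes, not the KPFlow operator method itself:

```python
import numpy as np

rng = np.random.default_rng(2)

N, T, lr = 32, 10, 0.02
W = rng.standard_normal((N, N)) / np.sqrt(N)
w_in = rng.standard_normal(N) / np.sqrt(N)
w_out = rng.standard_normal(N) / np.sqrt(N)

def forward(x0):
    hs, h = [np.zeros(N)], np.zeros(N)
    for t in range(T):
        h = np.tanh(W @ h + w_in * (x0 if t == 0 else 0.0))
        hs.append(h)
    return hs, w_out @ h

for step in range(500):
    x0 = rng.choice([-1.0, 1.0])
    hs, y = forward(x0)
    err = y - x0                              # d(loss)/dy for squared error
    dW, d_win = np.zeros_like(W), np.zeros(N)
    dh = err * w_out
    for t in range(T, 0, -1):                 # backprop through time
        dz = dh * (1 - hs[t] ** 2)
        dW += np.outer(dz, hs[t - 1])
        if t == 1:
            d_win += dz * x0                  # input only enters at t = 1
        dh = W.T @ dz
    w_out -= lr * err * hs[T]
    W -= lr * dW
    w_in -= lr * d_win

print("prediction for x0 = +1:", forward(1.0)[1])
```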
Impact of Local Connectivity Patterns on Excitatory-Inhibitory Network Dynamics
Networks of excitatory and inhibitory (EI) neurons form a canonical circuit in the brain. Seminal theoretical results on the dynamics of such networks are based on the assumption that synaptic strengths depend on the type of neurons they c…
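The canonical object of study is a rate network obeying Dale's law: excitatory neurons make only positive synapses and inhibitory neurons only negative ones. A minimal simulation sketch, with connectivity statistics chosen for illustration rather than taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(3)

# First Ne columns (excitatory outputs) positive, last Ni (inhibitory) negative.
Ne, Ni = 80, 20
N = Ne + Ni
J = np.abs(rng.standard_normal((N, N))) / np.sqrt(N)
J[:, Ne:] *= -4.0                    # inhibition, scaled to roughly balance

dt, tau, steps = 0.1, 1.0, 500
r = rng.random(N) * 0.1
for _ in range(steps):
    drive = J @ r + 0.5              # recurrent input plus constant drive
    r += dt / tau * (-r + 1.0 / (1.0 + np.exp(-drive)))

print("mean E activity:", r[:Ne].mean(), " mean I activity:", r[Ne:].mean())
```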
Data Heterogeneity Limits the Scaling Effect of Pretraining in Neural Data Transformers
A key challenge in analyzing neuroscience datasets is the profound variability they exhibit across sessions, animals, and data modalities, i.e., heterogeneity. Several recent studies have demonstrated performance gains from pretraining neur…
Identifying the impact of local connectivity patterns on dynamics in excitatory-inhibitory networks.
Networks of excitatory and inhibitory (EI) neurons form a canonical circuit in the brain. Seminal theoretical results on the dynamics of such networks are based on the assumption that synaptic strengths depend on the type of neurons they conne…
Transition to chaos separates learning regimes and relates to measure of consciousness in recurrent neural networks
Recurrent neural networks exhibit chaotic dynamics when the variance in their connection strengths exceeds a critical value. Recent work indicates connection variance also modulates learning strategies; networks learn "rich" representations…
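The underlying transition is the classic result for random recurrent networks: with couplings of variance g²/N, dynamics become chaotic once the gain g exceeds 1. One way to see it is to track the divergence of two trajectories started a tiny distance apart; a sketch assuming the standard rate model:

```python
import numpy as np

rng = np.random.default_rng(4)

# Random rate network dx/dt = -x + J tanh(x), with J entries of variance
# g^2/N. Below g = 1 nearby trajectories converge; above, they diverge.
def final_separation(g, N=200, steps=2000, dt=0.1, eps=1e-6):
    J = g * rng.standard_normal((N, N)) / np.sqrt(N)
    x = rng.standard_normal(N)
    y = x + eps * rng.standard_normal(N)
    for _ in range(steps):
        x += dt * (-x + J @ np.tanh(x))
        y += dt * (-y + J @ np.tanh(y))
    return np.linalg.norm(x - y)

for g in (0.5, 1.5):
    print(f"g = {g}: separation after transient = {final_separation(g):.2e}")
```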
How connectivity structure shapes rich and lazy learning in neural circuits.
In theoretical neuroscience, recent work leverages deep learning tools to explore how certain attributes of a network critically influence its learning dynamics. Notably, initial weight distributions with small (resp. large) variance may yield a r…
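One common probe of the rich/lazy distinction is how far weights travel from initialization relative to their initial norm: lazy solutions barely move, rich solutions reorganize. A sketch for a two-layer ReLU network under plain gradient descent, with sizes, task, and learning rate as illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(5)

def relative_weight_change(sigma, H=100, D=10, steps=300, lr=1e-3):
    W1 = sigma * rng.standard_normal((H, D))
    w2 = sigma * rng.standard_normal(H)
    W1_init = W1.copy()
    X = rng.standard_normal((50, D))
    y = np.sin(X[:, 0])                      # toy regression target
    for _ in range(steps):
        Z = X @ W1.T                         # pre-activations
        A = np.maximum(Z, 0)                 # ReLU
        err = A @ w2 - y
        gw2 = A.T @ err / len(X)
        gW1 = ((err[:, None] * (Z > 0)) * w2).T @ X / len(X)
        w2 -= lr * gw2
        W1 -= lr * gW1
    return np.linalg.norm(W1 - W1_init) / np.linalg.norm(W1_init)

for sigma in (0.05, 1.0):
    print(f"init std {sigma}: relative weight change {relative_weight_change(sigma):.3f}")
```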
Evolutionary algorithms as an alternative to backpropagation for supervised training of Biophysical Neural Networks and Neural ODEs
Training networks consisting of biophysically accurate neuron models could allow for new insights into how brain circuits can organize and solve tasks. We begin by analyzing the extent to which the central algorithm for neural network lear…
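For concreteness, here is a basic evolution strategy of the kind such studies compare against backprop: perturb parameters with Gaussian noise, score each perturbation, and move along the fitness-weighted average, with no gradients required. The model and hyperparameters are illustrative; the paper's algorithms and biophysical networks are more involved:

```python
import numpy as np

rng = np.random.default_rng(6)

def fitness(theta, X, y):
    pred = np.tanh(X @ theta)
    return -np.mean((pred - y) ** 2)         # negative MSE

X = rng.standard_normal((100, 5))
y = np.tanh(X @ np.array([1.0, -2.0, 0.5, 0.0, 1.5]))
theta = np.zeros(5)
sigma, lr, pop = 0.1, 0.3, 50

for gen in range(200):
    noise = rng.standard_normal((pop, theta.size))
    scores = np.array([fitness(theta + sigma * n, X, y) for n in noise])
    scores = (scores - scores.mean()) / (scores.std() + 1e-8)
    theta += lr / (pop * sigma) * noise.T @ scores   # ES gradient estimate

print("recovered weights:", np.round(theta, 2))
```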
Attention for Causal Relationship Discovery from Biological Neural Dynamics
This paper explores the potential of transformer models for learning Granger causality in networks with complex nonlinear dynamics at every node, as in neurobiological and biophysical networks. Our study primarily focuses on a proof-of…
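As a reference point for the idea, classical linear Granger causality asks whether the past of x improves prediction of y beyond y's own past. A minimal sketch of that baseline (the paper's contribution is the transformer/attention-based version, not this):

```python
import numpy as np

rng = np.random.default_rng(7)

# Coupled AR(1) processes: x drives y but not vice versa.
T = 2000
x, y = np.zeros(T), np.zeros(T)
for t in range(1, T):
    x[t] = 0.5 * x[t - 1] + rng.standard_normal()
    y[t] = 0.5 * y[t - 1] + 0.4 * x[t - 1] + rng.standard_normal()

def resid_var(target, predictors):
    A = np.column_stack(predictors)
    coef, *_ = np.linalg.lstsq(A, target, rcond=None)
    return np.var(target - A @ coef)

full = resid_var(y[1:], [y[:-1], x[:-1]])        # with x's past
restricted = resid_var(y[1:], [y[:-1]])          # y's past only
print("Granger log variance ratio x -> y:", np.log(restricted / full))
```

A positive log ratio indicates that including x's past reduces the prediction error for y, the signature of Granger causality.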
Modeling functional cell types in spike train data
A major goal of computational neuroscience is to build accurate models of the activity of neurons that can be used to interpret their function in circuits. Here, we explore using functional cell types to refine single-cell models by groupi…
A simple connection from loss flatness to compressed neural representations
Sharpness, a geometric measure in the parameter space that reflects the flatness of the loss landscape, has long been studied for its potential connections to neural network behavior. While sharpness is often associated with generalization…
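One simple operationalization of sharpness is the average loss increase under small random parameter perturbations of fixed norm, evaluated at a minimum. A sketch on a toy logistic regression, as a generic probe rather than the paper's specific measure:

```python
import numpy as np

rng = np.random.default_rng(8)

X = rng.standard_normal((200, 10))
w_true = rng.standard_normal(10)
y = (X @ w_true > 0).astype(float)

def loss(w):
    p = 1 / (1 + np.exp(-X @ w))
    return -np.mean(y * np.log(p + 1e-12) + (1 - y) * np.log(1 - p + 1e-12))

# Crude gradient descent to a minimum.
w = np.zeros(10)
for _ in range(500):
    p = 1 / (1 + np.exp(-X @ w))
    w -= 0.5 * X.T @ (p - y) / len(X)

# Sharpness proxy: mean loss bump over random perturbations of norm rho.
rho, n_probe = 0.05, 100
base = loss(w)
bumps = []
for _ in range(n_probe):
    d = rng.standard_normal(10)
    d *= rho / np.linalg.norm(d)
    bumps.append(loss(w + d) - base)
print(f"sharpness proxy at rho = {rho}: {np.mean(bumps):.4f}")
```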
Expressive probabilistic sampling in recurrent neural networks
In sampling-based Bayesian models of brain function, neural activities are assumed to be samples from probability distributions that the brain uses for probabilistic computation. However, a comprehensive understanding of how mechanistic mo…
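The simplest recurrent "neural" sampler is Langevin dynamics: a noisy linear recurrent system whose stationary distribution is a target Gaussian. The paper concerns far more expressive samplers; this is only the textbook base case:

```python
import numpy as np

rng = np.random.default_rng(9)

# Target Gaussian N(mu, Sigma); Langevin dynamics
# dx = -P (x - mu) dt + sqrt(2 dt) * noise, with P the precision matrix.
mu = np.array([1.0, -1.0])
Sigma = np.array([[1.0, 0.6], [0.6, 1.0]])
P = np.linalg.inv(Sigma)

dt, T = 0.01, 50_000
x = np.zeros(2)
samples = np.zeros((T, 2))
for t in range(T):
    x += -dt * P @ (x - mu) + np.sqrt(2 * dt) * rng.standard_normal(2)
    samples[t] = x

# Discard a burn-in and compare empirical moments to the target.
print("empirical mean:", samples[T // 2:].mean(axis=0))
print("empirical cov:\n", np.cov(samples[T // 2:].T))
```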
Heterogeneity in Neuronal Dynamics Is Learned by Gradient Descent for Temporal Processing Tasks
Individual neurons in the brain have complex intrinsic dynamics that are highly diverse. We hypothesize that the complex dynamics produced by networks of complex and heterogeneous neurons may contribute to the brain's ability to process an…
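A quick way to see why heterogeneous intrinsic dynamics can help: a bank of leaky units with diverse time constants spans more temporal filters than identical units, so a linear readout reconstructs a delayed input better. An illustrative sketch (the paper instead learns the time constants by gradient descent):

```python
import numpy as np

rng = np.random.default_rng(10)

def readout_error(taus, delay=20, T=1000):
    u = rng.standard_normal(T)
    H = np.zeros((T, len(taus)))
    h = np.zeros(len(taus))
    for t in range(T):
        h += (1 / taus) * (-h + u[t])        # leaky units, per-unit tau
        H[t] = h
    target = np.roll(u, delay)
    target[:delay] = 0
    coef, *_ = np.linalg.lstsq(H, target, rcond=None)
    return np.mean((H @ coef - target) ** 2)

homog = np.full(50, 10.0)                    # identical time constants
hetero = np.linspace(2.0, 50.0, 50)          # diverse time constants
print("homogeneous taus, readout error:", readout_error(homog))
print("heterogeneous taus, readout error:", readout_error(hetero))
```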
Learning dynamics of deep linear networks with multiple pathways.
Not only have deep networks become standard in machine learning; they are also increasingly of interest in neuroscience as models of cortical computation that capture relationships between structural and functional properties. In addition they …
MouseNet: A biologically constrained convolutional neural network model for the mouse visual cortex
Convolutional neural networks trained on object recognition derive inspiration from the neural architecture of the visual system in mammals, and have been used as models of the feedforward computation performed in the primate ventral strea…
A scale-dependent measure of system dimensionality
A fundamental problem in science is uncovering the effective number of degrees of freedom in a complex system: its dimensionality. A system's dimensionality depends on its spatiotemporal scale. Here, we introduce a scale-dependent generali…
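A widely used global estimate is the participation ratio, PR = (Σλ)² / Σλ², over PCA eigenvalues λ; evaluating it within neighborhoods of different radii gives a schematic version of scale dependence. A sketch on a noisy 1D curve embedded in 10D (the construction and radii are illustrative assumptions, not the paper's generalization):

```python
import numpy as np

rng = np.random.default_rng(11)

# A circle (locally 1D) in the first two of 10 dimensions, plus small
# isotropic noise: low-dimensional at large scale, noise-dominated and
# higher-dimensional at small scale.
t = rng.uniform(0, 2 * np.pi, 2000)
curve = np.stack([np.cos(t), np.sin(t)] + [np.zeros_like(t)] * 8, axis=1)
X = curve + 0.02 * rng.standard_normal(curve.shape)

def participation_ratio(points):
    lam = np.clip(np.linalg.eigvalsh(np.cov(points.T)), 0, None)
    return lam.sum() ** 2 / (lam ** 2).sum()

# Large scale: all points. Small scale: a ball of radius near the noise size.
d = np.linalg.norm(X - X[0], axis=1)
local = X[d < 0.1]
print("global PR:", participation_ratio(X))
print(f"local PR ({len(local)} pts):", participation_ratio(local))
```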
Beyond accuracy: generalization properties of bio-plausible temporal credit assignment rules
To unveil how the brain learns, ongoing work seeks biologically-plausible approximations of gradient descent algorithms for training recurrent neural networks (RNNs). Yet, beyond task accuracy, it is unclear if such learning rules converge…
Biologically-plausible backpropagation through arbitrary timespans via local neuromodulators
The spectacular successes of recurrent neural network models where key parameters are adjusted via backpropagation-based gradient descent have inspired much thought as to how biological neuronal networks might solve the corresponding synap…
Single Circuit in V1 Capable of Switching Contexts During Movement Using an Inhibitory Population as a Switch
As animals adapt to their environments, their brains are tasked with processing stimuli in different sensory contexts. Whether these computations are context dependent or independent, they are all implemented in the same neural tissue. A c…
Cell-type–specific neuromodulation guides synaptic credit assignment in a spiking neural network
Significance: Synaptic connectivity provides the foundation for our present understanding of neuronal network function, but static connectivity cannot explain learning and memory. We propose a computational role for the diversity of cortica…
A biologically inspired architecture with switching units can learn to generalize across backgrounds
Humans and other animals navigate different landscapes and environments with ease, a feat that requires the brain’s ability to rapidly and accurately adapt to different visual domains, generalizing across contexts/backgrounds. Despite rece…
CNN MouseNet: A biologically constrained convolutional neural network model for mouse visual cortex
Convolutional neural networks trained on object recognition derive inspiration from the neural architecture of the visual system in primates, and have been used as models of the feedforward computation performed in the primate ventral stre…
Identification of Multiple Noise Sources Improves Estimation of Neural Responses across Stimulus Conditions
Most models of neural responses are constructed to reproduce the average response to inputs but lack the flexibility to capture observed variability in responses. The origins and structure of this variability have significant implications …
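The flavor of this problem can be illustrated with two noise models that share a mean but differ in variance structure: pure Poisson noise (variance equal to the mean) versus gain-modulated Poisson noise (variance growing with the mean squared). A toy sketch of that signature; the paper's actual noise models and fitting procedure are richer:

```python
import numpy as np

rng = np.random.default_rng(12)

# Pure Poisson: var = mean. Gain-modulated Poisson: var = mean + c * mean^2,
# so the two diverge increasingly at high firing rates.
means = np.array([1.0, 2.0, 5.0, 10.0, 20.0])
n_trials = 5000
for mu in means:
    poisson = rng.poisson(mu, n_trials)
    gain = 1.0 + 0.3 * rng.standard_normal(n_trials)
    modulated = rng.poisson(np.clip(gain, 0.05, None) * mu)
    print(f"mean {mu:5.1f}: Poisson var {poisson.var():7.2f}, "
          f"modulated var {modulated.var():7.2f}")
```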