Characterizing the impact of noise on neural computation

Video Link: https://www.youtube.com/watch?v=tcY0MZQGqWE



Duration: 32:38


Alex Williams (NYU, Flatiron Institute)
https://simons.berkeley.edu/talks/alex-williams-nyu-flatiron-institute-2024-06-03
Understanding Lower-Level Intelligence from AI, Psychology, and Neuroscience Perspectives

Noise is a ubiquitous property of biological neural circuits—trial-to-trial variance in neural spike counts is often equal to or greater in magnitude than the mean neural response. Noise has also been incorporated into artificial neural networks, for example as a form of regularization. But the detailed impact of noise on neural computations and hidden-layer representations remains poorly understood. In neuroscience, a primary challenge has been to accurately estimate the statistics of noise from limited trials. To address this, we introduce a statistical model that leverages smoothness in experimental paradigms to derive efficient estimates of trial-to-trial noise covariance. Furthermore, to compare the structure of noise between artificial and biological networks, we propose a novel measure of neural representational similarity for stochastic networks. Together, these analytic tools enable new lines of investigation into non-deterministic modes of neural network function.
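To make the setup concrete, here is a minimal NumPy sketch of the quantities the abstract refers to: Poisson spike counts, whose trial-to-trial variance equals the mean response (Fano factor of one), and a naive sample estimate of the trial-to-trial noise covariance. This is an illustrative toy, not the smoothness-regularized estimator presented in the talk; the rates and trial counts are invented for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated spike counts: repeated trials of one condition, several neurons.
# Poisson noise gives trial-to-trial variance equal to the mean response,
# i.e. a Fano factor near one, matching the regime described above.
n_trials, n_neurons = 200, 5
mean_rates = np.array([2.0, 5.0, 1.0, 8.0, 3.0])  # hypothetical firing rates
counts = rng.poisson(mean_rates, size=(n_trials, n_neurons))

# Naive noise-covariance estimate: sample covariance of residuals around
# the trial-averaged response. With few trials this estimate is noisy,
# which is what motivates regularized estimators like the one in the talk.
residuals = counts - counts.mean(axis=0)
noise_cov = residuals.T @ residuals / (n_trials - 1)

# Per-neuron Fano factor: noise variance divided by mean count.
fano = np.diag(noise_cov) / counts.mean(axis=0)
```

With many trials the Fano factors here concentrate near one; with the handful of trials typical of real experiments, the raw covariance estimate degrades, which is the limited-trials challenge the abstract highlights.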







Tags:
Simons Institute
theoretical computer science
UC Berkeley
Computer Science
Theory of Computation
Theory of Computing
Understanding Lower-Level Intelligence from AI, Psychology, and Neuroscience Perspectives
Alex Williams