ResearchTrend.AI
Inferring stochastic low-rank recurrent neural networks from neural data

24 June 2024
Matthijs Pals
A Erdem Sağtekin
Felix Pei
Manuel Gloeckler
Jakob H Macke
Abstract

A central aim in computational neuroscience is to relate the activity of large populations of neurons to an underlying dynamical system. Models of these neural dynamics should ideally be both interpretable and fit the observed data well. Low-rank recurrent neural networks (RNNs) exhibit such interpretability by having tractable dynamics. However, it is unclear how to best fit low-rank RNNs to data consisting of noisy observations of an underlying stochastic system. Here, we propose to fit stochastic low-rank RNNs with variational sequential Monte Carlo methods. We validate our method on several datasets consisting of both continuous and spiking neural data, where we obtain lower-dimensional latent dynamics than current state-of-the-art methods. Additionally, for low-rank models with piecewise-linear nonlinearities, we show how to efficiently identify all fixed points at polynomial rather than exponential cost in the number of units, making analysis of the inferred dynamics tractable for large RNNs. Our method both elucidates the dynamical systems underlying experimental recordings and provides a generative model whose trajectories match observed variability.
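To make the object of study concrete, the sketch below simulates a generic stochastic low-rank RNN in NumPy: the recurrent weight matrix is constrained to a rank-r outer product, so the effective dynamics live in an r-dimensional subspace. This is a minimal illustration of the model class only, not the paper's inference method (variational sequential Monte Carlo) or its fixed-point enumeration; the tanh nonlinearity, Euler-Maruyama step, and all parameter values are illustrative assumptions.

```python
import numpy as np

def simulate_low_rank_rnn(n_units=100, rank=2, n_steps=200, dt=0.1,
                          noise_std=0.05, seed=0):
    """Simulate x_{t+1} = x_t + dt * (-x_t + (m n^T / N) phi(x_t)) + noise,
    where m, n are N x r, so the recurrent weights have rank r."""
    rng = np.random.default_rng(seed)
    m = rng.normal(size=(n_units, rank))  # left connectivity vectors
    n = rng.normal(size=(n_units, rank))  # right connectivity vectors
    x = np.zeros(n_units)
    traj = np.empty((n_steps, n_units))
    for t in range(n_steps):
        phi = np.tanh(x)  # smooth nonlinearity (the paper also treats piecewise-linear ones)
        # Rank-r recurrence evaluated as m @ (n.T @ phi): O(N * r), never forms the N x N matrix.
        drift = -x + m @ (n.T @ phi) / n_units
        x = x + dt * drift + np.sqrt(dt) * noise_std * rng.normal(size=n_units)
        traj[t] = x
    return traj, m

traj, m = simulate_low_rank_rnn()
# Activity is (up to noise) confined to the span of m's columns,
# so projecting onto them recovers low-dimensional latent trajectories.
latents = traj @ m / np.linalg.norm(m, axis=0) ** 2
```

Because the weight matrix never needs to be materialized, each step costs O(N·r) rather than O(N²), which is one reason low-rank RNNs remain analyzable at scale.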

@article{pals2025_2406.16749,
  title={Inferring stochastic low-rank recurrent neural networks from neural data},
  author={Matthijs Pals and A Erdem Sağtekin and Felix Pei and Manuel Gloeckler and Jakob H Macke},
  journal={arXiv preprint arXiv:2406.16749},
  year={2025}
}