ResearchTrend.AI



Spike-timing-dependent Hebbian learning as noisy gradient descent

15 May 2025
Niklas Dexheimer
Sascha Gaudlitz
Johannes Schmidt-Hieber
Abstract

Hebbian learning is a key principle underlying learning in biological neural networks. It postulates that synaptic changes occur locally, depending on the activities of pre- and postsynaptic neurons. While Hebbian learning based on neuronal firing rates is well explored, much less is known about learning rules that account for precise spike-timing. We relate a Hebbian spike-timing-dependent plasticity rule to noisy gradient descent with respect to a natural loss function on the probability simplex. This connection allows us to prove that the learning rule eventually identifies the presynaptic neuron with the highest activity. We also discover an intrinsic connection to noisy mirror descent.
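The abstract's main claim — that a multiplicative, spike-driven update on the probability simplex eventually concentrates on the most active presynaptic neuron — can be illustrated with a toy simulation. This is a hypothetical sketch, not the paper's exact plasticity rule: it uses Bernoulli spikes as the presynaptic activity and an exponentiated-gradient (mirror-descent-style) update with renormalization, which mirrors the noisy-mirror-descent connection the abstract describes.

```python
import numpy as np

rng = np.random.default_rng(0)

# Three presynaptic neurons with different firing probabilities;
# neuron 2 (rate 0.9) is the most active.
rates = np.array([0.2, 0.5, 0.9])

# Synaptic weights start uniform on the probability simplex.
w = np.full(3, 1.0 / 3.0)
eta = 0.05  # learning rate

for _ in range(5000):
    spikes = rng.binomial(1, rates)  # noisy presynaptic spike observations
    w = w * np.exp(eta * spikes)     # multiplicative (mirror-descent-style) step
    w = w / w.sum()                  # renormalize back onto the simplex

print(int(np.argmax(w)))  # → 2: the weight mass settles on the most active neuron
```

Because each weight grows by a factor tied to that neuron's spike count, the log-weights separate linearly in the rate gap, so the noise from individual Bernoulli spikes washes out over many updates and the highest-rate neuron wins.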

@article{dexheimer2025_2505.10272,
  title={Spike-timing-dependent Hebbian learning as noisy gradient descent},
  author={Niklas Dexheimer and Sascha Gaudlitz and Johannes Schmidt-Hieber},
  journal={arXiv preprint arXiv:2505.10272},
  year={2025}
}