Brain networks rely on precise spike timing and coordinated activity to support robust and energy-efficient learning. Inspired by these principles, spiking neural networks (SNNs) are widely regarded as promising candidates for low-power, event-driven computing. However, most biologically inspired learning rules employed in SNNs, including spike-timing-dependent plasticity (STDP), rely on isolated spike pairs and lack sensitivity to population-level activity. This limits their stability and generalization, particularly in noisy and fast-changing environments. Motivated by biological observations that neural synchrony plays a central role in learning and memory, we introduce a spike-synchrony-dependent plasticity (SSDP) rule that adjusts synaptic weights based on the degree of coordinated firing among neurons. SSDP supports stable and scalable learning by encouraging neurons to form coherent activity patterns. One prominent outcome is a sudden transition from unstable to stable dynamics during training, suggesting that synchrony may drive convergence toward equilibrium firing regimes. We demonstrate SSDP's effectiveness across multiple network types, from minimal-layer models to spiking ResNets and an SNN-Transformer. To our knowledge, this is the first application of a synaptic plasticity mechanism in a spiking transformer. SSDP operates in a fully event-driven manner and incurs minimal computational cost, making it well-suited for neuromorphic deployment. By linking local synaptic modifications to the collective dynamics of the network, SSDP yields a learning strategy that adheres to biological principles while remaining practically efficient. These findings position SSDP as a general-purpose optimization strategy for SNNs, while offering new insights into population-based learning mechanisms in the brain.
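The abstract describes SSDP as scaling synaptic changes by the degree of coordinated firing across a neural population rather than by isolated spike pairs. The sketch below illustrates one way such a group-level rule could look; it is a minimal illustration under stated assumptions, not the paper's formulation. The function name ssdp_update, the synchrony measure (fraction of co-active postsynaptic neurons in a time step), the threshold, and the depression term are all hypothetical choices introduced here for clarity.

# Hypothetical sketch of a spike-synchrony-dependent plasticity (SSDP) update.
# Assumptions (not from the paper): synchrony is the fraction of postsynaptic
# neurons firing in the same time step, and the weight change scales a
# Hebbian-like pre/post coincidence term by that population synchrony.
import numpy as np

def ssdp_update(weights, pre_spikes, post_spikes, lr=1e-3, sync_threshold=0.2):
    """One event-driven SSDP-style step.

    weights     : (n_pre, n_post) synaptic weight matrix
    pre_spikes  : (n_pre,)  binary spike vector at the current time step
    post_spikes : (n_post,) binary spike vector at the current time step
    """
    # Population-level synchrony: fraction of postsynaptic neurons firing together.
    sync = post_spikes.mean()

    # Pairwise pre/post coincidences for this time step.
    coincidence = np.outer(pre_spikes, post_spikes)

    if sync >= sync_threshold:
        dw = lr * sync * coincidence        # group-level potentiation when firing is synchronous
    else:
        dw = -0.1 * lr * coincidence        # mild depression for asynchronous firing

    return weights + dw

# Usage: random spikes for a toy 100 -> 10 layer.
rng = np.random.default_rng(0)
W = rng.normal(0, 0.1, size=(100, 10))
pre = (rng.random(100) < 0.05).astype(float)
post = (rng.random(10) < 0.3).astype(float)
W = ssdp_update(W, pre, post)

Because the update only touches synapses whose pre- and postsynaptic neurons spiked in the current step, a rule of this shape stays event-driven and adds little computational overhead, consistent with the neuromorphic-deployment claim in the abstract.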
@article{tian2025_2505.14841,
  title   = {Beyond Pairwise Plasticity: Group-Level Spike Synchrony Facilitates Efficient Learning in Spiking Neural Networks},
  author  = {Yuchen Tian and Assel Kembay and Nhan Duy Truong and Jason K. Eshraghian and Omid Kavehei},
  journal = {arXiv preprint arXiv:2505.14841},
  year    = {2025}
}