arXiv:1910.04972v6 (latest)

On-chip Few-shot Learning with Surrogate Gradient Descent on a Neuromorphic Processor

11 October 2019
Kenneth Stewart
Emre Neftci
S. Shrestha
Communities: BDL
Abstract

Recent work suggests that synaptic plasticity dynamics in biological models of neurons and neuromorphic hardware are compatible with gradient-based learning (Neftci et al., 2019). Gradient-based learning requires iterating several times over a dataset, which is both time-consuming and constrains the training samples to be independently and identically distributed. This is incompatible with learning systems that do not have boundaries between training and inference, such as neuromorphic hardware. One approach to overcoming these constraints is transfer learning, where part of the network is pre-trained and mapped into hardware and the remaining part is trained online. Transfer learning has the advantage that training can be accelerated offline if the task domain is known, and that a few samples of each class are sufficient for learning with reasonable accuracy. Here, we demonstrate online surrogate-gradient few-shot learning on the Loihi neuromorphic processor using features pre-trained with spike-based gradient backpropagation-through-time. Our experimental results show that the Loihi chip can learn gestures online from a small number of shots and achieve accuracy comparable to that of the same models simulated on a conventional computer.
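The abstract combines two ingredients that are easier to see in code: a surrogate derivative that stands in for the non-differentiable spike threshold during backpropagation, and a transfer-learning split in which pre-trained feature weights are frozen while only the readout layer keeps learning on chip. The PyTorch sketch below illustrates both under stated assumptions; it is not the authors' implementation, and the fast-sigmoid surrogate, the zero threshold, the layer sizes, the membrane decay, and the dummy support set are all illustrative choices.

```python
import torch
import torch.nn as nn


class SurrogateSpike(torch.autograd.Function):
    """Heaviside spike in the forward pass, fast-sigmoid surrogate
    derivative in the backward pass. The zero threshold and the
    slope value are illustrative assumptions, not the paper's."""

    @staticmethod
    def forward(ctx, v):
        ctx.save_for_backward(v)
        return (v > 0.0).float()            # non-differentiable spike

    @staticmethod
    def backward(ctx, grad_output):
        (v,) = ctx.saved_tensors
        slope = 10.0                        # assumed surrogate sharpness
        return grad_output / (1.0 + slope * v.abs()) ** 2


spike_fn = SurrogateSpike.apply


class FewShotSNN(nn.Module):
    """Frozen pre-trained spiking feature layer plus a trainable
    spiking readout, mirroring the abstract's transfer-learning
    split. Layer sizes and the membrane decay are hypothetical."""

    def __init__(self, n_in=128, n_hidden=256, n_classes=11):
        super().__init__()
        self.features = nn.Linear(n_in, n_hidden, bias=False)
        self.features.requires_grad_(False)   # pre-trained offline, frozen on chip
        self.readout = nn.Linear(n_hidden, n_classes)  # learned online
        self.beta = 0.9                        # LIF membrane decay (assumed)

    def forward(self, spikes):
        # spikes: (time_steps, batch, n_in) binary input spike trains
        v_h = torch.zeros(spikes.shape[1], self.readout.in_features)
        v_o = torch.zeros(spikes.shape[1], self.readout.out_features)
        counts = torch.zeros_like(v_o)
        for x_t in spikes:                     # unroll over time
            v_h = self.beta * v_h + self.features(x_t)
            s_h = spike_fn(v_h)                # hidden-layer spikes
            v_h = v_h - s_h                    # soft reset after a spike
            v_o = self.beta * v_o + self.readout(s_h)
            s_o = spike_fn(v_o)                # surrogate gradient flows here
            v_o = v_o - s_o
            counts = counts + s_o              # output spike counts as class scores
        return counts
```

A few-shot update then touches only the readout parameters; the frozen feature weights play the role of the portion mapped into hardware. The `support_set` below is a dummy stand-in for a handful of labelled gesture spike trains:

```python
model = FewShotSNN()
opt = torch.optim.SGD(model.readout.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

# dummy few-shot support set: 5 random spike trains per class
support_set = [
    (torch.randint(0, 2, (50, 1, 128)).float(), torch.tensor([c % 11]))
    for c in range(55)
]

for spikes, label in support_set:
    opt.zero_grad()
    loss = loss_fn(model(spikes), label)
    loss.backward()    # gradients pass through the output spikes
    opt.step()         # only the readout weights change
```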
