
A neuromorphic boost to RNNs using low-pass filters

Abstract

The increasing difficulty of Moore's law scaling and the remarkable success of machine learning have triggered a renaissance in the study of low-latency, energy-efficient accelerators for machine learning applications. In particular, spiking neural networks (SNNs) and their neuromorphic hardware implementations have started to receive substantial attention from academia and industry. However, SNNs perform relatively poorly compared to their rate-based counterparts in terms of accuracy on pattern recognition tasks. In this paper, we show that the addition of a low-pass filtering term to a recurrent neural network enables mapping to neuromorphic SNN devices. This breakthrough will allow the implementation of much more sophisticated SNN models on neuromorphic hardware than the current state of the art. The use of low-power neuromorphic platforms will, in turn, enable the construction of compact devices that can perform always-on processing in ultra-low-power edge computing applications. We further show that the low-pass filtered RNN cell matches or outperforms its unfiltered variant in a range of tasks. Finally, we argue that the low-pass filter acts as a temporal regularizer, allowing RNNs to learn and generalise better than their unfiltered variants.
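To make the core idea concrete, the following is a minimal sketch of what a low-pass filtered RNN update could look like. It assumes the simplest formulation consistent with the abstract: the hidden state is a convex mix of the previous state and a standard tanh RNN update, controlled by a filter coefficient `alpha`. All names, dimensions, and the choice of `alpha` are illustrative assumptions, not the paper's actual parameterisation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions and weights, for illustration only
n_in, n_hid = 3, 4
W = rng.standard_normal((n_hid, n_in)) * 0.1   # input weights
U = rng.standard_normal((n_hid, n_hid)) * 0.1  # recurrent weights
b = np.zeros(n_hid)                            # bias
alpha = 0.2  # assumed low-pass coefficient (roughly dt / tau)

def lpf_rnn_step(h, x):
    """One step of a low-pass filtered RNN cell: the new state
    is a convex combination of the previous state and the
    standard tanh RNN update, i.e. a first-order low-pass
    filter applied to the hidden dynamics."""
    update = np.tanh(W @ x + U @ h + b)
    return (1.0 - alpha) * h + alpha * update

# Run a short random input sequence through the cell
h = np.zeros(n_hid)
for t in range(5):
    h = lpf_rnn_step(h, rng.standard_normal(n_in))
```

Setting `alpha = 1` recovers a vanilla tanh RNN step; smaller values smooth the state trajectory over time, which is the temporal-regularisation effect the abstract refers to.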
