Papers › 2211.01950 › Cited By

Unlocking the potential of two-point cells for energy-efficient and resilient training of deep nets

24 October 2022
Ahsan Adeel, A. Adetomi, K. Ahmed, Amir Hussain, T. Arslan, William A. Phillips
arXiv: 2211.01950 [abs | PDF | HTML]

Papers citing "Unlocking the potential of two-point cells for energy-efficient and resilient training of deep nets"

20 / 20 papers shown
Context-sensitive neocortical neurons transform the effectiveness and efficiency of neural information processing
Ahsan Adeel, Mario Franco, Mohsin Raza, K. Ahmed
15 Jul 2022
Two Sparsities Are Better Than One: Unlocking the Performance Benefits of Sparse-Sparse Networks
Kevin Lee Hunter, Lawrence Spracklen, Subutai Ahmad
27 Dec 2021
Attentive Cross-modal Connections for Deep Multimodal Wearable-based Emotion Recognition
Anubha Bhatti, Behnam Behinaein, D. Rodenburg, Paul Hungler, Ali Etemad
04 Aug 2021
Sparsity in Deep Learning: Pruning and growth for efficient inference and training in neural networks
Torsten Hoefler, Dan Alistarh, Tal Ben-Nun, Nikoli Dryden, Alexandra Peste
31 Jan 2021
The Computational Limits of Deep Learning
Neil C. Thompson, Kristjan Greenewald, Keeheon Lee, Gabriel F. Manso
10 Jul 2020
Sparse GPU Kernels for Deep Learning
Trevor Gale, Matei A. Zaharia, C. Young, Erich Elsen
18 Jun 2020
Edge AI: On-Demand Accelerating Deep Neural Network Inference via Edge Computing
En Li, Liekang Zeng, Zhi Zhou, Xu Chen
04 Oct 2019
Energy and Policy Considerations for Deep Learning in NLP
Emma Strubell, Ananya Ganesh, Andrew McCallum
05 Jun 2019
How Can We Be So Dense? The Benefits of Using Highly Sparse Representations
Subutai Ahmad, Luiz Scheinkman
27 Mar 2019
Dendritic cortical microcircuits approximate the backpropagation algorithm
João Sacramento, Rui Ponte Costa, Yoshua Bengio, Walter Senn
26 Oct 2018
Lip-Reading Driven Deep Learning Approach for Speech Enhancement
Ahsan Adeel, M. Gogate, Amir Hussain, W. Whitmer
31 Jul 2018
The Sparse Manifold Transform
Yubei Chen, Dylan M. Paiton, Bruno A. Olshausen
23 Jun 2018
Large-Scale Neuromorphic Spiking Array Processors: A quest to mimic the brain
Chetan Singh Thakur, J. Molin, Gert Cauwenberghs, Giacomo Indiveri, Kundan Kumar, ..., Jae-sun Seo, Shimeng Yu, Yu Cao, André van Schaik, R. Etienne-Cummings
23 May 2018
MINE: Mutual Information Neural Estimation
Mohamed Ishmael Belghazi, A. Baratin, Sai Rajeswar, Sherjil Ozair, Yoshua Bengio, Aaron Courville, R. Devon Hjelm
12 Jan 2018
XFlow: Cross-modal Deep Neural Networks for Audiovisual Classification
Cătălina Cangea, Petar Velickovic, Pietro Lio
02 Sep 2017
A scalable multi-core architecture with heterogeneous memory structures for Dynamic Neuromorphic Asynchronous Processors (DYNAPs)
S. Moradi, Ning Qiao, F. Stefanini, Giacomo Indiveri
14 Aug 2017
Scalable Training of Artificial Neural Networks with Adaptive Sparse Connectivity inspired by Network Science
Decebal Constantin Mocanu, Elena Mocanu, Peter Stone, Phuong H. Nguyen, M. Gibescu, A. Liotta
15 Jul 2017
Neuromorphic Hardware In The Loop: Training a Deep Spiking Network on the BrainScaleS Wafer-Scale System
Sebastian Schmitt, Johann Klaehn, G. Bellec, Andreas Grübl, Maurice Guettler, ..., Robert Legenstein, Wolfgang Maass, Christian Mayr, Johannes Schemmel, K. Meier
06 Mar 2017
The Power of Sparsity in Convolutional Neural Networks
Soravit Changpinyo, Mark Sandler, A. Zhmoginov
21 Feb 2017
A neuromorphic hardware architecture using the Neural Engineering Framework for pattern recognition
Runchun Wang, Chetan Singh Thakur, T. Hamilton, J. Tapson, Andre van Schaik
21 Jul 2015