ResearchTrend.AI
© 2025 ResearchTrend.AI, All rights reserved.

How do neurons operate on sparse distributed representations? A mathematical theory of sparsity, neurons and active dendrites
Subutai Ahmad, J. Hawkins
arXiv:1601.00720 (v2, latest), 5 January 2016

Papers citing "How do neurons operate on sparse distributed representations? A mathematical theory of sparsity, neurons and active dendrites"

15 / 15 papers shown
  1. Enhancing Biologically Inspired Hierarchical Temporal Memory with Hardware-Accelerated Reflex Memory
     Pavia Bera, Sabrina Hassan Moon, Jennifer Adorno, Dayane Alfenas Reis, Sanjukta Bhanja
     01 Apr 2025
  2. Unsupervised Cognition
     Alfredo Ibias, Hector Antona, Guillem Ramirez-Miranda, Enric Guinovart, Eduard Alarcon
     27 Sep 2024 (SSL)
  3. From Manifestations to Cognitive Architectures: a Scalable Framework
     Alfredo Ibias, Guillem Ramirez-Miranda, Enric Guinovart, Eduard Alarcon
     14 Jun 2024 (GNN)
  4. Sparsity in Continuous-Depth Neural Networks
     H. Aliee, Till Richter, Mikhail Solonin, I. Ibarra, Fabian J. Theis, Niki Kilbertus
     26 Oct 2022
  5. Avoiding Catastrophe: Active Dendrites Enable Multi-Task Learning in Dynamic Environments
     A. Iyer, Karan Grewal, Akash Velu, Lucas O. Souza, Jérémy Forest, Subutai Ahmad
     31 Dec 2021 (AI4CE)
  6. Continual Learning for Recurrent Neural Networks: an Empirical Evaluation
     Andrea Cossu, Antonio Carta, Vincenzo Lomonaco, D. Bacciu
     12 Mar 2021 (CLL)
  7. The Neural Coding Framework for Learning Generative Models
     Alexander Ororbia, Daniel Kifer
     07 Dec 2020 (GAN)
  8. End-to-End Memristive HTM System for Pattern Recognition and Sequence Prediction
     Abdullah M. Zyarah, Kevin D. Gomez, Dhireesha Kudithipudi
     22 Jun 2020
  9. How Can We Be So Dense? The Benefits of Using Highly Sparse Representations
     Subutai Ahmad, Luiz Scheinkman
     27 Mar 2019
  10. Theory of Generative Deep Learning: Probe Landscape of Empirical Error via Norm Based Capacity Control
      Wendi Xu, Ming Zhang
      03 Oct 2018
  11. The observer-assisted method for adjusting hyper-parameters in deep learning algorithms
      Maciej Wielgosz
      30 Nov 2016
  12. Real-Time Anomaly Detection for Streaming Analytics
      Subutai Ahmad, S. Purdy
      08 Jul 2016 (AI4TS)
  13. Encoding Data for HTM Systems
      S. Purdy
      18 Feb 2016 (AI4CE)
  14. Continuous online sequence learning with an unsupervised neural network model
      Yuwei Cui, Subutai Ahmad, J. Hawkins
      17 Dec 2015 (CLL)
  15. Why Neurons Have Thousands of Synapses, A Theory of Sequence Memory in Neocortex
      J. Hawkins, Subutai Ahmad
      31 Oct 2015