ResearchTrend.AI
© 2025 ResearchTrend.AI, All rights reserved.

A Framework for Inference Inspired by Human Memory Mechanisms (arXiv:2310.09297)

1 October 2023
Xiangyu Zeng, Jie Lin, Piao Hu, Ruizheng Huang, Zhicheng Zhang

Papers citing "A Framework for Inference Inspired by Human Memory Mechanisms"

15 / 15 papers shown

A Brain-inspired Memory Transformation based Differentiable Neural Computer for Reasoning-based Question Answering
Yao Liang, H. Fang, Yi Zeng, Feifei Zhao · OOD · 07 Jan 2023

Recurrent Memory Transformer
Aydar Bulatov, Yuri Kuratov, Andrey Kravchenko · CLL · 14 Jul 2022

Perceiver: General Perception with Iterative Attention
Andrew Jaegle, Felix Gimeno, Andrew Brock, Andrew Zisserman, Oriol Vinyals, João Carreira · VLM, ViT, MDE · 04 Mar 2021

Hopfield Networks is All You Need
Hubert Ramsauer, Bernhard Schafl, Johannes Lehner, Philipp Seidl, Michael Widrich, ..., David P. Kreil, Michael K. Kopp, Günter Klambauer, Johannes Brandstetter, Sepp Hochreiter · 16 Jul 2020

Longformer: The Long-Document Transformer
Iz Beltagy, Matthew E. Peters, Arman Cohan · RALM, VLM · 10 Apr 2020

Efficient Content-Based Sparse Attention with Routing Transformers
Aurko Roy, M. Saffar, Ashish Vaswani, David Grangier · MoE · 12 Mar 2020

Generating Long Sequences with Sparse Transformers
R. Child, Scott Gray, Alec Radford, Ilya Sutskever · 23 Apr 2019

Transformer-XL: Attentive Language Models Beyond a Fixed-Length Context
Zihang Dai, Zhilin Yang, Yiming Yang, J. Carbonell, Quoc V. Le, Ruslan Salakhutdinov · VLM · 09 Jan 2019

A simple neural network module for relational reasoning
Adam Santoro, David Raposo, David Barrett, Mateusz Malinowski, Razvan Pascanu, Peter W. Battaglia, Timothy Lillicrap · GNN, NAI · 05 Jun 2017

Improving Neural Language Models with a Continuous Cache
Edouard Grave, Armand Joulin, Nicolas Usunier · KELM · 13 Dec 2016

Pointer Sentinel Mixture Models
Stephen Merity, Caiming Xiong, James Bradbury, R. Socher · RALM · 26 Sep 2016

Inferring Algorithmic Patterns with Stack-Augmented Recurrent Nets
Armand Joulin, Tomas Mikolov · TPM · 03 Mar 2015

Towards AI-Complete Question Answering: A Set of Prerequisite Toy Tasks
Jason Weston, Antoine Bordes, S. Chopra, Alexander M. Rush, Bart van Merriënboer, Armand Joulin, Tomas Mikolov · LRM, ELM · 19 Feb 2015

Neural Turing Machines
Alex Graves, Greg Wayne, Ivo Danihelka · 20 Oct 2014

Memory Networks
Jason Weston, S. Chopra, Antoine Bordes · GNN, KELM · 15 Oct 2014