
Tree Stack Memory Units

Abstract

Generalization to harder compositional problem instances (also known as extrapolation) is challenging for standard neural networks. In contrast, recursive neural networks have the potential to extrapolate because they can capture the compositionality of tree-structured data such as mathematical equations. However, recursive networks are prone to error propagation along deep trees and cannot capture long-range dependencies effectively. To overcome this, we propose Tree Stack Memory Units (Tree-SMUs), a novel memory-augmented recursive neural network whose nodes each contain a differentiable stack. Each SMU cell learns to read from its stack and to write to it by combining the stacks and states of its children through gating. This architecture improves both the local and global representation of compositional data: the stack increases expressive power and gives each node indirect access to its descendants, allowing long-range dependencies to be captured. We demonstrate strong empirical results on two tasks and show that Tree-SMU enables accurate extrapolation to significantly harder problem instances.
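
As a rough illustration of the mechanism sketched in the abstract, below is a minimal PyTorch sketch of one SMU cell for binary trees. This is not the paper's implementation: the fixed stack depth, the gate and composition layers (child_gate, compose), and the read-from-top convention are all illustrative assumptions.

```python
import torch
import torch.nn as nn

class TreeSMUCell(nn.Module):
    """Minimal sketch of a Tree-SMU node (hypothetical layer names;
    assumes binary trees, hidden size d, fixed stack depth K)."""

    def __init__(self, d, stack_depth=8):
        super().__init__()
        self.d, self.K = d, stack_depth
        # Gate deciding how much of each child's stack to keep (assumption).
        self.child_gate = nn.Linear(2 * d, 2)
        # Transformation producing the new state to push onto the stack.
        self.compose = nn.Linear(2 * d, d)

    def forward(self, left, right):
        # Each child is a pair (h, S): state h of shape (d,), stack S of shape (K, d).
        (h_l, S_l), (h_r, S_r) = left, right
        # Write path: softly merge the children's stacks through gating.
        g = torch.softmax(self.child_gate(torch.cat([h_l, h_r])), dim=-1)
        S = g[0] * S_l + g[1] * S_r              # merged stack, (K, d)
        # Read path: the stack top gives indirect access to descendants.
        read = S[0]
        # Compose the children's states with the read vector into the node state.
        h = torch.tanh(self.compose(torch.cat([h_l, h_r])) + read)
        # Push the new state, shifting older entries down one slot.
        S = torch.cat([h.unsqueeze(0), S[:-1]], dim=0)
        return h, S

# Usage: leaves start with zero (empty) stacks.
d, K = 16, 8
cell = TreeSMUCell(d, K)
leaf = lambda: (torch.randn(d), torch.zeros(K, d))
h, S = cell(leaf(), leaf())
```

Under this reading, deeper descendants remain reachable through the stack rather than only through the recursively composed state, which is one plausible way the architecture could mitigate error propagation along deep trees.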
