
On the Universality of Invariant Networks

Abstract

Constraining linear layers in neural networks to respect symmetry transformations from a group $G$ is a common design principle for invariant networks that has found many applications in machine learning. In this paper, we consider a fundamental question that has received little attention to date: Can these networks approximate any (continuous) invariant function? We tackle the rather general case where $G \leq S_n$ (an arbitrary subgroup of the symmetric group) acts on $\mathbb{R}^n$ by permuting coordinates. This setting includes several recent popular invariant networks. We present two main results: First, $G$-invariant networks are universal if high-order tensors are allowed. Second, there are groups $G$ for which high-order tensors are unavoidable for obtaining universality. $G$-invariant networks consisting of only first-order tensors are of special interest due to their practical value. We conclude the paper by proving a necessary condition for the universality of $G$-invariant networks that incorporate only first-order tensors.
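To make the setting concrete, the sketch below illustrates the special case $G = S_n$ (the full symmetric group) with first-order tensors only. It uses the well-known two-parameter form of a linear layer that commutes with all coordinate permutations, $L(x) = \lambda x + \gamma (\mathbf{1}^\top x)\mathbf{1}$, followed by invariant sum pooling; the function names and parameter values are illustrative, not taken from the paper.

```python
import numpy as np

# Sketch (names are ours, not the paper's): a minimal S_n-invariant
# network built from first-order tensors only.

def equivariant_linear(x, lam=0.5, gam=0.3):
    """General form of a linear map on R^n commuting with all permutations:
    L(x) = lam * x + gam * sum(x) * ones(n)."""
    return lam * x + gam * x.sum() * np.ones_like(x)

def invariant_net(x):
    """Equivariant layer -> pointwise nonlinearity -> invariant sum pooling."""
    h = np.tanh(equivariant_linear(x))
    return h.sum()

rng = np.random.default_rng(0)
x = rng.normal(size=6)
perm = rng.permutation(6)

# Invariance check: permuting the input coordinates leaves the output unchanged.
assert np.isclose(invariant_net(x[perm]), invariant_net(x))
```

The paper's negative result concerns exactly this kind of architecture: for some subgroups $G \leq S_n$, no network of this first-order form can approximate every continuous $G$-invariant function, whereas allowing high-order tensor features restores universality.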
