arXiv:2303.04614

Densely Connected G-invariant Deep Neural Networks with Signed Permutation Representations

8 March 2023
Devanshu Agrawal
James Ostrowski
Abstract

We introduce and investigate, for finite groups G, G-invariant deep neural network (G-DNN) architectures with ReLU activation that are densely connected, i.e., include all possible skip connections. In contrast to other G-invariant architectures in the literature, the preactivations of the G-DNNs presented here are able to transform by signed permutation representations (signed perm-reps) of G. Moreover, the individual layers of the G-DNNs are not required to be G-equivariant; instead, the preactivations are constrained to be G-equivariant functions of the network input in a way that couples weights across all layers. The result is a richer family of G-invariant architectures never seen previously. We derive an efficient implementation of G-DNNs after a reparameterization of weights, as well as necessary and sufficient conditions for an architecture to be "admissible", i.e., nondegenerate and inequivalent to smaller architectures. We include code that allows a user to build a G-DNN interactively layer-by-layer, with the final architecture guaranteed to be admissible. We show that there are far more admissible G-DNN architectures than those accessible with the "concatenated ReLU" activation function from the literature. Finally, we apply G-DNNs to two example problems: (1) multiplication in {-1, 1} (with theoretical guarantees) and (2) 3D object classification, finding that the inclusion of signed perm-reps significantly boosts predictive performance compared to baselines with only ordinary (i.e., unsigned) perm-reps.
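
As a rough illustration of the kind of constraint involved (a minimal sketch, not the paper's G-DNN construction; the toy group, representation, and function names below are chosen purely for demonstration), the following NumPy snippet symmetrizes the weights of a single linear layer over a two-element group acting by a signed permutation, and checks that the resulting map is G-equivariant:

import numpy as np

# Minimal sketch only; not the paper's G-DNN construction.
# Toy group G = {e, g} with g of order 2, acting on R^2 by a signed
# permutation (swap the two coordinates and flip both signs).
rho = {
    "e": np.eye(2),
    "g": np.array([[0.0, -1.0], [-1.0, 0.0]]),  # signed permutation matrix; squares to I
}

def symmetrize(W, rho):
    # Average W over G so that W @ rho[h] == rho[h] @ W for every h in G
    # (here the same signed perm-rep is used on the input and output sides).
    return sum(rho[h].T @ W @ rho[h] for h in rho) / len(rho)

rng = np.random.default_rng(0)
W = symmetrize(rng.standard_normal((2, 2)), rho)

# Equivariance check: applying g before or after the linear map agrees.
x = rng.standard_normal(2)
for h in rho:
    assert np.allclose(W @ (rho[h] @ x), rho[h] @ (W @ x))
print("Symmetrized weight matrix:\n", W)

The construction described in the abstract is richer than this per-layer symmetrization: preactivations may transform by different signed perm-reps, the individual layers themselves need not be G-equivariant, and the dense skip connections couple the weight constraints across all layers.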
