ResearchTrend.AI

arXiv:1702.08389
Equivariance Through Parameter-Sharing

27 February 2017
Siamak Ravanbakhsh
J. Schneider
Barnabás Póczós
Abstract

We propose to study equivariance in deep neural networks through parameter symmetries. In particular, given a group 𝒢 that acts discretely on the input and output of a standard neural network layer φ_W: ℜ^M → ℜ^N, we show that φ_W is equivariant with respect to the 𝒢-action iff 𝒢 explains the symmetries of the network parameters W. Inspired by this observation, we then propose two parameter-sharing schemes to induce the desirable symmetry on W. Our procedures for tying the parameters achieve 𝒢-equivariance and, under some conditions on the action of 𝒢, they guarantee sensitivity to all other permutation groups outside 𝒢.
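The equivariance-through-parameter-sharing idea in the abstract can be illustrated with a minimal numpy sketch (this is an illustrative special case, not the paper's general construction): for the symmetric group acting by permutation on the input and output of a linear layer, tying the weights to the pattern W = a·I + b·𝟙𝟙ᵀ makes φ_W(x) = Wx permutation-equivariant. The scalars a and b below are arbitrary choices.

```python
import numpy as np

n = 5
a, b = 2.0, -0.5
# Parameter-shared weight matrix: one weight on the diagonal, one
# weight shared by all off-diagonal entries. This tying pattern is
# invariant under simultaneous row/column permutation.
W = a * np.eye(n) + b * np.ones((n, n))

rng = np.random.default_rng(0)
x = rng.standard_normal(n)
P = np.eye(n)[rng.permutation(n)]  # a random permutation matrix

# Equivariance check: acting with the group before or after the layer
# gives the same result, i.e. W (P x) == P (W x).
assert np.allclose(W @ (P @ x), P @ (W @ x))
```

The check succeeds because the sharing pattern commutes with every permutation matrix (PW = WP when W = a·I + b·𝟙𝟙ᵀ), which is the direction "symmetric parameters ⇒ equivariant layer" of the iff statement above.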
