
GroSS: Group-Size Series Decomposition for Grouped Architecture Search

2 December 2019
Henry Howard-Jenkins
Yiwen Li
V. Prisacariu
arXiv: 1912.00673
Abstract

We present a novel approach for exploring the configuration of grouped convolutions within neural networks. Group-size Series (GroSS) decomposition is a mathematical formulation of tensor factorisation into a series of approximation terms of increasing rank. GroSS allows for dynamic and differentiable selection of the factorisation rank, which is analogous to a grouped convolution. Therefore, to the best of our knowledge, GroSS is the first method to enable simultaneous training of differing numbers of groups within a single layer, as well as of all possible combinations of group sizes between layers. In doing so, GroSS is able to train an entire grouped-convolution architecture search space concurrently. We demonstrate this through architecture searches with performance objectives and evaluate GroSS against conventional Block Term Decomposition. GroSS enables more effective and efficient search for grouped convolutional architectures.
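The abstract does not spell out the decomposition itself, so the sketch below is only a generic illustration of what a "differentiable selection" over group sizes could look like: one grouped-convolution branch per candidate group count, blended with softmax-weighted architecture parameters. The class name `DifferentiableGroupedConv` and the candidate group counts are hypothetical and not taken from the paper, and unlike GroSS the branches here do not share a factorised weight tensor.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class DifferentiableGroupedConv(nn.Module):
    """Illustrative sketch (not the GroSS formulation): mixes grouped
    convolutions with different group counts via softmax-weighted
    architecture parameters, so the effective group size is selected
    differentiably during training."""

    def __init__(self, in_ch, out_ch, kernel_size=3, group_options=(1, 2, 4)):
        super().__init__()
        # One grouped convolution per candidate group count.
        # Each candidate group count must divide both in_ch and out_ch.
        self.branches = nn.ModuleList([
            nn.Conv2d(in_ch, out_ch, kernel_size,
                      padding=kernel_size // 2, groups=g)
            for g in group_options
        ])
        # Architecture logits, one per candidate group count.
        self.alpha = nn.Parameter(torch.zeros(len(group_options)))

    def forward(self, x):
        # Softmax over candidates yields a differentiable "selection";
        # the output is a weighted sum of all candidate branches.
        weights = F.softmax(self.alpha, dim=0)
        return sum(w * branch(x) for w, branch in zip(weights, self.branches))


# Usage: the group-size logits (alpha) and the convolution weights
# are trained together with ordinary backpropagation.
layer = DifferentiableGroupedConv(16, 32, group_options=(1, 2, 4))
y = layer(torch.randn(8, 16, 28, 28))  # -> shape (8, 32, 28, 28)
```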
