
A Low Complexity Decentralized Neural Net with Centralized Equivalence using Layer-wise Learning

29 September 2020 · arXiv: 2009.13982

Xinyue Liang, Alireza M. Javid, Mikael Skoglund, S. Chatterjee

Topic: FedML

Papers citing "A Low Complexity Decentralized Neural Net with Centralized Equivalence using Layer-wise Learning"

2 papers shown.

A ReLU Dense Layer to Improve the Performance of Neural Networks
  Alireza M. Javid, Sandipan Das, Mikael Skoglund, S. Chatterjee
  22 Oct 2020

Large scale distributed neural network training through online distillation
  Rohan Anil, Gabriel Pereyra, Alexandre Passos, Róbert Ormándi, George E. Dahl, Geoffrey E. Hinton
  Topic: FedML · 09 Apr 2018