Noether: The More Things Change, the More Stay the Same

12 April 2021 (arXiv:2104.05508)
Grzegorz Gluch, R. Urbanke

Papers citing "Noether: The More Things Change, the More Stay the Same"

7 papers:
• TeleSparse: Practical Privacy-Preserving Verification of Deep Neural Networks
  Mohammad Maheri, Hamed Haddadi, Alex Davidson (27 Apr 2025)
• Symmetries, flat minima, and the conserved quantities of gradient flow
  Bo Zhao, I. Ganev, Robin Walters, Rose Yu, Nima Dehmamy (31 Oct 2022)
• Symmetry Teleportation for Accelerated Optimization
  B. Zhao, Nima Dehmamy, Robin Walters, Rose Yu (21 May 2022)
• Noether Networks: Meta-Learning Useful Conserved Quantities
  Ferran Alet, Dylan D. Doblar, Allan Zhou, J. Tenenbaum, Kenji Kawaguchi, Chelsea Finn (06 Dec 2021)
• Global optimality conditions for deep neural networks
  Chulhee Yun, S. Sra, Ali Jadbabaie (08 Jul 2017)
• A Differential Equation for Modeling Nesterov's Accelerated Gradient Method: Theory and Insights
  Weijie Su, Stephen P. Boyd, Emmanuel J. Candes (04 Mar 2015)
• Norm-Based Capacity Control in Neural Networks
  Behnam Neyshabur, Ryota Tomioka, Nathan Srebro (27 Feb 2015)