Benefits of Jointly Training Autoencoders: An Improved Neural Tangent Kernel Analysis
27 November 2019 · arXiv:1911.11983
Thanh Van Nguyen, Raymond K. W. Wong, Chinmay Hegde

Papers citing "Benefits of Jointly Training Autoencoders: An Improved Neural Tangent Kernel Analysis" (4 papers)
Deep Symmetric Autoencoders from the Eckart-Young-Schmidt Perspective
Simone Brivio, Nicola Rares Franco (13 Jun 2025)
High-dimensional Asymptotics of Denoising Autoencoders
Hugo Cui, Lenka Zdeborová (18 May 2023)
Fundamental Limits of Two-layer Autoencoders, and Achieving Them with Gradient Methods
Aleksandr Shevchenko, Kevin Kögler, Hamed Hassani, Marco Mondelli (27 Dec 2022)
Toward Understanding the Feature Learning Process of Self-supervised Contrastive Learning
Zixin Wen, Yuanzhi Li (31 May 2021)