On the Theory of Transfer Learning: The Importance of Task Diversity

20 June 2020
Nilesh Tripuraneni, Michael I. Jordan, Chi Jin
arXiv:2006.11650
Abstract

We provide new statistical guarantees for transfer learning via representation learning--when transfer is achieved by learning a feature representation shared across different tasks. This enables learning on new tasks using far less data than is required to learn them in isolation. Formally, we consider $t+1$ tasks parameterized by functions of the form $f_j \circ h$ in a general function class $\mathcal{F} \circ \mathcal{H}$, where each $f_j$ is a task-specific function in $\mathcal{F}$ and $h$ is the shared representation in $\mathcal{H}$. Letting $C(\cdot)$ denote the complexity measure of the function class, we show that for diverse training tasks (1) the sample complexity needed to learn the shared representation across the first $t$ training tasks scales as $C(\mathcal{H}) + t\,C(\mathcal{F})$, despite no explicit access to a signal from the feature representation, and (2) with an accurate estimate of the representation, the sample complexity needed to learn a new task scales only with $C(\mathcal{F})$. Our results depend upon a new general notion of task diversity--applicable to models with general tasks, features, and losses--as well as a novel chain rule for Gaussian complexities. Finally, we exhibit the utility of our general framework in several models of importance in the literature.
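To make the two-phase setup in the abstract concrete, here is a minimal sketch, not from the paper or its code: $t$ linear-regression training tasks share a representation $h \in \mathcal{H}$ while each task keeps its own head $f_j \in \mathcal{F}$; the representation is learned jointly across the training tasks, then frozen, and only a new head is fit on the $(t+1)$-th task from a small sample. All dimensions, the linear choices of $\mathcal{F}$ and $\mathcal{H}$, and the synthetic data are illustrative assumptions.

```python
# Sketch only: illustrates the f_j ∘ h structure under assumed linear classes and toy data.
import torch
import torch.nn as nn

torch.manual_seed(0)
d, r, t = 20, 5, 10          # input dim, representation dim, number of training tasks
n_per_task, n_new = 50, 15   # samples per training task, samples for the new task

# Ground-truth shared representation and diverse task-specific heads (assumed data model).
B_true = torch.randn(r, d) / d**0.5
w_true = [torch.randn(r) for _ in range(t + 1)]

def make_task(j, n):
    X = torch.randn(n, d)
    y = (X @ B_true.T) @ w_true[j] + 0.1 * torch.randn(n)
    return X, y

train_tasks = [make_task(j, n_per_task) for j in range(t)]

# Phase 1: jointly learn the shared representation h and the t heads f_1, ..., f_t.
h = nn.Linear(d, r, bias=False)                                       # h in H
heads = nn.ModuleList(nn.Linear(r, 1, bias=False) for _ in range(t))  # f_j in F
opt = torch.optim.Adam(list(h.parameters()) + list(heads.parameters()), lr=1e-2)
for _ in range(2000):
    opt.zero_grad()
    loss = sum(((heads[j](h(X)).squeeze(-1) - y) ** 2).mean()
               for j, (X, y) in enumerate(train_tasks))
    loss.backward()
    opt.step()

# Phase 2: freeze h and fit only a new head f_{t+1} on the new task from few samples,
# reflecting the claim that the new-task sample cost scales with C(F) alone.
X_new, y_new = make_task(t, n_new)
new_head = nn.Linear(r, 1, bias=False)
opt2 = torch.optim.Adam(new_head.parameters(), lr=1e-2)
with torch.no_grad():
    Z_new = h(X_new)                                 # representation held fixed
for _ in range(2000):
    opt2.zero_grad()
    loss = ((new_head(Z_new).squeeze(-1) - y_new) ** 2).mean()
    loss.backward()
    opt2.step()
print("new-task training MSE:", loss.item())
```

In this toy setting, phase 1 spreads the cost of estimating $h$ across all $t$ tasks, which is the informal content of the $C(\mathcal{H}) + t\,C(\mathcal{F})$ bound; the diversity of the random heads is what lets the shared representation be identified at all.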
