
arXiv:2310.07838
Towards the Fundamental Limits of Knowledge Transfer over Finite Domains

11 October 2023
Qingyue Zhao
Banghua Zhu
Abstract

We characterize the statistical efficiency of knowledge transfer through $n$ samples from a teacher to a probabilistic student classifier with input space $\mathcal{S}$ over labels $\mathcal{A}$. We show that privileged information at three progressive levels accelerates the transfer. At the first level, only samples with hard labels are known, via which the maximum likelihood estimator attains the minimax rate $\sqrt{|\mathcal{S}||\mathcal{A}|/n}$. The second level additionally makes the teacher probabilities of the sampled labels available, which boosts the convergence rate lower bound to $|\mathcal{S}||\mathcal{A}|/n$. However, under this second data acquisition protocol, minimizing a naive adaptation of the cross-entropy loss results in an asymptotically biased student. We overcome this limitation and achieve the fundamental limit by using a novel empirical variant of the squared-error logit loss. The third level further equips the student with the soft labels (complete logits) on $\mathcal{A}$ for every sampled input, thereby provably enabling the student to enjoy a rate $|\mathcal{S}|/n$ free of $|\mathcal{A}|$. We find any Kullback-Leibler divergence minimizer to be optimal in the last case. Numerical simulations distinguish the four learners and corroborate our theory.
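The gap between the first and third acquisition levels can be illustrated with a toy simulation. This is a minimal sketch, not the paper's experiment: the single-input setup, the teacher distribution `p`, and the sample size `n` are all hypothetical choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: a single input, |A| = 5 labels, teacher distribution p.
p = np.array([0.40, 0.25, 0.15, 0.12, 0.08])
n = 2000

# Level 1: the student sees only n hard labels sampled from p;
# the maximum likelihood estimator is the empirical label frequency.
labels = rng.choice(len(p), size=n, p=p)
p_hat_hard = np.bincount(labels, minlength=len(p)) / n

# Level 3: the student receives the complete soft label (the vector p)
# for every sampled input, so any KL-divergence minimizer recovers p exactly.
p_hat_soft = p.copy()

# Total-variation-style estimation gaps for the two regimes.
err_hard = np.abs(p_hat_hard - p).sum()  # shrinks like 1/sqrt(n)
err_soft = np.abs(p_hat_soft - p).sum()  # zero up to floating point

print(f"hard-label error: {err_hard:.4f}, soft-label error: {err_soft:.4f}")
```

The hard-label error decays at the slow $1/\sqrt{n}$ rate of the level-one minimax bound, while full soft labels remove the sampling noise in the label direction entirely, consistent with the $|\mathcal{A}|$-free rate at the third level.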
