Indirect Active Learning

3 June 2022
Shashank Singh
arXiv:2206.01454
Abstract

Traditional models of active learning assume a learner can directly manipulate or query a covariate X in order to study its relationship with a response Y. However, if X is a feature of a complex system, it may be possible only to indirectly influence X by manipulating a control variable Z, a scenario we refer to as Indirect Active Learning. Under a nonparametric model of Indirect Active Learning with a fixed budget, we study minimax convergence rates for estimating the relationship between X and Y locally at a point, obtaining different rates depending on the complexities and noise levels of the relationships between Z and X and between X and Y. We also identify minimax rates for passive learning under comparable assumptions. In many cases, our results show that, while there is an asymptotic benefit to active learning, this benefit is fully realized by a simple two-stage learner that runs two passive experiments in sequence.
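
The two-stage idea from the abstract can be illustrated with a small simulation. The sketch below is not the paper's estimator: the Z-to-X and X-to-Y models, the budget split, and the kernel bandwidth are all illustrative assumptions. It spends half of a fixed budget passively sampling the control Z, uses those observations to find Z values that push X near the target point, and then spends the remaining budget estimating E[Y | X = x0] with a local average.

```python
# Minimal sketch (not the paper's algorithm) of the Indirect Active Learning
# setup: the learner cannot set the covariate X directly, only the control Z,
# and wants to estimate E[Y | X = x0] locally at a target point x0.
# All models, budgets, and bandwidths below are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

def draw_x(z):
    """Unknown system response: X depends on Z plus noise (assumed model)."""
    return np.sin(z) + 0.1 * rng.standard_normal(np.shape(z))

def draw_y(x):
    """Unknown regression function: Y depends on X plus noise (assumed model)."""
    return x ** 2 + 0.1 * rng.standard_normal(np.shape(x))

def two_stage_learner(x0, budget=200, bandwidth=0.1):
    """Two passive experiments run in sequence.

    Stage 1: spend half the budget sampling Z uniformly to learn which
             Z values tend to drive X near the target x0.
    Stage 2: spend the rest of the budget at those Z values and estimate
             E[Y | X = x0] by a kernel-weighted average of the observed Y.
    """
    n1 = budget // 2
    z1 = rng.uniform(-np.pi, np.pi, size=n1)
    x1 = draw_x(z1)

    # Keep the Z values whose observed X landed closest to x0.
    best = z1[np.argsort(np.abs(x1 - x0))[: max(5, n1 // 10)]]

    n2 = budget - n1
    z2 = rng.choice(best, size=n2) + 0.05 * rng.standard_normal(n2)
    x2 = draw_x(z2)
    y2 = draw_y(x2)

    # Nadaraya-Watson style local average around x0.
    w = np.exp(-0.5 * ((x2 - x0) / bandwidth) ** 2)
    return np.sum(w * y2) / np.sum(w)

x0 = 0.5
print("estimate of E[Y | X = 0.5]:", two_stage_learner(x0))
print("true value under the assumed model:", x0 ** 2)
```

Both stages are passive experiments (the design points within each stage are fixed before their outcomes are seen), which mirrors the abstract's point that, in many regimes, such a sequential pair of passive experiments already realizes the asymptotic benefit of active learning.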
