
Finding the Optimal Dynamic Treatment Regime Using Smooth Fisher Consistent Surrogate Loss

3 November 2021
Nilanjana Laha, Aaron Sonabend-W, Rajarshi Mukherjee, Tianxi Cai

Papers citing "Finding the Optimal Dynamic Treatment Regime Using Smooth Fisher Consistent Surrogate Loss"

3 / 3 papers shown

Stage-Aware Learning for Dynamic Treatments
Han Ye, Wenzhuo Zhou, Ruoqing Zhu, Annie Qu
30 Oct 2023

Global Convergence and Stability of Stochastic Gradient Descent
V. Patel, Shushu Zhang, Bowen Tian
04 Oct 2021

Linear Convergence of Gradient and Proximal-Gradient Methods Under the Polyak-Łojasiewicz Condition
Hamed Karimi, J. Nutini, Mark Schmidt
16 Aug 2016