arXiv: 1606.02206

A Minimax Approach to Supervised Learning

Neural Information Processing Systems (NeurIPS), 2016
7 June 2016
Farzan Farnia
David Tse
Abstract

Given a task of predicting Y from X, a loss function L, and a set of probability distributions Γ on (X, Y), what is the optimal decision rule minimizing the worst-case expected loss over Γ? In this paper, we address this question by introducing a generalization of the principle of maximum entropy. Applying this principle to sets of distributions whose marginal on X is constrained to be the empirical marginal from the data, we develop a general minimax approach for supervised learning problems which reduces to the maximum likelihood problem over generalized linear models. Through this framework, we develop two classification algorithms called the minimax SVM and the minimax Brier classifier. The minimax SVM, which is a relaxed version of the standard SVM, minimizes the worst-case 0-1 loss over the structured set of distributions and, in our numerical experiments, can outperform the SVM. The minimax Brier classifier utilizes the Huber penalty function for robust classification. We also explore the application of the developed framework to robust feature selection.
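The question posed at the start of the abstract can be illustrated on a toy scale. The sketch below (not the paper's algorithm, and with a made-up two-distribution set Γ) enumerates deterministic decision rules for binary X and Y under the 0-1 loss and picks the rule whose worst-case expected loss over Γ is smallest:

```python
import itertools

# Toy illustration of the minimax decision-rule question from the abstract:
# given a small (hypothetical) set Gamma of joint distributions on (X, Y)
# and the 0-1 loss, find the deterministic rule psi: X -> Y minimizing
# max_{P in Gamma} E_P[ 1{psi(X) != Y} ].

X_VALS = [0, 1]
Y_VALS = [0, 1]

# Gamma: two hypothetical joint distributions P(x, y), each summing to 1.
GAMMA = [
    {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4},
    {(0, 0): 0.1, (0, 1): 0.4, (1, 0): 0.3, (1, 1): 0.2},
]

def expected_01_loss(rule, P):
    """Expected 0-1 loss of a decision rule (a dict x -> y) under joint P."""
    return sum(p for (x, y), p in P.items() if rule[x] != y)

best_rule, best_worst_case = None, float("inf")
for labels in itertools.product(Y_VALS, repeat=len(X_VALS)):
    rule = dict(zip(X_VALS, labels))
    worst_case = max(expected_01_loss(rule, P) for P in GAMMA)
    if worst_case < best_worst_case:
        best_rule, best_worst_case = rule, worst_case

print(best_rule, best_worst_case)  # minimax-optimal rule and its worst-case loss
```

Brute-force enumeration only works for tiny discrete toys; the point of the paper's framework is that, for the structured sets Γ it considers, this minimax problem reduces to a tractable maximum likelihood problem rather than an exhaustive search.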
