
Learning to Optimize with Hidden Constraints

Abstract

We consider a data-driven framework for learning to generate decisions for instances of continuous optimization problems in which the feasible set varies with an instance-specific auxiliary input. Using a dataset of inputs and feasible solutions, together with a feasibility oracle, we iteratively train two machine learning models. The first is a binary classifier for feasibility, which then serves as a barrier function for training the second model via an interior point method. We develop theory and optimality guarantees for interior point methods given a barrier that relaxes the feasible set, and extend these results to obtain probabilistic out-of-sample guarantees for our learning framework. Finally, we apply our method to a radiation therapy treatment planning problem, predicting personalized treatments for head-and-neck cancer patients.
