Multiple Equivalent Solutions for the Lasso

Feature selection is an important problem in data analytics that seeks to identify a minimal-size feature subset that is optimally predictive of an outcome of interest. It is also a powerful tool in Knowledge Discovery as a means of gaining domain insight, e.g., identifying which medical quantities carry unique information about the disease status. It is arguably less recognized, however, that the problem may have multiple, equivalent solutions. In that case, reporting only one solution and ignoring all other equivalent ones misleads domain experts. In this paper, we extend the Lasso, a well-established feature selection algorithm that reports a single solution, to the multiple-solution problem, based on a formalized notion of equivalence, for both classification and regression tasks. Empirical results are obtained using a fully automated pipeline called Just Add Data Bio (JAD Bio) that trains and selects multiple linear and nonlinear learners, optimizes hyper-parameter values, and corrects for the bias of multiple inductions (model selection). The results show that multiple equivalent solutions do exist in real datasets and that the algorithm can identify a subset of them. A comparison with the Statistical Equivalent Solutions (SES) algorithm shows that Lasso equivalent solutions achieve better predictive performance at the cost of selecting more features.
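To make the equivalence idea concrete, here is a minimal sketch in Python with scikit-learn: it fits a reference Lasso, then searches for alternative feature subsets whose cross-validated performance stays within a tolerance of the reference. The single-swap search, the tolerance value, and all names are illustrative assumptions, not the paper's actual procedure.

```python
# Hypothetical sketch: enumerate feature subsets whose cross-validated
# Lasso performance is "equivalent" (within a tolerance) to the
# reference solution. The swap strategy and tolerance are assumptions.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import Lasso, LassoCV
from sklearn.model_selection import cross_val_score

X, y = make_regression(n_samples=200, n_features=30, n_informative=5,
                       noise=5.0, random_state=0)

# Reference Lasso solution and its cross-validated R^2.
ref = LassoCV(cv=5, random_state=0).fit(X, y)
selected = np.flatnonzero(ref.coef_)
ref_score = cross_val_score(Lasso(alpha=ref.alpha_),
                            X[:, selected], y, cv=5).mean()

tol = 0.01  # tolerated performance drop for "equivalence" (assumption)
equivalent = [tuple(selected)]

# Try swapping each selected feature for each unselected one; keep any
# subset whose performance stays within the tolerance of the reference.
for drop in selected:
    for add in set(range(X.shape[1])) - set(selected):
        subset = sorted((set(selected) - {drop}) | {add})
        score = cross_val_score(Lasso(alpha=ref.alpha_),
                                X[:, subset], y, cv=5).mean()
        if score >= ref_score - tol:
            equivalent.append(tuple(subset))

print(f"found {len(equivalent)} equivalent feature subsets")
```

Even this crude one-swap neighborhood typically surfaces several equivalent subsets on correlated data, which is the phenomenon the paper formalizes and exploits.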