A Feature Selection Based on Perturbation Theory

26 February 2019
J. R. Anaraki
Hamid Usefi
arXiv:1902.09938
Abstract

Consider a supervised dataset $D=[A\mid \textbf{b}]$, where $\textbf{b}$ is the outcome column, the rows of $D$ correspond to observations, and the columns of $A$ are the features of the dataset. A central problem in machine learning and pattern recognition is to select the most important features of $D$ so that the outcome can be predicted. In this paper, we present a new feature selection method that uses perturbation theory to detect correlations between features. We solve $AX=\textbf{b}$ using the method of least squares and the singular value decomposition of $A$. In practical applications, such as bioinformatics, the number of rows of $A$ (observations) is much smaller than the number of columns of $A$ (features), so we are dealing with singular matrices with large condition numbers. Although it is known that, in the singular case, the solutions of least-squares problems are very sensitive to perturbations of $A$, the novel contribution of this paper is to prove that correlations between features can be detected by applying perturbations to $A$. The effectiveness of the method is verified through a series of comparisons with conventional and recent feature selection methods from the literature. In most situations, our method selects considerably fewer features while attaining or exceeding the accuracy of the other methods.
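The perturbation idea lends itself to a short numerical illustration. Below is a minimal sketch, assuming the scheme works roughly as the abstract describes: solve the least-squares problem with an SVD-based solver, re-solve after small random perturbations of $A$, and flag the features whose coefficients shift the most as candidates for correlation-based removal. The perturbation scale `eps`, the number of trials, and the ranking rule are hypothetical choices for illustration, not taken from the paper.

```python
# Sketch of perturbation-based feature screening (illustrative only).
import numpy as np

def lstsq_svd(A, b):
    """Minimum-norm least-squares solution of A x = b (SVD-based LAPACK driver)."""
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x

def perturbation_sensitivity(A, b, eps=1e-3, n_trials=20, seed=0):
    """Average shift of each feature's coefficient under random perturbations of A."""
    rng = np.random.default_rng(seed)
    x0 = lstsq_svd(A, b)
    shifts = np.zeros(A.shape[1])
    for _ in range(n_trials):
        E = eps * rng.standard_normal(A.shape)  # small random perturbation of A
        shifts += np.abs(lstsq_svd(A + E, b) - x0)
    return shifts / n_trials

# Toy n << p setting: 20 observations, 50 features, feature 1 nearly duplicates feature 0.
rng = np.random.default_rng(1)
A = rng.standard_normal((20, 50))
A[:, 1] = A[:, 0] + 1e-6 * rng.standard_normal(20)
b = A[:, 0] + 0.1 * rng.standard_normal(20)
sens = perturbation_sensitivity(A, b)
print("Most perturbation-sensitive features:", np.argsort(sens)[::-1][:5])
```

In this toy setting the minimum-norm solution spreads weight across the nearly dependent columns, so their coefficients tend to move the most when $A$ is perturbed, which is the signal the abstract's method exploits to discard correlated features.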
