04 May 2020

Various supervised embedded methods have been proposed to select discriminative features from the original ones, such as Feature Selection with Orthogonal Regression (FSOR) and Robust Feature Selection. Compared with embedded methods based on least-squares regression, FSOR uses orthogonal regression, which preserves more discriminative information in the subspace and yields better feature selection performance. However, embedded approaches have rarely considered the dependency among the selected features. To address this limitation, in this paper we propose a two-stage (filter-embedded) feature selection technique based on Maximum Relevance Minimum Redundancy (mRMR) and FSOR, termed Orthogonal Regression with Minimum Redundancy (ORMR). We compare the feature selection performance of ORMR against nine state-of-the-art supervised feature selection methods on six benchmark datasets. The results demonstrate the advantage of ORMR over the other methods in choosing discriminative features while accounting for redundancy within the selected feature subset.
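Since the abstract describes ORMR only at a high level (an mRMR-style filter stage followed by an FSOR-style embedded stage), the sketch below illustrates what such a two-stage pipeline could look like. It is not the authors' implementation: the function names, the greedy mutual-information mRMR filter, and the simplified Procrustes-style stand-in for FSOR's orthogonal regression are assumptions made for illustration only.

```python
"""
Illustrative two-stage (filter + embedded) feature selection sketch in the
spirit of ORMR: an mRMR-style filter followed by an orthogonal-regression
scoring step. Names and simplifications here are assumptions, not the
authors' method.
"""
import numpy as np
from sklearn.feature_selection import mutual_info_classif, mutual_info_regression


def mrmr_filter(X, y, n_candidates):
    """Greedy mRMR: keep features with high relevance to y and low
    redundancy (mutual information) with already-selected features."""
    n_features = X.shape[1]
    relevance = mutual_info_classif(X, y)           # I(feature; label)
    selected = [int(np.argmax(relevance))]
    while len(selected) < n_candidates:
        best_j, best_score = None, -np.inf
        for j in range(n_features):
            if j in selected:
                continue
            # redundancy: mean MI between candidate j and already-selected features
            redundancy = np.mean(
                [mutual_info_regression(X[:, [s]], X[:, j])[0] for s in selected]
            )
            score = relevance[j] - redundancy
            if score > best_score:
                best_j, best_score = j, score
        selected.append(best_j)
    return np.array(selected)


def orthogonal_regression_scores(X, y):
    """Score features by the row norms of the orthogonal W = U V^T obtained
    from the SVD of X^T Y, i.e. the W maximizing tr(W^T X^T Y) subject to
    W^T W = I (a Procrustes-style alignment with the one-hot labels). This is
    a simplified stand-in for FSOR's orthogonal regression, not FSOR itself."""
    X = X - X.mean(axis=0)
    Y = np.eye(int(y.max()) + 1)[y]                 # one-hot class indicator
    Y = Y - Y.mean(axis=0)
    U, _, Vt = np.linalg.svd(X.T @ Y, full_matrices=False)
    W = U @ Vt                                      # d x c, with W^T W = I
    return np.linalg.norm(W, axis=1)


def ormr_like_selection(X, y, n_candidates=50, n_selected=20):
    """Stage 1: mRMR filter down to n_candidates features.
    Stage 2: rank the surviving candidates by orthogonal-regression scores."""
    candidates = mrmr_filter(X, y, n_candidates)
    scores = orthogonal_regression_scores(X[:, candidates], y)
    order = np.argsort(scores)[::-1][:n_selected]
    return candidates[order]


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 30))
    y = (X[:, 0] + X[:, 1] > 0).astype(int)         # only two informative features
    print(ormr_like_selection(X, y, n_candidates=10, n_selected=5))
```

In this sketch the filter stage controls redundancy among candidates and the embedded stage supplies the discriminative scoring; how ORMR actually couples the two stages is detailed in the paper, not here.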
