    Length: 14:55
04 May 2020

Least squares regression (LSR) has two main issues that greatly limit its performance: 1) the target matrix is too rigid to learn a discriminative projection matrix, which leads to a large regression error; 2) the underlying geometric structure that captures correlations among training samples is often ignored, which leads to overfitting. To address these problems, this paper presents a discriminant and sparsity based least squares regression with l_1 regularization (DS_LSR). In DS_LSR, a sparse coefficient matrix of the training data, obtained with l_1 regularization, is learned jointly with the projection matrix to make the projection matrix discriminative. In addition, an orthogonal relaxation term is introduced to preserve the structure of the regression targets while relaxing the rigid label matrix. Extensive experimental results demonstrate the effectiveness of the proposed method in terms of classification accuracy.
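The abstract does not state the DS_LSR objective function; as a minimal illustrative sketch of how such terms are commonly combined in relaxed-label LSR models (the symbols W for the projection matrix, Z for the sparse coefficient matrix, R for the orthogonal relaxation of the label matrix Y, X for the training data, and the trade-off parameters \lambda_1, \lambda_2, \lambda_3 are assumptions, not taken from the paper):

    \min_{W,\,Z,\,R:\ R^\top R = I}\ \lVert W^\top X - Y R \rVert_F^2
        + \lambda_1 \lVert Z \rVert_1
        + \lambda_2 \lVert W^\top X - W^\top X Z \rVert_F^2
        + \lambda_3 \lVert W \rVert_F^2

In this sketch, the first term regresses the projected samples onto an orthogonally relaxed target YR rather than the rigid 0/1 label matrix, the l_1 term enforces sparsity of the sample coefficient matrix Z, and the third term ties the projection to the sample-correlation structure encoded by Z; objectives of this kind are typically solved by alternating minimization over W, Z, and R.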
