    Length: 14:16
04 May 2020

We introduce a formalism that justifies, extends, and partially corrects a recent heuristic method for algebraically decoupling deterministic features, which has proved useful for improving feature-based classification. Our new formalism is based on defining transformations inside manifolds by following trajectories along the features' gradients. Through these transformations we define a normalization that, we demonstrate, allows for decoupling differentiable features. Applying this to sampling moments, we obtain a quasi-analytic solution for the orthokurtosis, a modification of the kurtosis that is decoupled not only from the mean and variance but also from the skewness. After theoretically motivating feature decoupling for analyzing random data, we illustrate with a regression problem example how decoupled features may perform significantly better than coupled ones.
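The coupling the abstract refers to can be seen numerically. The following sketch is purely illustrative and is not the paper's construction: z-scoring decouples the higher sample moments from the mean and variance, but the resulting sample kurtosis is still bounded below by skewness² + 1 (Pearson's inequality), so a skewed sample inflates kurtosis before any independent shape information enters. The `residual` variable below is a crude stand-in for a skewness-decoupled statistic, not the orthokurtosis itself, which the paper derives via trajectories along feature gradients.

```python
import numpy as np

rng = np.random.default_rng(0)

def standardized_moments(x):
    # Z-scoring removes the dependence on mean and variance:
    # the 3rd and 4th moments of z are the sample skewness and kurtosis.
    z = (x - x.mean()) / x.std()  # biased (ddof=0) std
    return np.mean(z**3), np.mean(z**4)

# A strongly skewed sample: its kurtosis is pushed up by skewness alone,
# since m4 >= m3**2 + 1 holds for any standardized sample.
x = rng.exponential(size=100_000)
skew, kurt = standardized_moments(x)

# Crude excess of kurtosis over the skewness-induced floor.
# NOTE: illustration of the coupling only, NOT the paper's orthokurtosis.
residual = kurt - (skew**2 + 1)
```

For an exponential sample, `skew` is near 2 and `kurt` near 9, so a large part of the raw kurtosis is attributable to skewness; a genuinely decoupled fourth-order feature, as the paper argues, should not carry that redundant component.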
