  • SPS
    Members: Free
    IEEE Members: $11.00
    Non-members: $15.00
    Length: 00:14:47
10 Jun 2021

Typical problems in time series classification, such as insufficient training instances, demand novel deep neural network architectures that warrant consistent and accurate performance. A Deep Residual Network (ResNet) learns through H(x) = F(x) + x, where F(x) is a nonlinear function. We propose Blend-Res2Net, which blends two different representation spaces, H^1(x) = F(x) + Trans(x) and H^2(x) = F(Trans(x)) + x, with the intention of learning over a richer representation that captures both the temporal and the spectral signatures (Trans(·) denotes the transformation function). Such a richer representation, while better suited to the complex structure of time series signals, may incur higher generalization loss. Hence, the network complexity is adapted by the proposed restrained learning scheme, which dynamically estimates the network depth. The efficacy of Blend-Res2Net is demonstrated through a series of ablation experiments on the publicly available UCR time series benchmark archive. We further establish the superior performance of Blend-Res2Net over baselines and state-of-the-art algorithms, including 1-NN-DTW, HIVE-COTE, ResNet, InceptionTime, ROCKET, DMS-CNN, and TS-CHIEF.
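The two blended residual mappings can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: the abstract does not specify F or Trans, so the tanh residual function and the FFT-magnitude transform below are stand-in assumptions chosen only to make the shapes and the two shortcut patterns concrete.

```python
import numpy as np

def F(x):
    # Placeholder nonlinear residual function (in practice a stack of
    # convolutional layers inside a ResNet block); assumption, not the paper's F.
    return np.tanh(x)

def Trans(x):
    # Assumed transformation: magnitude spectrum via the real FFT, resized
    # back to the input length so the residual sums are shape-compatible.
    spec = np.abs(np.fft.rfft(x))
    return np.resize(spec, x.shape)

def blend_res2net_block(x):
    # H^1(x) = F(x) + Trans(x): temporal residual over a spectral shortcut.
    h1 = F(x) + Trans(x)
    # H^2(x) = F(Trans(x)) + x: spectral residual over an identity shortcut.
    h2 = F(Trans(x)) + x
    return h1, h2

x = np.linspace(0.0, 1.0, 8)
h1, h2 = blend_res2net_block(x)
print(h1.shape, h2.shape)  # both (8,)
```

The point of the sketch is the asymmetry of the two spaces: H^1 applies the nonlinearity in the temporal domain and shortcuts the transformed signal, while H^2 applies it in the transformed domain and keeps the standard identity shortcut.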

Chairs:
Tommy Sonne Alstrøm
