Progressive Knowledge Distillation For Early Action Recognition

Vinh Tran, Niranjan Balasubramanian, Minh Hoai Nguyen

  • SPS
    Members: Free
    IEEE Members: $11.00
    Non-members: $15.00
    Length: 00:06:15
20 Sep 2021

We present a novel framework to train a recurrent neural network for early recognition of human actions, an important but challenging task given the need to recognize an ongoing action from a partial observation. Our framework is based on knowledge distillation: the network for early recognition is a student model, trained by distilling knowledge from a teacher model that has superior knowledge because it peeks into the future and incorporates extra observations of the action under consideration. This framework can be used in both supervised and semi-supervised learning settings, utilizing both labeled and unlabeled training data. Experiments on the UCF101, SYSU 3DHOI, and NTU RGB+D datasets show the effectiveness of knowledge distillation for early recognition, including situations where only a small amount of annotated training data is available.
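The core idea can be sketched as a standard teacher-student distillation setup: the teacher classifies from the full sequence, the student from only an early prefix, and the student matches the teacher's softened predictive distribution. This is a minimal illustrative sketch in PyTorch under assumed architectures and hyperparameters (LSTM classifier, temperature, loss weighting); it is not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ActionRNN(nn.Module):
    """Minimal recurrent action classifier, used for both teacher and student.
    Dimensions are illustrative assumptions."""
    def __init__(self, feat_dim=32, hidden_dim=64, num_classes=10):
        super().__init__()
        self.rnn = nn.LSTM(feat_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, num_classes)

    def forward(self, x):
        # x: (batch, time, feat_dim); classify from the final hidden state
        _, (h, _) = self.rnn(x)
        return self.head(h[-1])

def distillation_loss(student_logits, teacher_logits, labels=None,
                      T=2.0, alpha=0.5):
    """Soft-target KL loss; mixed with supervised cross-entropy when labels
    exist. Leaving labels=None covers the semi-supervised (unlabeled) case."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    if labels is None:
        return soft
    return alpha * soft + (1 - alpha) * F.cross_entropy(student_logits, labels)

# Toy usage: the teacher "peeks into the future" by seeing all 16 frames,
# while the student must recognize the action from the first 6 frames only.
torch.manual_seed(0)
teacher, student = ActionRNN(), ActionRNN()
full_seq = torch.randn(4, 16, 32)       # 4 clips, 16 frames, 32-dim features
partial_seq = full_seq[:, :6]           # early-recognition input: a prefix
labels = torch.randint(0, 10, (4,))

with torch.no_grad():                   # teacher is held fixed while distilling
    t_logits = teacher(full_seq)
s_logits = student(partial_seq)
loss = distillation_loss(s_logits, t_logits, labels)
```

In practice the student would be trained with this loss over many partial observation lengths, so that it produces teacher-like predictions however little of the action it has seen.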
