
PREFALLKD: PRE-IMPACT FALL DETECTION VIA CNN-VIT KNOWLEDGE DISTILLATION

Tin-Han Chi (Department of Biomedical Engineering, National Yang Ming Chiao Tung University); Kai-Chun Liu (Academia Sinica); Chia-Yeh Hsieh (Bachelor’s Program in Medical Informatics and Innovative Applications, Fu Jen Catholic University); Yu Tsao (Academia Sinica); Chia-Tai Chan (Department of Biomedical Engineering, National Yang Ming Chiao Tung University)

  • SPS
    Members: Free
    IEEE Members: $11.00
    Non-members: $15.00
06 Jun 2023

Fall accidents are a critical issue in aging societies. Recently, many researchers have developed "pre-impact fall detection systems" using deep learning to support wearable fall protection systems that prevent severe injuries. However, most works employ only simple neural network models rather than more complex ones, given the constraints of resource-limited mobile devices and strict latency requirements. In this work, we propose PreFallKD, a novel pre-impact fall detection system based on CNN-ViT knowledge distillation, to strike a balance between detection performance and computational complexity. PreFallKD transfers detection knowledge from a pre-trained teacher model (a vision transformer) to a student model (a lightweight convolutional neural network). Additionally, we apply data augmentation techniques to address data imbalance. We conduct experiments on the public KFall dataset and compare PreFallKD with other state-of-the-art models. The results show that PreFallKD boosts the student model at test time, achieving a reliable F1-score of 92.66% and a lead time of 551.3 ms.
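The teacher-to-student transfer described above is typically trained with a soft-target distillation loss. The abstract does not specify PreFallKD's exact objective, so the sketch below shows the classic Hinton-style formulation as an assumption: a temperature-softened KL term against the teacher's outputs, blended with the ordinary cross-entropy on the hard labels. The temperature `T` and mixing weight `alpha` are illustrative hyperparameters, not values from the paper.

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-scaled softmax over the last axis."""
    z = np.asarray(logits, dtype=float) / T
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    """Hinton-style KD loss (an assumption, not PreFallKD's published objective):
    alpha * T^2 * KL(teacher_soft || student_soft) + (1 - alpha) * CE(student, labels)."""
    p_t = softmax(teacher_logits, T)
    p_s = softmax(student_logits, T)
    # KL divergence between softened distributions; the T^2 factor keeps
    # gradient magnitudes comparable across temperatures.
    soft = (p_t * (np.log(p_t) - np.log(p_s))).sum(axis=-1) * T * T
    # Standard cross-entropy on the hard fall/non-fall labels (T = 1).
    hard = -np.log(softmax(student_logits)[np.arange(len(labels)), labels])
    return alpha * soft.mean() + (1 - alpha) * hard.mean()
```

When the student's logits match the teacher's, the KL term vanishes and only the hard-label cross-entropy remains, which is what lets the lightweight CNN inherit the ViT's decision boundaries without paying its inference cost on-device.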
