Improving Automatic Sleep Staging via Temporal Smoothness Regularization

Huy Phan (Amazon Alexa); Elisabeth Heremans (KU Leuven); Oliver Y. Chén (University of Bristol); Philipp Koch (University of Luebeck); Alfred Mertins (University of Luebeck); Maarten De Vos (KU Leuven)

  • SPS
    Members: Free
    IEEE Members: $11.00
    Non-members: $15.00
07 Jun 2023

We propose a regularization method, called temporal smoothness regularization, for training deep neural networks for automatic sleep staging in small-data settings. Intuitively, we constrain the cross-entropy losses of any two adjacent epochs in the sequential input to be as close to each other as possible. The regularization reflects the slow-transition nature of the sleep process, which implies small information changes between two consecutive sleep epochs; by enforcing it, we discourage the network from overfitting to these small changes. Our experiments show that training the SeqSleepNet base network with the proposed regularization improves performance over the baseline trained without it. Furthermore, the method achieves performance on par with the state of the art while outperforming other existing methods.
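The idea above can be sketched in a few lines of NumPy: compute a per-epoch cross-entropy over the sequential input, then add a penalty on the gap between the losses of adjacent epochs. Note this is a minimal illustration, not the authors' implementation; the exact penalty form (squared vs. absolute difference) and the weight `lam` are assumptions for illustration.

```python
import numpy as np

def cross_entropy(probs, labels):
    # Per-epoch cross-entropy for a sequence of T epochs:
    # probs has shape (T, C) with rows summing to 1, labels has shape (T,).
    return -np.log(probs[np.arange(len(labels)), labels])

def smooth_regularized_loss(probs, labels, lam=1.0):
    # Mean cross-entropy over the sequence plus a temporal smoothness
    # penalty that pulls the losses of adjacent epochs toward each other.
    # Squared difference and lam=1.0 are illustrative choices.
    ce = cross_entropy(probs, labels)           # shape (T,)
    smooth = np.mean((ce[1:] - ce[:-1]) ** 2)   # gap between neighbouring epochs
    return ce.mean() + lam * smooth
```

When the per-epoch losses are already equal across the sequence, the penalty vanishes and the objective reduces to the plain mean cross-entropy; a single epoch with an outlying loss inflates the objective beyond its cross-entropy contribution, which is the overfitting signal the regularizer suppresses.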
