
Memory-Augmented U-Transformer for Multivariate Time Series Anomaly Detection

Shuxin Qin (Purple Mountain Laboratories); Yongcan Luo (Purple Mountain Laboratories); Gaofeng Tao (Purple Mountain Laboratories)

06 Jun 2023

Unsupervised anomaly detection for multivariate time series is challenging due to the diversity of anomalies and the high dimensionality of the data. Recently, reconstruction-based methods have played an important role and made impressive progress. However, these methods can easily suffer from overfitting or a lack of discrimination between normal and abnormal samples. In this work, we propose a memory-augmented U-Transformer framework to address these issues. Specifically, we insert down-sampling and up-sampling layers into the Transformer encoder and decoder, respectively, to improve feature representation. The encoder and decoder are then connected by specially designed memory modules at multiple levels. We exploit the power of the memory modules to record prototypical patterns of normal samples and alleviate unexpected generalization to abnormal ones. Extensive experiments on various public benchmarks demonstrate that our method achieves state-of-the-art performance.
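The memory modules described in the abstract are not specified in detail here; the sketch below shows one common way such a module can be built (in the spirit of memory-augmented autoencoders), as a minimal, hypothetical illustration rather than the authors' implementation. The class name, memory size, and dot-product addressing scheme are all assumptions: a bank of learnable prototype vectors re-expresses encoder features as convex combinations of stored normal patterns, which limits how faithfully abnormal inputs can be reconstructed.

```python
# Hypothetical sketch of a memory module for reconstruction-based anomaly
# detection; names, sizes, and the addressing scheme are assumptions, not
# the paper's released code.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MemoryModule(nn.Module):
    def __init__(self, num_items: int = 64, feature_dim: int = 128):
        super().__init__()
        # Learnable memory bank: each row is one prototypical normal pattern.
        self.memory = nn.Parameter(torch.randn(num_items, feature_dim))

    def forward(self, z: torch.Tensor) -> torch.Tensor:
        # z: (batch, seq_len, feature_dim) encoder features at one level.
        # Address the memory by similarity, then rebuild each feature as a
        # convex combination of memory items, so inputs that match no stored
        # normal pattern reconstruct poorly and yield a high anomaly score.
        attn = F.softmax(z @ self.memory.t(), dim=-1)  # (batch, seq, num_items)
        return attn @ self.memory                      # (batch, seq, feature_dim)


if __name__ == "__main__":
    mem = MemoryModule(num_items=64, feature_dim=128)
    features = torch.randn(8, 100, 128)  # dummy encoder output
    rebuilt = mem(features)
    print(rebuilt.shape)                  # torch.Size([8, 100, 128])
```

In a U-shaped encoder-decoder, one such module could sit on each skip connection so that features at every resolution pass through a memory bank before reaching the decoder, matching the multi-level connection the abstract describes.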
