Reliable fault detection of industrial assets is crucial for the success of smart industry. Standard (supervised) data-driven approaches to fault detection perform poorly due to the lack of semantic annotations of condition monitoring data and the novel fault types introduced at operational time. We propose the Contrastive Sensor Transformer (CST), a novel approach for learning useful representations for robust fault identification without using task-specific labels. We explore sensor transformations for pre-training in a self-supervised contrastive manner, where the similarity between the original signal instance and its augmented version is maximized. We demonstrate that the powerful transformer architecture, applied to condition monitoring data, learns highly useful embeddings that perform exceptionally well for fault detection in low labeled-data regimes and for the identification of novel fault types. Our approach obtains an average of 75% accuracy on the considered bearing benchmark datasets while using less than 2% of the labeled instances.
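To make the described pre-training idea concrete, the PyTorch sketch below pairs a transformer encoder for 1-D sensor signals with simple augmentations and an NT-Xent-style contrastive loss that maximizes the similarity between each signal and its augmented view. This is only an illustration under stated assumptions, not the authors' CST implementation: the class names, patch-based tokenization, augmentations, and hyperparameters are all assumed for the example.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SensorEncoder(nn.Module):
    """Hypothetical transformer encoder for 1-D condition-monitoring signals."""
    def __init__(self, patch_len=64, d_model=128, n_heads=4, n_layers=4, emb_dim=64):
        super().__init__()
        self.patch_len = patch_len
        self.embed = nn.Linear(patch_len, d_model)            # patchify the raw signal
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.project = nn.Linear(d_model, emb_dim)            # projection head for the loss

    def forward(self, x):                                     # x: (B, L), L divisible by patch_len
        patches = x.unfold(1, self.patch_len, self.patch_len)   # (B, n_patches, patch_len)
        h = self.encoder(self.embed(patches))                   # (B, n_patches, d_model)
        return F.normalize(self.project(h.mean(dim=1)), dim=-1) # pooled, unit-norm embedding

def augment(x):
    """Example sensor transformations (assumed): additive noise and amplitude scaling."""
    noise = 0.05 * torch.randn_like(x)
    scale = torch.empty(x.size(0), 1).uniform_(0.8, 1.2)
    return scale * x + noise

def contrastive_loss(z_orig, z_aug, temperature=0.1):
    """NT-Xent-style objective: pull each signal toward its augmented view,
    push it away from the other signals in the batch."""
    logits = z_orig @ z_aug.t() / temperature    # (B, B) cosine similarities
    targets = torch.arange(z_orig.size(0))       # matching index is the positive pair
    return F.cross_entropy(logits, targets)

# One self-supervised pre-training step on unlabeled vibration windows.
encoder = SensorEncoder()
optimizer = torch.optim.Adam(encoder.parameters(), lr=1e-4)
batch = torch.randn(32, 1024)                    # 32 raw signal windows of length 1024
loss = contrastive_loss(encoder(batch), encoder(augment(batch)))
loss.backward()
optimizer.step()
```

After pre-training in this fashion, the frozen embeddings could be used with a small labeled subset (e.g., a linear classifier on a few percent of labeled instances) for fault detection, which matches the low-label evaluation setting described in the abstract.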