Unsupervised Action Segmentation of Untrimmed Egocentric Videos
Sam Perochon (Ecole Normale Supérieure Paris-Saclay); Laurent Oudre (ENS Paris-Saclay)
SPS
The introduction of affordable wearable cameras and eye trackers has led to a massive amount of egocentric (or first-person view) videos, bringing new challenges to the computer vision community for understanding and leveraging the specific characteristics of the egocentric view. This work proposes a novel approach for unsupervised activity segmentation that detects frames corrupted by ego-motion and estimates action boundaries using kernel change-point detection. The approach leverages the visual characteristics of egocentric videos to improve the temporal accuracy of the detected segments. We report state-of-the-art performance among unsupervised approaches on two challenging large-scale datasets of untrimmed egocentric videos, EGTEA and EPIC-KITCHENS-55, and on the standard third-person view dataset, 50Salads.
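The abstract's core technique, kernel change-point detection, can be illustrated with a minimal sketch. This is not the authors' implementation: it assumes per-frame feature vectors as input and finds a single action boundary by minimizing the within-segment kernel cost (sum of self-similarities minus the normalized Gram-matrix mass of each segment) under an RBF kernel; the function names and the synthetic data are illustrative only.

```python
import numpy as np

def rbf_gram(X, gamma=1.0):
    # Pairwise RBF kernel matrix K[i, j] = exp(-gamma * ||x_i - x_j||^2)
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * np.maximum(d2, 0.0))

def segment_cost(K, s, e):
    # Kernel within-segment cost: sum_i K[i, i] - (1/n) * sum_{i, j} K[i, j]
    sub = K[s:e, s:e]
    return np.trace(sub) - sub.sum() / (e - s)

def best_single_breakpoint(X, gamma=1.0, min_size=5):
    # Scan every admissible split point t and keep the one that
    # minimizes the total cost of the two resulting segments.
    K = rbf_gram(X, gamma)
    n = len(X)
    costs = [(segment_cost(K, 0, t) + segment_cost(K, t, n), t)
             for t in range(min_size, n - min_size)]
    return min(costs)[1]

# Synthetic "frame features": a mean shift at frame 60 of 100,
# standing in for a change of action in the video.
rng = np.random.default_rng(0)
X = np.concatenate([rng.normal(0.0, 0.3, size=(60, 2)),
                    rng.normal(2.0, 0.3, size=(40, 2))])
bkp = best_single_breakpoint(X)
print(bkp)  # detected boundary, close to the true change at frame 60
```

Real pipelines extend this to multiple change points via dynamic programming (e.g. the `ruptures` library's `KernelCPD` solver) and run it on learned visual features rather than raw frames.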