Constrained Dynamical Neural ODE for Time Series Modelling: A Case Study on Continuous Emotion Prediction
Ting Dang (University of Cambridge); Antoni Dimitriadis (University of New South Wales); Jingyao Wu (University of New South Wales); Vidhyasaharan Sethu (University of New South Wales); Eliathamby Ambikairajah (University of New South Wales)
SPS
A number of machine learning applications involve time series prediction, and in some cases additional information about dynamical constraints on the target time series may be available. For instance, it might be known that the desired quantity cannot change faster than some rate, or that the rate depends on some known factors. However, incorporating these constraints into deep learning models, such as recurrent neural networks, is not straightforward. In this paper, we propose constrained dynamical neural ordinary differential equation (CD-NODE) models, which treat the desired time series as a dynamic process that can be described by an ODE. CD-NODEs model the rate of change of the time series as a function of both itself and the current input features, parameterised as a neural network. We explore the effect of constraining the dynamics of the model by placing explicit restrictions on the rate of change. The proposed model is evaluated on speech-based continuous emotion prediction, where such dynamical constraints are expected, using the publicly available RECOLA dataset. Results suggest that the model achieves performance comparable to the state of the art despite using significantly fewer parameters. Additional analyses reveal that imposing these constraints on the model leads to faster convergence and better performance, especially with smaller training sets.
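To illustrate the core idea, the following is a minimal sketch (not the authors' implementation) of a constrained neural ODE step: the rate of change dy/dt is produced by a small neural network of the current state y and the input features x, and a tanh squashing scaled by a hypothetical bound `rate_max` enforces |dy/dt| ≤ rate_max. The network sizes, Euler integration, and the specific form of the constraint are all illustrative assumptions.

```python
import numpy as np

def constrained_node_step(y, x, params, rate_max, dt):
    """One explicit Euler step of a rate-constrained neural ODE.

    dy/dt = rate_max * tanh(MLP([y; x])), so |dy/dt| <= rate_max.
    The two-layer MLP and the tanh-based constraint are illustrative
    choices, not the specific parameterisation used in the paper.
    """
    W1, b1, W2, b2 = params
    z = np.concatenate([np.atleast_1d(y), x])   # state and input features
    h = np.tanh(W1 @ z + b1)                    # hidden layer
    raw_rate = W2 @ h + b2                      # unconstrained rate of change
    rate = rate_max * np.tanh(raw_rate)         # explicit rate constraint
    return y + dt * rate

# Usage sketch: roll out a 1-D trajectory (e.g. an arousal rating)
# from random per-frame features; all dimensions are hypothetical.
rng = np.random.default_rng(0)
d_in, d_h = 4, 8
params = (rng.normal(size=(d_h, 1 + d_in)), np.zeros(d_h),
          rng.normal(size=(1, d_h)), np.zeros(1))
y = np.zeros(1)
traj = [y.copy()]
for _ in range(100):
    x_t = rng.normal(size=d_in)                 # stand-in acoustic features
    y = constrained_node_step(y, x_t, params, rate_max=0.5, dt=0.04)
    traj.append(y.copy())
```

Because the constraint bounds the rate rather than the value, the predicted trajectory can still reach any level, but no single step can move it by more than `rate_max * dt`.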