Deep Neural Networks With Flexible Complexity While Training Based On Neural Ordinary Differential Equations
Zhengbo Luo, Sei-ichiro Kamata, Zitang Sun, Weilian Zhou
Most deep neural network (DNN) architectures have a fixed complexity in both computational cost (parameters and FLOPs) and expressiveness. In this work, we experimentally investigate the effectiveness of using neural ordinary differential equations (NODEs) as a component that provides additional depth to relatively shallow networks, rather than stacking more layers, and achieves improvements with fewer parameters. Moreover, we construct deep neural networks with flexible complexity based on NODEs, which enables the system to adjust its complexity during training. The proposed method achieves more parameter-efficient performance than stacking standard DNN layers, and it alleviates the heavy computational cost that NODEs typically require.
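To make the idea concrete, below is a minimal sketch, not the authors' code, of replacing a stack of residual layers with a single NODE block using the torchdiffeq library. The class names (ODEFunc, NODEBlock) and the specific knobs shown (the integration horizon T and the solver tolerances rtol/atol) are illustrative assumptions; the abstract does not specify how the proposed method adjusts complexity during training, but tuning such solver settings is one plausible mechanism.

    # Hypothetical sketch: a NODE block standing in for stacked layers.
    # Requires: pip install torch torchdiffeq
    import torch
    import torch.nn as nn
    from torchdiffeq import odeint

    class ODEFunc(nn.Module):
        """Dynamics f(t, h): one conv layer reused across continuous depth."""
        def __init__(self, channels):
            super().__init__()
            self.norm = nn.GroupNorm(min(32, channels), channels)
            self.conv = nn.Conv2d(channels, channels, 3, padding=1)

        def forward(self, t, h):
            return self.conv(torch.relu(self.norm(h)))

    class NODEBlock(nn.Module):
        """Integrates dh/dt = f(t, h) from t=0 to t=T. A larger T or a
        tighter solver tolerance acts like "more depth" without adding
        any parameters (assumed knobs, not the paper's exact mechanism)."""
        def __init__(self, channels, T=1.0, rtol=1e-3, atol=1e-3):
            super().__init__()
            self.func = ODEFunc(channels)
            self.T = T
            self.rtol, self.atol = rtol, atol

        def forward(self, h):
            t = torch.tensor([0.0, self.T], device=h.device)
            # odeint returns the solution at each requested time; keep the endpoint.
            return odeint(self.func, h, t, rtol=self.rtol, atol=self.atol)[-1]

Under this reading, the single ODEFunc's parameter count is fixed while its effective depth varies with the solver settings, which is what makes the complexity adjustable during training without re-architecting the network.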
Chairs: C.-C. Jay Kuo