
A Quantitative Analysis Of The Robustness Of Neural Networks For Tabular Data

Kavya Gupta, Beatrice Pesquet-Popescu, Fateh Kaakai, Jean-Christophe Pesquet

  • SPS Members: Free
  • IEEE Members: $11.00
  • Non-members: $15.00
  • Length: 00:08:07
10 Jun 2021

This paper presents a quantitative approach to demonstrate the robustness of neural networks for tabular data. Such data form the backbone of most industrial applications. We analyse the effect of widely used techniques in neural network practice, such as weight regularization, addition of noise to the data, and positivity constraints. The analysis is performed using three state-of-the-art techniques that provide mathematical proofs of robustness in terms of the Lipschitz constant of feed-forward networks. The experiments are carried out on two prediction tasks and one classification task. Our work brings insights into building robust neural network architectures for safety-critical systems that require certification or approval from a competent authority.
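The three certification techniques evaluated in the paper are not reproduced here. As a rough illustration of the general idea behind Lipschitz-based robustness bounds, the sketch below computes the classical upper bound given by the product of the layers' spectral norms for a feed-forward network with 1-Lipschitz activations (e.g. ReLU). The toy three-layer network and all variable names are illustrative assumptions, not the authors' models or code; the paper's techniques yield tighter certificates than this naive bound.

```python
import numpy as np

def spectral_norm(W):
    """Largest singular value of a weight matrix, i.e. its exact
    Lipschitz constant as a linear map under the Euclidean norm."""
    return np.linalg.norm(W, ord=2)

def naive_lipschitz_bound(weights):
    """Upper bound on the Lipschitz constant of a feed-forward network
    with 1-Lipschitz activations: the product of the spectral norms
    of its weight matrices."""
    bound = 1.0
    for W in weights:
        bound *= spectral_norm(W)
    return bound

# Hypothetical 3-layer network on 10 tabular input features.
rng = np.random.default_rng(0)
weights = [rng.standard_normal((64, 10)),
           rng.standard_normal((64, 64)),
           rng.standard_normal((1, 64))]
print(f"Naive Lipschitz upper bound: {naive_lipschitz_bound(weights):.2f}")
```

A smaller certified Lipschitz constant means that bounded perturbations of the tabular inputs can only change the network's output by a proportionally bounded amount, which is the property exploited when arguing robustness for certification purposes.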

Chairs:
Xinmiao Zhang

