
Collaborative Inference Via Ensembles On The Edge

Nir Shlezinger, Erez Farhan, Hai Morgenstern, Yonina C. Eldar

Length: 00:15:14
11 Jun 2021

The success of deep neural networks (DNNs) as an enabler of artificial intelligence (AI) depends heavily on high computational resources. The increasing demand for accessible and personalized AI gives rise to the need to operate DNNs on edge devices such as smartphones, sensors, and autonomous cars, whose computational power is limited. Here we propose a framework for facilitating the application of DNNs on the edge in a manner that allows multiple users to collaborate during inference in order to improve their prediction accuracy. Our mechanism, referred to as edge ensembles, is based on each device holding a diverse predictor, so that a group of devices can form a deep ensemble during inference. We analyze the latency induced by this collaborative inference approach, showing that the ability to improve performance via collaboration comes at the cost of a minor additional delay. Our experimental results demonstrate that collaborative inference via edge ensembles of compact DNNs substantially improves accuracy over having each user infer locally, and can outperform a single centralized DNN that is larger than all the networks in the ensemble combined.
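The core idea can be illustrated with a minimal sketch (not the authors' code): each participating edge device holds its own compact classifier, and a querying device fuses the devices' outputs into an ensemble prediction. The number of devices, the model architecture, and the choice of averaging class probabilities are illustrative assumptions here, not details taken from the paper.

```python
# Minimal sketch of edge-ensemble inference. Assumptions (not from the paper):
# 4 devices, a small MLP per device, and simple averaging of softmax outputs.
import torch
import torch.nn as nn


class CompactClassifier(nn.Module):
    """A compact model of the kind an individual edge device could hold."""

    def __init__(self, in_dim=32, num_classes=10, hidden=16):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(), nn.Linear(hidden, num_classes)
        )

    def forward(self, x):
        return self.net(x)


def edge_ensemble_predict(devices, x):
    """Fuse the class-probability outputs of the participating devices."""
    with torch.no_grad():
        probs = torch.stack([torch.softmax(m(x), dim=-1) for m in devices])
    return probs.mean(dim=0)  # ensemble posterior, shape (batch, num_classes)


# Example: 4 collaborating devices, each with a (diversely trained) compact model.
devices = [CompactClassifier() for _ in range(4)]
x = torch.randn(8, 32)                      # a batch of inputs at the querying device
ensemble_probs = edge_ensemble_predict(devices, x)
prediction = ensemble_probs.argmax(dim=-1)  # final collaborative prediction
```

In this sketch the only cost of collaboration is exchanging the per-device outputs, which is consistent with the abstract's point that the accuracy gain comes at a minor additional delay.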

Chairs:
Ivan Bajic

