OPTIMIZING QUANTUM FEDERATED LEARNING BASED ON FEDERATED QUANTUM NATURAL GRADIENT DESCENT

Jun Qi (Georgia Institute of Technology); Zhang XiaoLei (Northwestern Polytechnical University); Javier Tejedor (Institute of Technology, Universidad San Pablo-CEU, CEU Universities)

  • SPS
    Members: Free
    IEEE Members: $11.00
    Non-members: $15.00
06 Jun 2023

Quantum federated learning (QFL) is a quantum extension of the classical federated learning model across multiple local quantum devices. A more efficient optimization algorithm is always desirable, as it minimizes the communication overhead among the quantum participants. In this work, we propose an efficient optimization algorithm, namely federated quantum natural gradient descent (FQNGD), and apply it to a QFL framework composed of variational quantum circuit (VQC)-based quantum neural networks (QNNs). Compared with stochastic gradient descent methods like Adam and Adagrad, the FQNGD algorithm requires far fewer training iterations for the QFL model to converge. Moreover, it can significantly reduce the total communication overhead among local quantum devices. Our experiments on a handwritten digit classification dataset demonstrate the effectiveness of FQNGD for the QFL framework, in terms of a faster convergence rate on the training set and higher accuracy on the test set.
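The idea the abstract describes can be sketched numerically: each local device preconditions its gradient with (a damped inverse of) its quantum Fisher information matrix, and the server aggregates the resulting natural gradients weighted by local dataset size. The sketch below is a hypothetical NumPy illustration of that aggregation scheme, not the paper's implementation; the function names, the toy quadratic loss, and the diagonal stand-in for the quantum Fisher information are all assumptions for clarity.

```python
import numpy as np

def natural_gradient(grad, fisher, damping=1e-3):
    """Solve (F + damping * I) g_nat = grad; damping stabilizes the solve."""
    d = fisher.shape[0]
    return np.linalg.solve(fisher + damping * np.eye(d), grad)

def fqngd_round(theta, local_grads, local_fishers, local_sizes, lr=0.1):
    """One federated round: size-weighted average of local natural gradients."""
    total = sum(local_sizes)
    agg = sum(
        (n / total) * natural_gradient(g, f)
        for g, f, n in zip(local_grads, local_fishers, local_sizes)
    )
    return theta - lr * agg

# Toy usage: two devices with quadratic losses sharing a common minimum at 0.
rng = np.random.default_rng(0)
d = 4
theta = rng.normal(size=d)
for _ in range(50):
    grads, fishers, sizes = [], [], []
    for n in (100, 300):  # hypothetical local dataset sizes
        A = np.diag(rng.uniform(0.5, 2.0, size=d))  # stand-in curvature
        grads.append(A @ theta)      # gradient of 0.5 * theta^T A theta
        fishers.append(A)            # stand-in for quantum Fisher information
        sizes.append(n)
    theta = fqngd_round(theta, grads, fishers, sizes)
print(np.linalg.norm(theta))  # norm shrinks toward 0 over the rounds
```

Because the natural gradient rescales each parameter direction by the local curvature, every round contracts the toy parameter vector by a roughly constant factor regardless of how ill-conditioned the per-device curvatures are, which is the intuition behind FQNGD needing fewer rounds than plain SGD-style aggregation.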
