FEDRPO: FEDERATED RELAXED PARETO OPTIMIZATION FOR ACOUSTIC EVENT CLASSIFICATION
Meng Feng (MIT); Chieh-Chi Kao (Amazon); Qingming Tang (Amazon, Alexa); Amit Solomon (Amazon); Viktor Rozgic (Amazon Alexa); Chao Wang (Amazon)
The performance and robustness of real-world Acoustic Event Classification (AEC) solutions depend on the ability to train on diverse data from a wide range of end-point devices and acoustic environments. Federated Learning (FL) provides a framework to leverage annotated and non-annotated AEC data from servers and client devices in a privacy-preserving manner. In this work we propose a novel Federated Relaxed Pareto Optimization (FedRPO) method for semi-supervised FL with heterogeneous client data. In contrast to the federated averaging class of FL algorithms (FedAvg), which performs unconstrained weighted aggregation across all data sources, FedRPO enables special treatment of data with high-quality annotations and of data with pseudo-labels of unknown, varying quality. In particular, FedRPO computes the updates to the global model by solving a constrained linear program, with explicit Pareto constraints to prevent performance degradation on annotated data, and a controlled relaxation of the Pareto constraints on pseudo-labeled data to prevent learning patterns that conflict with the annotated data. We show that FedRPO significantly outperforms FedAvg on an internal de-identified Amazon AEC dataset, by a relative 32.5% in precision for supervised learning and by 50.5% when combined with FixMatch for semi-supervised learning, while maintaining recall at 90%.
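The aggregation scheme described above can be sketched as a small linear program. The sketch below is illustrative only and not the paper's exact formulation: it assumes the server aggregates client gradients `g_i` into an update `d = G^T alpha`, maximizes descent on annotated clients, enforces Pareto constraints `g_j . d >= 0` for annotated clients, and relaxes the constraint to `g_k . d >= -eps` for pseudo-labeled clients. The function name `relaxed_pareto_weights`, the simplex constraint on the weights, the objective choice, and the FedAvg fallback are all assumptions made for this sketch.

```python
import numpy as np
from scipy.optimize import linprog

def relaxed_pareto_weights(annotated_grads, pseudo_grads, eps=0.5):
    """Illustrative sketch of a relaxed-Pareto aggregation LP.

    Chooses simplex weights alpha over all client gradients so that the
    combined update d = G.T @ alpha does not increase loss on any annotated
    client (g_j . d >= 0) and degrades each pseudo-labeled client's
    objective by at most eps (g_k . d >= -eps). The objective maximizes
    total descent on the annotated clients. All of these modeling choices
    are assumptions, not the paper's exact formulation.
    """
    G = np.vstack(annotated_grads + pseudo_grads)  # (n_clients, dim)
    n = G.shape[0]
    n_ann = len(annotated_grads)
    K = G @ G.T                      # inner products: (K @ alpha)_i = g_i . d
    c = -K[:n_ann].sum(axis=0)       # minimize -(sum of annotated descent)
    A_ub = -K                        # -(K @ alpha)_i <= b_ub[i]
    b_ub = np.concatenate([np.zeros(n_ann),           # strict Pareto
                           eps * np.ones(n - n_ann)]) # relaxed Pareto
    A_eq = np.ones((1, n))           # weights form a convex combination
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0.0, 1.0)] * n, method="highs")
    # If the LP is infeasible, fall back to plain FedAvg weights (assumption).
    return res.x if res.success else np.full(n, 1.0 / n)
```

For example, given two roughly aligned annotated gradients and one conflicting pseudo-labeled gradient, the LP down-weights the conflicting client just enough to keep the annotated constraints satisfied.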