Dynamic Margin for Federated Learning with Imbalanced Data

Abstract

Federated Learning (FL) enables a large number of edge computing devices to jointly learn a model without sharing data. However, traditional FL algorithms perform poorly on the non-IID data found in real-world federated datasets. To learn better from non-IID data, we propose DMFL (Dynamic Margin for Federated Learning), a strategy that improves the performance of federated learning on highly skewed, imbalanced datasets. We also provide a theoretical analysis of the dynamic margin loss in FL, showing that it yields a lower generalization error bound. DMFL encourages clients to enforce larger margins for minority classes by adding a dynamic term to the margins. With a proper shift applied to the fixed margin, the class imbalance across clients can be alleviated. Extensive experiments show that DMFL outperforms the baselines in test accuracy. The results also show that each client in DMFL not only performs well locally but also generalizes well to global test data.
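For illustration, the sketch below shows one way a client could apply a class-dependent margin loss of the kind the abstract describes. It assumes an LDAM-style formulation in which the per-class margin grows as the local sample count of that class shrinks; the class name `DynamicMarginLoss` and the `dynamic_scale` knob standing in for the paper's dynamic term are assumptions for this example, not the authors' exact construction.

```python
import torch
import torch.nn.functional as F

class DynamicMarginLoss(torch.nn.Module):
    """Margin cross-entropy with larger margins for minority classes (sketch)."""

    def __init__(self, class_counts, max_margin=0.5, dynamic_scale=1.0):
        super().__init__()
        counts = torch.as_tensor(class_counts, dtype=torch.float32)
        # Per-class margin inversely related to class frequency (n_c^{-1/4}),
        # rescaled so the largest margin equals max_margin.
        margins = counts.pow(-0.25)
        margins = margins * (max_margin / margins.max())
        self.register_buffer("margins", margins)
        # Placeholder for the dynamic adjustment described in the paper.
        self.dynamic_scale = dynamic_scale

    def forward(self, logits, targets):
        # Subtract the (scaled) margin from the true-class logit only,
        # then apply the usual softmax cross-entropy.
        margin = self.dynamic_scale * self.margins[targets]        # (batch,)
        adjusted = logits.scatter_add(
            1, targets.unsqueeze(1), -margin.unsqueeze(1))
        return F.cross_entropy(adjusted, targets)

# Usage: class_counts would come from a client's local label histogram.
loss_fn = DynamicMarginLoss(class_counts=[500, 50, 5])
logits = torch.randn(8, 3)
targets = torch.randint(0, 3, (8,))
loss = loss_fn(logits, targets)
```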

Publication
2021 International Joint Conference on Neural Networks (IJCNN)