Improved Federated Learning with Non-IID Data Using Foundation Models
Abstract
Federated Learning (FL) enables multiple parties to train a model collaboratively without sharing their data. However, in heterogeneous scenarios where the data distribution among the FL participants is non-independent and identically distributed (non-IID), FL suffers from a data heterogeneity challenge that severely degrades the convergence of the global model. To address this problem, we propose a novel data augmentation strategy, named DPSDA-FL, which helps homogenize the local data on the clients' side. DPSDA-FL improves the training of the global model by leveraging differentially private synthetic data generated by foundation models. We obtain promising preliminary results on the CIFAR-10 dataset regarding the recall of the global model.
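The core idea described above — topping up each client's skewed local dataset with synthetic samples so that class distributions become more uniform before training — can be sketched in a few lines. This is a minimal, hypothetical illustration of the augmentation step only: the function name `augment_to_balance`, the per-class `synthetic_pool`, and the `target_per_class` parameter are assumptions for the sketch, not the paper's actual API, and real DPSDA-FL would draw differentially private synthetic images from a foundation model rather than plain labels.

```python
from collections import Counter

def augment_to_balance(local_labels, synthetic_pool, target_per_class):
    """Pad a client's non-IID sample set with synthetic samples so that
    every class reaches at least `target_per_class` examples.

    Hypothetical simplification: samples are represented by their class
    labels; `synthetic_pool` maps each class to a list of synthetic
    samples available for that class.
    """
    counts = Counter(local_labels)
    augmented = list(local_labels)
    for cls, pool in synthetic_pool.items():
        # How many synthetic samples this class still needs locally.
        deficit = max(0, target_per_class - counts.get(cls, 0))
        # Take up to `deficit` synthetic samples for this class.
        augmented.extend(pool[:deficit])
    return augmented

# Example: a client holding mostly class-0 samples gets classes 1 and 2
# topped up to 5 samples each, while class 0 is left untouched.
balanced = augment_to_balance(
    [0] * 8 + [1] * 2,
    {0: [0] * 10, 1: [1] * 10, 2: [2] * 10},
    target_per_class=5,
)
```

After augmentation, each client's local distribution is closer to the global one, which is what mitigates the non-IID convergence problem during federated aggregation.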
| Original language | English |
| --- | --- |
| DOIs | |
| Publication status | Published - 3 Jul 2024 |
Projects
EnnCore: End-to-End Conceptual Guarding of Neural Architectures
Cordeiro, L. (PI), Brown, G. (CoI), Freitas, A. (CoI), Luján, M. (CoI) & Mustafa, M. (CoI)
1/02/21 → 31/12/25
Project: Research