'CL-FML: Cluster-based & Label-aware Federated Meta-Learning for On-Demand Classification Tasks'
Published: 28 May 2024
Our Distributed AI paper 'CL-FML: Cluster-based & Label-aware Federated Meta-Learning for On-Demand Classification Tasks', authored by Aladwani, T., Anagnostopoulos, C., Puthiya Parambath, S. and Deligianni, F., has been accepted at the 11th IEEE International Conference on Data Science and Advanced Analytics (DSAA 2024), San Diego, CA, United States, 6-10 October 2024. Keywords: Federated Learning, Meta-Learning, Clustering, Data Augmentation.
Distributed analytics involving classification tasks demands robust model training. Real-time arbitrary classification tasks on distributed clients pose challenges due to constraints on data sharing. Federated (Meta)-Learning (FML) has been introduced for training a global distributed (meta)-model that generalizes well over distributed data and classification tasks. Current FML approaches assume fixed labels, unskewed class proportions and data distributions, and uniform task distributions. However, a global meta-model can only be used for tasks that do not involve arbitrary out-of-distribution labels. In real-world settings, class imbalance and label shift are common in clients’ data, and on-demand tasks arriving at clients involve unseen labels. Therefore, ‘one (meta)-model fits all’ is not the best option. To address these challenges, we introduce multiple cluster-based meta-models, each tailored to a specific label distribution. Our framework, coined Cluster-based & Label-aware Federated Meta-Learning (CL-FML), clusters distributed clients based on label shift and uses cluster-based FML to identify the most suitable clients to engage per task. CL-FML leverages lightweight data augmentation to deal with arbitrary class-imbalanced tasks. Our comprehensive experiments and comparative assessment against baselines show that CL-FML achieves high accuracy with fast convergence, significantly reducing training rounds and communication load.
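To make the clustering idea concrete, here is a minimal sketch of how clients might be grouped by label distribution, so that a per-cluster meta-model can serve tasks whose labels match a cluster's data. The function names, the use of normalized label histograms as client summaries, and the plain k-means grouping are illustrative assumptions, not the paper's actual algorithm.

```python
import numpy as np

def label_distribution(labels, num_classes):
    """Normalized histogram of the class labels held by one client."""
    counts = np.bincount(labels, minlength=num_classes).astype(float)
    return counts / counts.sum()

def cluster_clients(dists, k, iters=20, seed=0):
    """Tiny k-means over clients' label-distribution vectors.

    Returns an array mapping each client index to a cluster id.
    (A stand-in for whatever clustering CL-FML actually uses.)
    """
    rng = np.random.default_rng(seed)
    dists = np.asarray(dists)
    centers = dists[rng.choice(len(dists), size=k, replace=False)]
    for _ in range(iters):
        # Assign each client to the nearest center (Euclidean distance).
        assign = np.argmin(
            ((dists[:, None, :] - centers[None, :, :]) ** 2).sum(-1), axis=1)
        # Move each center to the mean of its assigned clients.
        for j in range(k):
            if (assign == j).any():
                centers[j] = dists[assign == j].mean(axis=0)
    return assign

# Two clients skewed toward class 0, two toward class 2.
clients = [
    label_distribution(np.array([0, 0, 0, 1]), 3),
    label_distribution(np.array([0, 0, 1, 0]), 3),
    label_distribution(np.array([2, 2, 2, 1]), 3),
    label_distribution(np.array([2, 2, 1, 2]), 3),
]
assign = cluster_clients(clients, k=2)
# Clients 0 and 1 land in the same cluster, as do clients 2 and 3.
print(assign[0] == assign[1] and assign[2] == assign[3])  # True
```

In the full framework, an arriving task's label set would be matched against these cluster-level label distributions to pick which cluster's meta-model (and which clients) to engage; that selection step is omitted here.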
Link: paper