Federated Learning (FL) is a paradigm in which models are trained collaboratively by sharing only local parameters with a central aggregation server, but it faces limitations in heterogeneous environments. In particular, the heterogeneity of client data and device capabilities affects model generalization, convergence, and resource management. In this context, “client clustering” has emerged as a strategy to mitigate these issues, enabling more efficient model aggregation, improving convergence, and enhancing personalization across diverse data distributions.
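To make the idea concrete, here is a minimal sketch of cluster-wise aggregation, not the speaker's specific method: clients are grouped by the similarity of their model updates (here, with KMeans over flattened parameter vectors, an assumed criterion) and a federated average is computed within each cluster rather than globally. The function names and the toy data are illustrative assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans

def cluster_clients(client_updates: np.ndarray, n_clusters: int) -> np.ndarray:
    """Group clients by similarity of their flattened model updates (assumed criterion)."""
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0)
    return km.fit_predict(client_updates)

def clustered_fedavg(client_updates: np.ndarray, labels: np.ndarray) -> dict:
    """Average updates within each cluster, yielding one aggregated model per cluster."""
    return {int(c): client_updates[labels == c].mean(axis=0) for c in np.unique(labels)}

# Toy example: 6 clients with 4-dimensional parameter updates drawn from two
# distinct distributions, mimicking heterogeneous (non-IID) client data.
rng = np.random.default_rng(42)
updates = np.vstack([
    rng.normal(loc=0.0, scale=0.1, size=(3, 4)),  # clients from distribution A
    rng.normal(loc=1.0, scale=0.1, size=(3, 4)),  # clients from distribution B
])
labels = cluster_clients(updates, n_clusters=2)
cluster_models = clustered_fedavg(updates, labels)
print(labels)          # e.g. [0 0 0 1 1 1]
print(cluster_models)  # one averaged parameter vector per cluster
```

Aggregating per cluster lets each group of similar clients converge toward a model suited to its own data distribution, which is the intuition behind the personalization and convergence benefits mentioned above.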
Bio: Gabriel Ukstin Talasso holds a Bachelor’s degree in Statistics and is currently a Master’s student in Computer Science at the University of Campinas (Unicamp), Brazil. His research focuses on training and fine-tuning language models in distributed environments using federated learning, particularly in scenarios with heterogeneous data.