Dynamic API proposal to investigate and improve federated learning scalability

University essay from Uppsala universitet/Institutionen för informationsteknologi

Author: Gergely Módi; [2023]


Abstract: Federated learning is an inherently distributed approach to machine learning, since the data and the learning process remain on the clients' devices. Although this suggests that the whole process could be fully distributed, that is not a trivial problem to solve, since a global consensus model must be maintained and redistributed to the peers. A provably sound solution is a central reducer node that combines all the updated local models and redistributes the result to the model-provider nodes. This solution can face scalability limitations, since the central node must maintain a significant number of connections, depending on the use case. In this thesis, a reducing API is proposed that can have better scaling properties when the number of learner nodes is high. The FEDn framework was used to showcase the performance of the proposed solution and to test it in real-world settings. The API provides performance comparable to the framework's regular base implementation while offering more predictable results in the event of network failures.
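The central-reducer step described in the abstract can be sketched as a weighted average of the clients' updated models (FedAvg-style aggregation). This is a minimal illustrative sketch, not FEDn's actual API; the function name, the flat-vector model representation, and the data-size weighting are assumptions for illustration.

```python
from typing import List

def reduce_models(local_models: List[List[float]],
                  num_examples: List[int]) -> List[float]:
    """Combine updated local models into one global model by weighted
    averaging, with each client's weight proportional to its data size.
    The reducer would then redistribute the result to all clients."""
    total = sum(num_examples)
    n_params = len(local_models[0])
    global_model = [0.0] * n_params
    for model, n in zip(local_models, num_examples):
        weight = n / total
        for i, param in enumerate(model):
            global_model[i] += weight * param
    return global_model

# Example: two clients with equal data sizes contribute equally.
print(reduce_models([[1.0, 2.0], [3.0, 4.0]], [10, 10]))  # → [2.0, 3.0]
```

Because every client connects to this single reducer, the number of connections it must maintain grows linearly with the number of learner nodes, which is the scalability limitation the proposed API addresses.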
