In federated learning (FL), multiple clients collaboratively train a model while keeping their data local and private. However, traditional FL has two severe issues. First, its privacy promises have been broken: even when all client data is kept local, an adversary can launch various attacks to reverse-engineer the client data or infer sensitive properties of it. Second, its communication overhead is significant, especially with models of ever-growing size and a large number of participating clients. Preserving privacy while ensuring efficient communication and computation is therefore a fundamental challenge in FL.
In this internship, you will learn and explore federated learning, differential privacy, and potential systems mechanisms such as gradient compression. You will work towards creating an FL system with tunable differential privacy and efficiency guarantees that self-adapts to user needs and the underlying infrastructure constraints. Ideally, this project will lead to a publication at a top academic venue.
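To give a flavor of the mechanisms involved, below is a minimal, illustrative sketch of a client-side update that combines a DP-SGD-style privacy step (gradient clipping plus Gaussian noise) with top-k gradient compression. All names and defaults here (private_compressed_update, clip_norm, noise_mult, k_frac) are hypothetical and not part of the project description.

import numpy as np

def private_compressed_update(grad, clip_norm=1.0, noise_mult=1.0, k_frac=0.1):
    # Hypothetical sketch: clip the gradient to bound this client's
    # contribution (the standard DP-SGD clipping step).
    norm = np.linalg.norm(grad)
    grad = grad * min(1.0, clip_norm / (norm + 1e-12))
    # Add Gaussian noise calibrated to the clipping norm; noise_mult is
    # the knob that trades privacy for accuracy ("tunable" DP).
    noisy = grad + np.random.normal(0.0, noise_mult * clip_norm, size=grad.shape)
    # Top-k sparsification: transmit only the largest-magnitude
    # coordinates to cut communication cost.
    k = max(1, int(k_frac * noisy.size))
    idx = np.argpartition(np.abs(noisy), -k)[-k:]
    sparse = np.zeros_like(noisy)
    sparse[idx] = noisy[idx]
    return sparse

Tuning clip_norm, noise_mult, and k_frac against accuracy and bandwidth targets is exactly the kind of privacy/efficiency trade-off the project would make self-adaptive.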
Qualifications
Students enrolled in a PhD program in Computer Science/Engineering. Strong programming skills in Python and ML systems. Experience in designing, implementing, and evaluating distributed systems is a big plus. A strong publication record is a big plus. Language skill: English.
Duration: flexible, to be agreed (typically 3-4 months); starting time flexible
Location: Stuttgart (Germany)
Tasks
Learn and explore federated learning, differential privacy, and potential systems mechanisms like gradient compression. Create an FL system with tunable differential privacy and efficiency guarantees. With the Nokia Bell Labs researchers, prepare a publication at a top academic venue.