The course starts by introducing you to the main concepts of Neural Networks (NNs) and how they work. Then we will implement an NN from scratch using PyTorch. After that comes a quick introduction to the Federated Learning architecture. Next, we will load the dataset onto the devices in IID, non-IID, and non-IID-and-unbalanced settings, followed by a quick PySyft tutorial showing you how to send and receive models and datasets between the clients and the server.
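To give a flavor of what the data-loading step involves, here is a minimal sketch of splitting a dataset into IID versus label-skewed non-IID client shards. The helper names, shard counts, and sizes below are illustrative assumptions, not the course's actual code.

```python
# Sketch: IID vs. non-IID partitioning of sample indices across clients.
import numpy as np

def iid_partition(num_samples, num_clients, seed=0):
    """Shuffle all sample indices and split them evenly across clients."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(num_samples)
    return np.array_split(idx, num_clients)

def non_iid_partition(labels, num_clients, shards_per_client=2, seed=0):
    """Sort indices by label and give each client a few label-skewed shards."""
    rng = np.random.default_rng(seed)
    sorted_idx = np.argsort(labels)
    shards = np.array_split(sorted_idx, num_clients * shards_per_client)
    order = rng.permutation(len(shards))
    return [np.concatenate([shards[s] for s in order[c::num_clients]])
            for c in range(num_clients)]

# Toy example: 60,000 samples with 10 classes split across 100 clients.
labels = np.random.randint(0, 10, size=60_000)
iid_clients = iid_partition(len(labels), num_clients=100)
non_iid_clients = non_iid_partition(labels, num_clients=100)
print(len(iid_clients[0]), len(non_iid_clients[0]))
```

An unbalanced non-IID split would additionally vary the number of shards (or samples) per client instead of keeping them equal.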
This course will teach you Federated Learning (FL) by studying the techniques and algorithms from the original papers and then implementing them line by line. In particular, we will implement FedAvg, FedSGD, FedProx, and FedDANE. You will learn about Differential Privacy (DP) and how to add it to FL, and then we will implement FedAvg with DP. You will also learn how to run FL techniques both locally and on the cloud; for the cloud setting, we will use Google Cloud Platform to create and configure all the instances used in our experiments. By the end of this course, you will be able to implement different FL techniques, build your own optimizer and technique, and run your experiments locally and on the cloud.
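As a preview of the kind of implementation work the course walks through, here is a minimal sketch of the FedAvg aggregation step, assuming client updates arrive as PyTorch state_dicts together with their local dataset sizes. The function and variable names are illustrative assumptions, not the course's actual implementation.

```python
# Sketch: server-side FedAvg aggregation as a dataset-size-weighted average.
import torch
import torch.nn as nn

def fedavg_aggregate(client_states, client_sizes):
    """Average client parameters, weighting each client by its dataset size."""
    total = sum(client_sizes)
    global_state = {}
    for key in client_states[0]:
        global_state[key] = sum(
            state[key].float() * (n / total)
            for state, n in zip(client_states, client_sizes)
        )
    return global_state

# Toy usage: three clients sharing the same model architecture.
global_model = nn.Linear(4, 2)
clients = [nn.Linear(4, 2) for _ in range(3)]
states = [c.state_dict() for c in clients]
sizes = [100, 250, 650]  # local dataset sizes
global_model.load_state_dict(fedavg_aggregate(states, sizes))
```

FedProx and FedDANE modify the clients' local objectives rather than this aggregation step, so the same server-side loop can often be reused across these algorithms.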