Sharing knowledge without sharing data
In this section, we will discuss the basic concepts of federated learning. In traditional distributed DNN training, each user/node has global access to the whole training dataset. In federated learning, by contrast, each user/node can only see its own local data. More specifically, federated learning enables distributed and collaborative model training without sharing the raw input data.
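To make the idea concrete, the following is a minimal, illustrative sketch of a federated training round in the style of federated averaging, using plain NumPy and a simple linear model. The names local_update and federated_round are hypothetical and do not refer to any specific library; the key point is that only model weights leave each client, never the client's data:

```python
import numpy as np

def local_update(global_weights, local_X, local_y, lr=0.1, epochs=5):
    """Each client trains on its own private data; only weights are returned."""
    w = global_weights.copy()
    for _ in range(epochs):
        # Gradient of the mean squared error loss on the local data
        grad = 2 * local_X.T @ (local_X @ w - local_y) / len(local_y)
        w -= lr * grad
    return w

def federated_round(global_weights, client_datasets):
    """The server aggregates client weights, weighted by local dataset size."""
    updates, sizes = [], []
    for X, y in client_datasets:
        updates.append(local_update(global_weights, X, y))
        sizes.append(len(y))
    return np.average(updates, axis=0, weights=np.array(sizes, dtype=float))

# Example: three clients, each keeping its raw data on-device.
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
clients = []
for _ in range(3):
    X = rng.normal(size=(50, 2))
    y = X @ true_w + 0.1 * rng.normal(size=50)
    clients.append((X, y))

w = np.zeros(2)
for _ in range(20):
    w = federated_round(w, clients)
print(w)  # approaches true_w without any client ever sharing its raw data
```

In this sketch, the server never touches the clients' datasets; it only receives and averages their locally updated weights.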
We will first recap traditional data parallel training, and then discuss the main differences between it and federated learning.
Recapping the traditional data parallel model training paradigm
Let's first look at a simple example of traditional data parallel training using the parameter server architecture, as shown in the following figure:
As shown in the preceding figure, in a normal data parallel training setup, every worker can access the whole training dataset. In each iteration, a worker pulls the latest model weights from the parameter server, computes gradients on its local batch, and pushes those gradients back to the parameter server, which aggregates them and updates the global model.
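The following is a minimal sketch of this pattern, assuming synchronous SGD over NumPy arrays and a linear model; ParameterServer and worker_gradient are illustrative names rather than the API of any particular framework. Note that the full dataset is centrally available and merely partitioned across workers, which is exactly what federated learning avoids:

```python
import numpy as np

class ParameterServer:
    """Holds the global model weights and applies aggregated gradients."""
    def __init__(self, dim, lr=0.1):
        self.weights = np.zeros(dim)
        self.lr = lr

    def pull(self):
        # Workers fetch the latest global weights
        return self.weights.copy()

    def push(self, gradients):
        # Average the gradients from all workers, then take one SGD step
        self.weights -= self.lr * np.mean(gradients, axis=0)

def worker_gradient(weights, X_shard, y_shard):
    """Each worker computes the gradient on its shard of the shared dataset."""
    return 2 * X_shard.T @ (X_shard @ weights - y_shard) / len(y_shard)

# Example: one centrally available dataset split across three workers.
rng = np.random.default_rng(0)
X = rng.normal(size=(150, 2))
y = X @ np.array([2.0, -1.0]) + 0.1 * rng.normal(size=150)
shards = list(zip(np.array_split(X, 3), np.array_split(y, 3)))

server = ParameterServer(dim=2)
for _ in range(100):
    w = server.pull()
    grads = [worker_gradient(w, Xs, ys) for Xs, ys in shards]
    server.push(grads)
print(server.weights)
```

The design choice to ship raw gradients computed on a centrally partitioned dataset is what distinguishes this paradigm from federated learning, where the data partitions are fixed by ownership and never leave the clients.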