We are going to use the Jensen-Shannon divergence (JSD) and the Kolmogorov-Smirnov (KS) two-sample test to compare real samples with samples generated by GANs. For the KS two-sample test, we will use the implementation available in scipy.stats as ks_2samp.
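As a brief illustration of how that function is called, here is a minimal sketch; the two sample arrays are hypothetical placeholders drawn from NumPy's normal distribution and stand in for real and generated samples:

import numpy as np
from scipy.stats import ks_2samp

# Placeholder samples; in practice these would be real data and GAN outputs.
real_samples = np.random.normal(loc=0.0, scale=1.0, size=1000)
generated_samples = np.random.normal(loc=0.1, scale=1.1, size=1000)

# ks_2samp returns the KS statistic and the p-value for the null hypothesis
# that both sets of samples were drawn from the same distribution.
result = ks_2samp(real_samples, generated_samples)
print("KS statistic:", result.statistic)
print("p-value:", result.pvalue)

A small statistic (and a large p-value) indicates that the two empirical distributions are close.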
Metrics
Jensen-Shannon divergence
As we described in Chapter 2, Introduction to Generative Models, the JSD is a symmetric and smoothed version of the Kullback-Leibler divergence:
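\mathrm{JSD}(P \parallel Q) = \frac{1}{2} D_{\mathrm{KL}}(P \parallel M) + \frac{1}{2} D_{\mathrm{KL}}(Q \parallel M), \qquad M = \frac{1}{2}(P + Q)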
The implementation in Python is straightforward. First, we normalize each distribution by dividing it by its norm so that the two are compared at the same scale. After normalizing the distributions, we compute the KL divergence from P to M and from Q to M, where M is the average of the two distributions.
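A minimal sketch of that computation, assuming the inputs are one-dimensional histograms (or probability vectors) and taking the norm to be the L1 norm (the sum of the entries); scipy.stats.entropy with two arguments computes the KL divergence used here:

import numpy as np
from scipy.stats import entropy

def jsd(p, q):
    """Jensen-Shannon divergence between two distributions given as histograms."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    # Normalize each distribution so that both sum to 1 (same scale).
    p = p / p.sum()
    q = q / q.sum()
    # M is the average of the two normalized distributions.
    m = 0.5 * (p + q)
    # JSD is the mean of the KL divergences from P to M and from Q to M.
    return 0.5 * entropy(p, m) + 0.5 * entropy(q, m)

With this sketch, jsd(real_hist, generated_hist) returns 0 when the two histograms are identical and grows as they diverge.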