We consider distance functions between conditional distributions. We focus on the Wasserstein metric and its Gaussian case, known as the Fréchet Inception Distance (FID). We develop conditional versions of these metrics, analyze their relationships, and provide a closed-form solution for the conditional FID (CFID) metric. We numerically compare the metrics in the context of performance evaluation of modern conditional generative models. Our results show the advantages of CFID over the classical FID and mean squared error (MSE) measures. In contrast to FID, CFID identifies failures in which the generated outputs are realistic but unrelated to their inputs. In contrast to MSE, CFID identifies failures in which only a single realistic output is generated even though a diverse set of equally probable outputs exists.
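To make the underlying quantities concrete, below is a minimal sketch, assuming Gaussian moments estimated from feature embeddings, of the classical (unconditional) FID, i.e. the squared 2-Wasserstein distance between two Gaussians. It is not the paper's implementation of CFID, which instead compares conditional means and covariances; the function name `gaussian_fid` and the placeholder feature arrays are illustrative assumptions.

```python
import numpy as np
from scipy import linalg

def gaussian_fid(mu1, sigma1, mu2, sigma2):
    """Squared 2-Wasserstein distance between N(mu1, sigma1) and N(mu2, sigma2),
    i.e. the classical FID formula: ||mu1 - mu2||^2 + Tr(S1 + S2 - 2 (S1 S2)^{1/2})."""
    diff = mu1 - mu2
    # Matrix square root of the product of the covariances.
    covmean, _ = linalg.sqrtm(sigma1 @ sigma2, disp=False)
    # Discard small imaginary components caused by numerical error.
    if np.iscomplexobj(covmean):
        covmean = covmean.real
    return diff @ diff + np.trace(sigma1 + sigma2 - 2.0 * covmean)

# Usage sketch: moments estimated from two batches of feature vectors
# (e.g., Inception embeddings of real and generated images).
real_feats = np.random.randn(1000, 64)   # placeholder features
fake_feats = np.random.randn(1000, 64)   # placeholder features
mu_r, cov_r = real_feats.mean(0), np.cov(real_feats, rowvar=False)
mu_f, cov_f = fake_feats.mean(0), np.cov(fake_feats, rowvar=False)
print(gaussian_fid(mu_r, cov_r, mu_f, cov_f))
```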