Unpaired style transfer using CycleGAN
Paired style transfer is a powerful setup with a number of use cases, some of which we discussed in the previous section. It enables cross-domain translation given paired source and target domain datasets. The pix2pix setup also showcased the ability of GANs to learn the required loss function on their own, without the need to specify it manually.
Although it is a huge improvement over hand-crafted loss functions and earlier works, paired style transfer is limited by the availability of paired datasets. It requires the input and output images to be structurally the same even though their domains differ (aerial to map, labels to scene, and so on). In this section, we will focus on an improved style transfer architecture called CycleGAN.
CycleGAN improves upon the paired style transfer architecture by relaxing the constraints on input and output images. CycleGAN explores the unpaired style transfer setup, in which the model learns to translate between two domains from two independent collections of images rather than from matched input-output pairs.
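The key trick that makes unpaired training possible is cycle consistency: an image translated to the other domain and then translated back should reconstruct the original. The snippet below is a minimal TensorFlow sketch of that loss, not the exact implementation we build later; generator_g and generator_f are placeholder Keras models assumed to map domain X to Y and Y to X, respectively, and the weight of 10.0 follows the commonly used lambda value from the CycleGAN paper.

import tensorflow as tf

def cycle_consistency_loss(real_images, reconstructed_images, weight=10.0):
    # L1 reconstruction error between the original and the round-tripped image
    return weight * tf.reduce_mean(tf.abs(real_images - reconstructed_images))

def cycle_losses(generator_g, generator_f, real_x, real_y, weight=10.0):
    # Forward cycle: X -> Y -> X
    fake_y = generator_g(real_x, training=True)
    cycled_x = generator_f(fake_y, training=True)
    # Backward cycle: Y -> X -> Y
    fake_x = generator_f(real_y, training=True)
    cycled_y = generator_g(fake_x, training=True)
    forward_cycle = cycle_consistency_loss(real_x, cycled_x, weight)
    backward_cycle = cycle_consistency_loss(real_y, cycled_y, weight)
    return forward_cycle + backward_cycle

This term is added to the usual adversarial losses of the two generator-discriminator pairs, and it is what keeps the translations anchored to their inputs even though no paired examples are available.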