This video explains how a self-attention layer is integrated into a Generative Adversarial Network (GAN). This mechanism powers many current state-of-the-art models, both in GANs and in other Deep Learning applications such as Neural Machine Translation.
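As a rough illustration of the idea, here is a minimal NumPy sketch of a SAGAN-style self-attention layer over a flattened feature map. The weight matrices `Wf`, `Wg`, `Wh` stand in for the paper's 1x1 convolutions (query/key/value); the variable names and shapes are illustrative assumptions, not the authors' code.

```python
import numpy as np

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(x, Wf, Wg, Wh, gamma=0.0):
    """SAGAN-style self-attention (sketch).
    x: (C, N) feature map flattened over N = H*W spatial positions.
    Wf, Wg: (C', C) query/key projections; Wh: (C, C) value projection.
    """
    f = Wf @ x                       # (C', N) queries
    g = Wg @ x                       # (C', N) keys
    h = Wh @ x                       # (C, N) values
    attn = softmax(f.T @ g, axis=0)  # (N, N): column j attends over all positions i
    o = h @ attn                     # (C, N) attention-weighted output
    return gamma * o + x             # residual; the paper initializes gamma at 0

rng = np.random.default_rng(0)
C, N = 8, 16                         # e.g. an 8-channel 4x4 feature map
x = rng.standard_normal((C, N))
Wf = rng.standard_normal((C // 2, C))
Wg = rng.standard_normal((C // 2, C))
Wh = rng.standard_normal((C, C))
y = self_attention(x, Wf, Wg, Wh, gamma=0.0)
print(np.allclose(y, x))  # True: with gamma=0 the layer starts as the identity
```

The learnable scale `gamma` starting at zero lets the network first rely on local convolutional cues and gradually add long-range dependencies, as described in the paper.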
Paper Link: https://arxiv.org/pdf/1805.08318.pdf
Thanks for watching! Please Subscribe!