Section outline

  • We have already used RNNs to generate text: this is an example of a generative model, specifically an autoregressive generative model.

    In this Part we will introduce generative models, and in particular the following families:

    • Autoregressive models: RNNs for generation (see the sketch after this list)

    • Latent variable models: Variational Autoencoders (VAEs)

    • Generative Adversarial Networks (GANs)
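
    As a reminder of the autoregressive idea behind the first family, here is a minimal sketch of sampling text one token at a time from a character-level RNN language model in PyTorch: each step models p(x_t | x_{<t}) and feeds the sampled token back in as the next input. All names and sizes here are illustrative assumptions, not taken from the course material.

    ```python
    import torch
    import torch.nn as nn

    # Minimal character-level RNN language model (illustrative sizes).
    class CharRNN(nn.Module):
        def __init__(self, vocab_size, hidden_size=128):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, hidden_size)
            self.rnn = nn.GRU(hidden_size, hidden_size, batch_first=True)
            self.out = nn.Linear(hidden_size, vocab_size)

        def forward(self, x, h=None):
            y, h = self.rnn(self.embed(x), h)  # run the recurrence
            return self.out(y), h              # logits over the next token

    @torch.no_grad()
    def sample(model, start_token, length):
        """Autoregressive sampling: each new token is drawn from
        p(x_t | x_{<t}) and fed back as the next input."""
        x = torch.tensor([[start_token]])
        h, generated = None, []
        for _ in range(length):
            logits, h = model(x, h)
            probs = logits[:, -1].softmax(dim=-1)
            x = torch.multinomial(probs, 1)    # draw the next token
            generated.append(x.item())
        return generated

    model = CharRNN(vocab_size=50)             # untrained, so output is random
    print(sample(model, start_token=0, length=20))
    ```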

    In the second half of this Part we will then shift our attention to the Transformer, a fully attentional model, because of its importance in modern applications.

    We will explain its modules in detail, with a full review of the original paper ("Attention Is All You Need", Vaswani et al., 2017), and we will implement it in PyTorch as a guided exercise, making it work on a simple problem.
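
    As a small preview of that exercise, here is a minimal sketch of the Transformer's core module, scaled dot-product attention, Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V as defined in the original paper; the shapes and names below are illustrative assumptions.

    ```python
    import math
    import torch

    def scaled_dot_product_attention(q, k, v, mask=None):
        """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V,
        the core operation of the Transformer."""
        d_k = q.size(-1)
        scores = q @ k.transpose(-2, -1) / math.sqrt(d_k)  # query-key similarities
        if mask is not None:
            # hide illegal positions (e.g. future tokens in a decoder)
            scores = scores.masked_fill(mask == 0, float("-inf"))
        weights = scores.softmax(dim=-1)                   # attention distribution over keys
        return weights @ v                                 # weighted sum of the values

    # Illustrative shapes: batch of 2, sequence length 5, model dimension 16.
    q = k = v = torch.randn(2, 5, 16)
    print(scaled_dot_product_attention(q, k, v).shape)     # torch.Size([2, 5, 16])
    ```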

    This will prepare us for the final guest lecture, in which modern applications of these models to sequence-to-sequence problems will be illustrated.