Section outline

  • Part I (roughly the first month) is devoted to the foundations of Deep Learning.

    We will start from the basics, covering mainly:


    • Fully connected architectures

    • The mechanics of training

    • Regularization

    • Optimization


    We will also develop practical skills that will allow you to deepen your knowledge of fully connected (FC) networks and, furthermore, to:


    • Monitor learning dynamics: accuracy, loss, parameters and their gradients

    • Create custom layers and loss functions

    • Modify learning rules:

      • as an effect of introducing regularization

      • by explicitly constraining the dynamics to particular regions of the parameter space, e.g. low-dimensional hyperplanes

      • by masking a chosen subset of parameters

    • Extract representations from hidden layers for further study

    • Study basic aspects of representations, including their PCA decomposition and linear decoding of latent features (see the sketch after this list)
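
    As an illustration of the last two points, here is a minimal sketch, assuming PyTorch and scikit-learn (neither is prescribed by this outline), of extracting a hidden-layer representation with a forward hook and inspecting its principal components. Layer names and sizes are purely illustrative; hooks are convenient because they leave the model definition untouched.

      import torch
      import torch.nn as nn
      from sklearn.decomposition import PCA

      # A small fully connected network; sizes are illustrative only.
      model = nn.Sequential(
          nn.Linear(784, 256), nn.ReLU(),
          nn.Linear(256, 64), nn.ReLU(),
          nn.Linear(64, 10),
      )

      activations = []

      def save_activation(module, inputs, output):
          # Keep a detached copy of the hidden representation for offline analysis.
          activations.append(output.detach())

      # Hook the second Linear layer (index 2 in the Sequential above).
      handle = model[2].register_forward_hook(save_activation)

      x = torch.randn(512, 784)                 # stand-in for a batch of inputs
      with torch.no_grad():
          model(x)
      handle.remove()

      hidden = torch.cat(activations).numpy()   # shape (512, 64)
      pca = PCA(n_components=10).fit(hidden)
      print(pca.explained_variance_ratio_)      # variance captured by the leading components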



    We will conclude the first part with a guided exercise on network pruning by masking.
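
    As a preview of that exercise, here is a minimal sketch, again assuming PyTorch (not prescribed by this outline), of magnitude-based pruning by masking: small weights are zeroed out and kept at zero by masking their gradients during subsequent updates.

      import torch
      import torch.nn as nn

      layer = nn.Linear(256, 64)

      # Build a binary mask that keeps roughly the largest half of the weights (by magnitude).
      with torch.no_grad():
          threshold = layer.weight.abs().median()
          mask = (layer.weight.abs() >= threshold).float()
          layer.weight.mul_(mask)                          # zero out the pruned weights

      # Re-apply the mask to the gradient so pruned weights stay at zero when training resumes.
      layer.weight.register_hook(lambda grad: grad * mask)

      # One illustrative training step on random data (plain SGD keeps masked weights at zero).
      opt = torch.optim.SGD(layer.parameters(), lr=0.1)
      x, y = torch.randn(32, 256), torch.randn(32, 64)
      loss = nn.functional.mse_loss(layer(x), y)
      loss.backward()
      opt.step()
      print(layer.weight[mask == 0].abs().max())           # pruned entries remain zero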


    Please refer to MS Teams for the recordings of the lectures.