Homework 5
Note: the notebook for the lecture has been extended with more notes and a more verbose implementation of the pruning masks for additional clarity. Please pull the repo to get these updates.
Note 2: due to the limited time during yesterday's lab, I will repeat the theoretical background behind pruning on Tuesday, April 13, after the student presentation. As a consequence, the due date for this homework is one week from that Tuesday (April 20 at midnight). Moreover, if you have questions about yesterday's lab, please prepare them for next Tuesday.
Starting from the implementation contained in the notebook `05-pruning.ipynb`, extend the `magnitude_pruning` function to allow for incremental (iterative) pruning. In the current implementation, if you try to prune a second time, you will notice that it does not work: there is no way to tell future calls of `magnitude_pruning` to ignore the parameters that have already been pruned. Find a way to enhance the routine so that it can effectively prune networks in a sequential fashion (i.e., if we pass in an MLP already pruned of 20% of its parameters, we want to prune *another* 20% of parameters).
Hint: make use of the mask.
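
Below is a minimal sketch of one way the mask can be threaded through repeated calls, assuming a PyTorch model. The function name `magnitude_pruning_iterative`, its signature, and the choice to prune only weight matrices are illustrative assumptions and may differ from the `magnitude_pruning` routine in the notebook; the key idea is that the masks returned by one call are passed back in on the next call, so that already-pruned weights are ignored.

```python
import torch

@torch.no_grad()
def magnitude_pruning_iterative(model, fraction, masks=None):
    """Prune `fraction` of the *surviving* weights in each weight matrix.

    `masks` maps parameter names to binary tensors (1 = kept, 0 = pruned).
    Passing the masks returned by a previous call lets later calls ignore
    weights that were already pruned, enabling sequential pruning.
    (Hypothetical sketch: name and signature are not from the notebook.)
    """
    if masks is None:
        # First call: nothing has been pruned yet, so every weight survives.
        masks = {name: torch.ones_like(p)
                 for name, p in model.named_parameters() if p.dim() > 1}

    for name, p in model.named_parameters():
        if name not in masks:
            continue  # skip biases (and anything else we chose not to prune)
        mask = masks[name]
        alive = p[mask.bool()].abs()          # magnitudes of surviving weights only
        k = int(fraction * alive.numel())     # number of survivors to remove now
        if k == 0:
            continue
        threshold = alive.kthvalue(k).values  # k-th smallest surviving magnitude
        # Keep a weight only if it was alive before AND is above the threshold.
        new_mask = mask * (p.abs() > threshold)
        p.mul_(new_mask)                      # zero out the newly pruned weights
        masks[name] = new_mask

    return masks
```

Calling this once with `fraction=0.2` removes 20% of the weights; calling it again with the returned `masks` removes 20% of the *remaining* weights instead of re-selecting the already-zeroed entries. If you instead want each call to remove another 20% of the *original* parameter count, compute `k` from `mask.numel()` rather than `alive.numel()`.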