Lecture Schedule & Material
Below is a summary of the lecture topics and links to the lecture slides. I will try to make all slides available before each lecture begins. We might vary the order of the lecture topics (this is more likely for the later lectures). The topics of Lectures 1-5 are fairly set, though.
I am not recording the lectures. However, the lectures I recorded in 2021 are available on this page: Videos of DD2424 lectures 2021
Lecture 1
Title: The Deep Learning Revolution
Date & Time: Wednesday, March 22, 10:00-12:00
Location: Alfvénsalen
Topics covered:
- Review of the impact deep networks have had in the application fields of speech, computer vision and NLP.
- Review of the course's syllabus.
- Review of the course's assignments, project and assessment.
Slides: Lecture1.pdf, Lecture1_CourseAdmin.pdf
Video of the lecture from 2021: DD2424 2021 videos available on this page
Interesting extra reading material:
State of AI Report 2022 by Nathan Benaich and Ian Hogarth, available at https://www.stateof.ai/. A document giving an overview of the current state of AI. I mention the document in the lecture.
I got a question about whether DL has been used to help make scientific discoveries. I mumbled something about a news story about it being used by some mathematicians. Here is a link to the story: Machine Learning Becomes a Mathematical Collaborator
Lecture 2
Title: Learning Linear Binary & Linear Multi-class Classifiers from Labelled Training Data
(mini-batch gradient descent optimization applied to "Loss + Regularization" cost functions)
Date & Time: Thursday, March 23, 13:00-15:00
Location: Alfvénsalen
Topics covered:
- Supervised learning = minimizing loss + regularization.
- Learning linear classifiers.
- Binary SVM classifiers as an unconstrained optimization problem.
- Gradient Descent, SGD, mini-batch optimization.
- Multi-class classification with one layer networks.
- Different loss-functions.
Slides: Lecture2.pdf
Video of the lecture from 2021: DD2424 2021 videos available on this page
Suggested Reading Material: Sections 5.1.4, 5.2, 5.2.2, 5.7.2 from "Deep Learning" by Goodfellow, Bengio and Courville. Link to Chapter 5 of Deep Learning
Sections 8.1.3, 8.3.1 from the book give a more detailed description and analysis of mini-batch gradient descent and SGD than given in the lecture notes. Link to Chapter 8 of Deep Learning
The suggested readings from chapter 5 should be familiar to those who have already taken courses in ML. Lecture 2 should be more-or-less self-contained. But the reading material should flesh out some of the concepts referred to in the lecture.
During the lecture, generalization, over-parameterized models and double descent came up. Here are some slides I made for another lecture showing the phenomenon: double_descent_slides.pdf
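To make "loss + regularization" trained with mini-batch gradient descent concrete, here is a small NumPy sketch of a multi-class SVM (hinge) loss on toy data. It is illustrative only - the data, hyper-parameters and the absence of a bias term are my choices here, not the assignment setup:

```python
import numpy as np

rng = np.random.default_rng(0)

def svm_loss_grad(W, X, y, lam):
    """Mean multi-class hinge loss + L2 penalty, and its gradient w.r.t. W (C x d)."""
    n = X.shape[0]
    scores = X @ W.T                                   # (n, C) class scores
    correct = scores[np.arange(n), y][:, None]
    margins = np.maximum(0.0, scores - correct + 1.0)
    margins[np.arange(n), y] = 0.0                     # no loss from the true class
    loss = margins.sum() / n + lam * np.sum(W * W)
    mask = (margins > 0).astype(float)                 # which margins are active
    mask[np.arange(n), y] = -mask.sum(axis=1)          # pull the true class's score up
    grad = mask.T @ X / n + 2 * lam * W
    return loss, grad

def train(X, y, n_classes, n_epochs=50, batch=16, eta=0.05, lam=1e-3):
    """Vanilla mini-batch gradient descent over shuffled epochs."""
    W = 0.01 * rng.standard_normal((n_classes, X.shape[1]))
    for _ in range(n_epochs):
        order = rng.permutation(len(X))
        for i in range(0, len(X), batch):
            idx = order[i:i + batch]
            _, g = svm_loss_grad(W, X[idx], y[idx], lam)
            W -= eta * g
    return W

# Toy data: two well-separated Gaussian blobs.
X = np.vstack([rng.normal(-2, 1, (50, 2)), rng.normal(2, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
W = train(X, y, n_classes=2)
accuracy = np.mean((X @ W.T).argmax(axis=1) == y)
```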
Lecture 3
Title: Back Propagation
Date & Time: Friday, March 24, 10:00-12:00
Location: Alfvénsalen
Topics covered:
- Chain rule of differentiation.
- Chain rule for vector inputs and outputs.
- Computational graph.
- Back propagation (in more detail than you probably ever expected!)
Slides: Lecture3.pdf
Suggested Reading Material:
- A nice review of the back-prop algorithm written by Boaz Barak: Yet another backpropagation tutorial
- Overview of PyTorch Autograd Engine - Link to a blog post describing how PyTorch does automatic gradient calculations under the hood, i.e. how it performs the Jacobian and gradient matrix multiplications we described in the lecture in a way that is efficient w.r.t. memory and computation.
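The chain-rule bookkeeping behind back propagation can be illustrated on a tiny two-parameter network, checked against numerical differentiation (the network and numbers below are made up for illustration):

```python
import numpy as np

def forward(x, w1, w2):
    """Tiny 2-layer scalar network: y = w2 * tanh(w1 * x)."""
    h = np.tanh(w1 * x)
    return w2 * h, h

def backward(x, w1, w2, h):
    """Chain rule applied by hand, matching the forward pass above."""
    dy_dw2 = h                       # y = w2 * h
    dy_dh = w2
    dh_dw1 = (1 - h ** 2) * x        # d tanh(u)/du = 1 - tanh(u)^2
    return dy_dh * dh_dw1, dy_dw2

x, w1, w2 = 0.7, 0.3, -1.2
y, h = forward(x, w1, w2)
g1, g2 = backward(x, w1, w2, h)

# Numerical check with central differences.
eps = 1e-6
n1 = (forward(x, w1 + eps, w2)[0] - forward(x, w1 - eps, w2)[0]) / (2 * eps)
n2 = (forward(x, w1, w2 + eps)[0] - forward(x, w1, w2 - eps)[0]) / (2 * eps)
```

The same analytic-vs-numerical comparison is the standard sanity check when you implement back-prop by hand.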
Lecture 4
Title: k-layer Neural Networks
Date & Time: Wednesday, March 29, 13:00-15:00
Location: Alfvénsalen
Topics covered:
- k-layer Neural Networks.
- Activation functions.
- Backprop for k-layer neural networks.
- Problem of vanishing and exploding gradients.
- Importance of careful initialization of network's weight parameters.
- Batch normalization + backprop with batch normalization.
Slides: Lecture4.pdf
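As a concrete reference for the batch normalization forward pass listed above, here is a minimal NumPy sketch (training-time statistics only; the learnable gamma/beta and the running averages used at test time are kept trivial or omitted):

```python
import numpy as np

def batch_norm_forward(X, gamma, beta, eps=1e-5):
    """Batch normalization at training time: normalize each feature over the
    mini-batch, then scale and shift with learnable gamma, beta.
    X: (n, d) mini-batch of pre-activations."""
    mu = X.mean(axis=0)
    var = X.var(axis=0)
    X_hat = (X - mu) / np.sqrt(var + eps)   # zero mean, unit variance per feature
    return gamma * X_hat + beta

rng = np.random.default_rng(1)
X = rng.normal(5.0, 3.0, (64, 10))          # badly scaled activations
out = batch_norm_forward(X, gamma=np.ones(10), beta=np.zeros(10))
```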
Lecture 5
Title: Training & Regularization of Neural Networks
Date & Time: Thursday, March 30, 13:00-15:00
Location: Alfvénsalen
Topics covered:
- Variations of gradient descent.
- Learning rate schedulers.
- Regularization - Dropout
- Practicalities of training neural networks and hyper-parameter optimization.
Slides: Lecture5.pdf
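As one example of a learning rate scheduler, here is a sketch of a triangular cyclical schedule (the eta_min, eta_max and step_size values are illustrative, not prescribed settings):

```python
def cyclical_lr(t, eta_min=1e-5, eta_max=1e-1, step_size=500):
    """Triangular cyclical learning rate: eta rises linearly from eta_min to
    eta_max over step_size updates, then falls back to eta_min, and repeats."""
    cycle_pos = t % (2 * step_size)
    if cycle_pos < step_size:
        frac = cycle_pos / step_size          # rising half of the cycle
    else:
        frac = 2 - cycle_pos / step_size      # falling half of the cycle
    return eta_min + frac * (eta_max - eta_min)
```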
Lecture 6
Title: All about Convolutional Networks
Date & Time: Tuesday, April 4, 13:00-15:00
Location: Alfvénsalen
Topics covered:
- Details of the convolution layer in Convolutional Networks.
- Gradient computations for a convolutional layer.
- Common operations in ConvNets - max-pooling, etc.
Slides: Lecture6.pdf
Interesting extra material:
- Improving neural networks by preventing co-adaptation of feature detectors - In the lecture I got a question about co-adaptation. This link could help explain the issue.
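The convolution and max-pooling operations from the lecture can be sketched in a few lines of NumPy (a naive loop implementation, written for clarity rather than speed):

```python
import numpy as np

def conv2d(X, K):
    """Valid 2D cross-correlation (what DL frameworks call 'convolution').
    X: (H, W) input, K: (kh, kw) filter."""
    kh, kw = K.shape
    H, W = X.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(X[i:i + kh, j:j + kw] * K)
    return out

def max_pool(X, size=2):
    """Non-overlapping max pooling with a size x size window."""
    H, W = X.shape
    return X[:H - H % size, :W - W % size] \
        .reshape(H // size, size, W // size, size).max(axis=(1, 3))

X = np.arange(16.0).reshape(4, 4)
edge = np.array([[1.0, -1.0]])        # horizontal edge filter
fmap = conv2d(X, edge)                # (4, 3) feature map
pooled = max_pool(X)                  # (2, 2) pooled output
```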
Lecture 7
Title: Even more about ConvNets
Date & Time: Wednesday, April 5, 13:00-15:00
Location: Alfvénsalen
Topics covered:
- Review of modern top-performing deep ConvNets - AlexNet, VGGNet, GoogLeNet, ResNet.
- Practicalities of training deep neural networks - data augmentation, transfer learning and stacking convolutional filters.
Slides: Lecture7.pdf
Lecture 8
Title: ConvNets beyond Image Classification
Date & Time: Wednesday, Apr 19, 10:00-12:00
Location: Alfvénsalen
Part 1 - Will be a normal lecture finishing up material from Lecture 7 and then introducing semantic segmentation and Transposed Convolutions etc.
Part 2 (last ~20 minutes of the lecture) - I will answer questions about the group project. It would be great if you could add your questions to the Google Document for questions about the group project before the lecture.
Topics covered:
- Part 1 - The transposed convolution operation, bottleneck ConvNets, semantic segmentation.
Slides: Lecture8.pdf
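The transposed convolution operation from Part 1 can be illustrated in 1D: each input value "stamps" a scaled copy of the kernel into the output, stride positions apart, with overlaps summed (the stride and kernel below are illustrative):

```python
import numpy as np

def transposed_conv1d(x, k, stride=2):
    """1D transposed convolution: upsamples a length-n signal to length
    (n - 1) * stride + len(k) by stamping scaled kernels into the output."""
    n, m = len(x), len(k)
    out = np.zeros((n - 1) * stride + m)
    for i, v in enumerate(x):
        out[i * stride:i * stride + m] += v * k   # overlaps are summed
    return out

x = np.array([1.0, 2.0, 3.0])
k = np.array([1.0, 1.0])
up = transposed_conv1d(x, k)      # length (3 - 1) * 2 + 2 = 6
```

With this kernel and stride, each input value is simply repeated twice - the nearest-neighbour upsampling often used at the start of a decoder.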
Lecture 9
Title: Networks for Sequential Data: RNNs & LSTMs
Date & Time: Thursday, April 20, 13:00-15:00
Location: Alfvénsalen
Topics covered:
- RNNs
- Back-prop for RNNs
- RNNs for synthesis problems.
- Problem of exploding and vanishing gradients in RNNs.
- LSTMs
Slides: Lecture9.pdf
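The vanilla RNN forward pass listed above fits in a few lines of NumPy; the shapes and weight scales here are illustrative:

```python
import numpy as np

def rnn_forward(X, h0, Wxh, Whh, b):
    """Vanilla RNN unrolled over time: h_t = tanh(Wxh x_t + Whh h_{t-1} + b).
    X: (T, d_in) input sequence, h0: (d_h,) initial hidden state.
    Returns the stacked hidden states, one per time step."""
    h = h0
    H = []
    for x_t in X:
        h = np.tanh(Wxh @ x_t + Whh @ h + b)   # same weights reused at every step
        H.append(h)
    return np.stack(H)

rng = np.random.default_rng(2)
T, d_in, d_h = 5, 3, 4
X = rng.standard_normal((T, d_in))
H = rnn_forward(X, np.zeros(d_h),
                0.1 * rng.standard_normal((d_h, d_in)),
                0.1 * rng.standard_normal((d_h, d_h)),
                np.zeros(d_h))
```

Note the weight sharing across time steps - this is exactly what makes the repeated multiplication by Whh in back-prop-through-time prone to exploding or vanishing gradients.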
Lecture 10
Title: Translation, Attention, Self-Attention
Date & Time: Wednesday, April 26, 10:00-12:00
Location: Alfvénsalen
Topics covered:
- Neural Machine Translation
- Sequence to Sequence
- Seq2Seq with attention
- Self-attention
Slides: Lecture10.pdf
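Scaled dot-product attention, the core computation behind both Seq2Seq-with-attention and self-attention, can be sketched as (single head, no masking):

```python
import numpy as np

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)     # shift for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d)) V.
    Each output row is a weighted average of the rows of V, with weights
    given by how well the query matches each key."""
    d = Q.shape[-1]
    A = softmax(Q @ K.T / np.sqrt(d))         # (n_q, n_k) attention weights
    return A @ V, A

rng = np.random.default_rng(3)
Q = rng.standard_normal((2, 8))               # 2 queries
K = rng.standard_normal((5, 8))               # 5 keys
V = rng.standard_normal((5, 8))               # 5 values
out, A = attention(Q, K, V)
```

In self-attention, Q, K and V are all linear projections of the same input sequence.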
Lecture 11
Title: Transformer Networks
Date & Time: Thursday, April 27, 13:00-15:00
Location: Alfvénsalen
Topics covered:
- Self-attention review
- Transformer encoder-decoder
- Pre-training language models for NLP
- Transformers for computer vision
Slides: Lecture11.pdf
Lecture 12
Title: Self-supervised learning
Date & Time: Tuesday, May 2, 10:00-12:00
Location: Alfvénsalen
Topics covered:
- Motivation for self-supervised learning
- Pre-text tasks
- Contrastive Representation Learning
- SimCLR & MoCo
Slides: Lecture12.pdf
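A pared-down sketch of a contrastive (InfoNCE-style) loss in the spirit of SimCLR. Note this is a simplification: SimCLR's NT-Xent loss also uses in-batch negatives from both views, which this version omits:

```python
import numpy as np

def info_nce(z1, z2, tau=0.5):
    """Simplified contrastive loss for n positive pairs. z1[i] and z2[i] are
    embeddings of two augmented views of the same image; the other rows of z2
    act as negatives. Minimizing it pulls positives together on the unit
    sphere and pushes negatives apart."""
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)   # cosine similarity
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    sim = z1 @ z2.T / tau                                 # (n, n) similarities
    sim = sim - sim.max(axis=1, keepdims=True)            # stable log-softmax
    log_p = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_p))   # cross-entropy with positives on the diagonal

rng = np.random.default_rng(4)
n, d = 8, 16
z = rng.standard_normal((n, d))
# Two views of the same data give a low loss; unrelated embeddings a high one.
loss_aligned = info_nce(z, z + 0.01 * rng.standard_normal((n, d)))
loss_random = info_nce(z, rng.standard_normal((n, d)))
```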
Lecture 13
Title: How to generate realistic images using deep learning?
Date & Time: Wednesday, May 3, 13:00-15:00
Location: Alfvénsalen
Topics covered:
- High-level review of image generation
- High-level review of image generation via diffusion
- Adversarial examples
Slides: Lecture13.pdf