Course plan
Activity 0: Asynchronous learning
- Activity 0a
- Optional: Quiz: Background and a bit
Activity 1: Lecture 1 (prerecorded only, due to illness)
- Reading: block1.pdf: pp. 1-3
- Contents: Course intro, Low rank approximation applications (image processing, machine learning). Matrix factorizations. QR-factorization.
- Classroom recording: https://kth.instructure.com/courses/31947/discussion_topics/223245
Activity 2: Asynchronous learning
- Reading: block1.pdf: §1.2
- Activity 2a QR-factorization from the Gram-Schmidt orthogonalization
- Activity 2b: Solving linear least squares via QR-factorization
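Activities 2a and 2b can be summarized in a short NumPy sketch (illustrative only; the function name `gram_schmidt_qr` is mine, not from the course notes): classical Gram-Schmidt builds A = QR column by column, and the least squares problem min ||Ax - b|| then reduces to the triangular system Rx = Qᵀb.

```python
import numpy as np

def gram_schmidt_qr(A):
    """QR factorization of a full-column-rank A via classical Gram-Schmidt."""
    m, n = A.shape
    Q = np.zeros((m, n))
    R = np.zeros((n, n))
    for j in range(n):
        v = A[:, j].astype(float)
        for i in range(j):
            R[i, j] = Q[:, i] @ A[:, j]   # projection coefficient
            v -= R[i, j] * Q[:, i]        # remove component along q_i
        R[j, j] = np.linalg.norm(v)
        Q[:, j] = v / R[j, j]
    return Q, R

# Least squares via QR: min ||Ax - b||  =>  solve R x = Q^T b
rng = np.random.default_rng(0)
A = rng.standard_normal((6, 3))
b = rng.standard_normal(6)
Q, R = gram_schmidt_qr(A)
x = np.linalg.solve(R, Q.T @ b)
```

In floating point, classical Gram-Schmidt can lose orthogonality; modified Gram-Schmidt or Householder QR is preferred in practice.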
Activity 3: Lecture
- Reading: block1.pdf: §1.2-§1.3
- Contents: QR-factorization variants / examples, Index vectors, CPQR, Low-rank approximation via partial QR
- (Cancelled: Activity 3a or Activity 3a - alternative task)
- Zoom recording: https://canvas.kth.se/courses/31947/discussion_topics/224444
Activity 4: Asynchronous learning
- Reading: block1.pdf: §1.4-§1.4.3
- Activity 4a: SVD intro and properties
- Activity 4b: Eckart-Young theorem
- Optional: An introduction to the singular value decomposition by Steve Brunton
- Optional: The data matrix and SVD by Steve Brunton
- Optional: SVD and the Netflix challenge by Steve Brunton
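The Eckart-Young statement of Activity 4b can be made concrete in a few lines of NumPy (illustrative sketch; `best_rank_k` is my own name): truncating the SVD after k terms gives the best rank-k approximation, and the spectral-norm error is exactly the (k+1)-st singular value.

```python
import numpy as np

def best_rank_k(A, k):
    """Truncated SVD: by the Eckart-Young theorem, the best rank-k
    approximation of A in both the spectral and the Frobenius norm."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return (U[:, :k] * s[:k]) @ Vt[:k, :]

rng = np.random.default_rng(1)
A = rng.standard_normal((8, 5))
Ak = best_rank_k(A, 2)
sigma = np.linalg.svd(A, compute_uv=False)
err = np.linalg.norm(A - Ak, 2)   # spectral-norm error = sigma_{k+1}
```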
Activity 5: Lecture 2
- Reading: block1.pdf: §1.3.4,§1.5,§1.6
- Contents: SVD by partial QR. Numerical rank. Randomized SVD (part 1).
- Google forms quizzes: https://forms.gle/eRRBfsj97J44dukX8
- Zoom recording: https://canvas.kth.se/courses/31947/discussion_topics/225406
- Optional: An introduction to random SVD by Steve Brunton
Activity 6: Asynchronous learning
- Reading: block1.pdf: §1.6, §1.3.2
- Activity 6a: Index vectors
- Activity 6b: Randomized SVD by PGM
- Optional: Watch more of the talk Randomized SVD in a research talk by PGM; most important part: 01:59-19:50.
- Optional: Enthusiastic talk by Madeleine Udell about the worst data set ever
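The randomized SVD of Activities 5-6 follows the range-sampling idea popularized by Halko, Martinsson, and Tropp; this basic sketch (parameter names and defaults are mine) sketches the range of A with a Gaussian test matrix, orthonormalizes, and takes the SVD of the small projected matrix.

```python
import numpy as np

def randomized_svd(A, k, p=5, seed=0):
    """Basic randomized SVD: sample the range of A with a Gaussian test
    matrix (k + p columns, p = oversampling), orthonormalize, then take
    the SVD of the small projected matrix."""
    rng = np.random.default_rng(seed)
    n = A.shape[1]
    Omega = rng.standard_normal((n, k + p))   # random test matrix
    Q, _ = np.linalg.qr(A @ Omega)            # orthonormal basis for the sampled range
    B = Q.T @ A                               # small (k+p) x n matrix
    Uhat, s, Vt = np.linalg.svd(B, full_matrices=False)
    return (Q @ Uhat)[:, :k], s[:k], Vt[:k, :]
```

On a matrix of exact rank k, the sketch recovers the matrix (up to rounding) as soon as k + p is at least the rank.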
Activity 7: Lecture 3
- Reading: block1.pdf: §1.6
- Contents: Randomized SVD (part 2), Applications.
- Zoom recording: https://canvas.kth.se/courses/31947/discussion_topics/226126
- Recommended exercises: AL20: 1-13, 1-41, 1-69, AL19: 1-13
Activity 8: Asynchronous learning
- Reading: block1.pdf: §1.7.1
- Activity 8a: ID introduction
Activity 9: Lecture 4
- Reading: block1.pdf: §1.7
- Contents: Interpolatory decomposition, CUR
- Recommended exercises: AL20: 1-45, 1-55, 1-54, AL19: 1-16
- Zoom recording: https://canvas.kth.se/courses/31947/discussion_topics/227043
- Optional: CUR introduction (video + quiz)
- Optional: A talk by PGM on the ID (mostly focused on PDE applications)
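One common way to compute a column interpolatory decomposition, in line with the pivoted-QR theme of this block (an illustrative sketch; `column_id` and the plain column-pivoted Gram-Schmidt inside it are mine, not an optimized rank-revealing QR): pick k skeleton columns J so that A ≈ A[:, J] Z, with Z containing an identity block.

```python
import numpy as np

def column_id(A, k):
    """Rank-k column ID, A ~= A[:, J] @ Z, via column-pivoted Gram-Schmidt."""
    m, n = A.shape
    W = A.astype(float).copy()
    piv = np.arange(n)
    Q = np.zeros((m, k))
    R = np.zeros((k, n))
    for j in range(k):
        # pivot: bring the remaining column of largest residual norm to slot j
        p = j + np.argmax(np.linalg.norm(W[:, j:], axis=0))
        W[:, [j, p]] = W[:, [p, j]]
        R[:, [j, p]] = R[:, [p, j]]
        piv[[j, p]] = piv[[p, j]]
        R[j, j] = np.linalg.norm(W[:, j])
        Q[:, j] = W[:, j] / R[j, j]
        R[j, j + 1:] = Q[:, j] @ W[:, j + 1:]
        W[:, j + 1:] -= np.outer(Q[:, j], R[j, j + 1:])
    J = piv[:k]
    T = np.linalg.solve(R[:, :k], R[:, k:])   # R11^{-1} R12
    Z = np.zeros((k, n))
    Z[:, piv[:k]] = np.eye(k)                 # identity on the skeleton columns
    Z[:, piv[k:]] = T
    return J, Z
```

For a matrix of exact rank k the reconstruction A[:, J] @ Z is exact up to rounding.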
Activity 10: Nothing. (Recommend: Work on homework.)
Activity 11: Lecture 5
- Reading: block2.pdf: §2-§2.2
- Contents: Intro to clustering and K-means. Image processing applications. Plastic collection clustering application. Similarity graphs.
- Deadline for homework 1
- Zoom classroom recording: https://canvas.kth.se/courses/31947/discussion_topics/228426
- Recommended exercises: AL20: 2-2, 2-55, 2-6, AL19: 2-1, 2-18
- Optional: Alternative explanation of K-means (YouTube)
- Optional: K-means example with indicator vector notation.
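The K-means material of this lecture can be sketched as a bare-bones Lloyd iteration (illustrative only; initialization is just a random subset of the data, and there is no guard against empty clusters):

```python
import numpy as np

def kmeans(X, k, iters=100, seed=0):
    """Plain Lloyd iteration: assign each point to the nearest centroid,
    then recompute centroids, until the centroids stop moving."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # pairwise distances, shape (n_points, k)
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        new_centers = np.array([X[labels == j].mean(axis=0) for j in range(k)])
        if np.allclose(new_centers, centers):
            break
        centers = new_centers
    return labels, centers

# Two well-separated Gaussian blobs
rng = np.random.default_rng(3)
X = np.vstack([rng.standard_normal((20, 2)),
               rng.standard_normal((20, 2)) + 10.0])
labels, centers = kmeans(X, 2)
```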
Activity 12: Asynchronous learning
- Reading: block2.pdf: §2.4.1
- Activity 12c: Algebraic and geometric multiplicity (for spectral clustering)
Activity 13: Lecture 6
- Reading: block2.pdf: §2.3
- Contents: Intro to graphs and data on graphs. Connectedness. Subgraph connectedness. Unnormalized Laplacian. Eigenvectors of the Laplacian and indicator vectors.
- Classroom recording: https://canvas.kth.se/courses/31947/discussion_topics/229246
- Optional: Lecture 32 in an AI course at Stanford covering graph theory basics
- Recommended exercises: AL20 2-14, 2-21, 2-19, 2-27, 2-28, AL19: 2-2
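The link between Laplacian eigenvectors and indicator vectors from this lecture can be checked numerically (small sketch; the example graph is mine): for an undirected graph, the multiplicity of the eigenvalue 0 of L = D - W equals the number of connected components, and the indicator vector of each component lies in the nullspace of L.

```python
import numpy as np

# Graph with two connected components: {0, 1, 2} and {3, 4}
W = np.zeros((5, 5))
for i, j in [(0, 1), (1, 2), (0, 2), (3, 4)]:
    W[i, j] = W[j, i] = 1.0
D = np.diag(W.sum(axis=1))
L = D - W                                   # unnormalized Laplacian

evals = np.linalg.eigvalsh(L)
n_components = int(np.sum(evals < 1e-10))   # multiplicity of eigenvalue 0

# Indicator vector of component {0, 1, 2} is in the nullspace of L
ind = np.array([1.0, 1.0, 1.0, 0.0, 0.0])
residual = np.linalg.norm(L @ ind)
```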
Activity 14: Asynchronous learning
- Reading: block2.pdf: §2.4.2 (Rayleigh-Ritz theorem); for algebraic/geometric multiplicity, see the Wikipedia page.
- Activity 14a: Rayleigh-Ritz theorem
- Activity 14c: Mincut
- Recommended exercises: AL22: 2-3
Activity 15: Lecture 7
- Reading: block2.pdf: §2.3
- Contents: Unnormalized Laplacian spectral clustering. K-means in spectral clustering. RatioCUT interpretation of spectral clustering
- Recommended exercises: AL19: 2-19, 2-8, AL20: 2-7, 2-19, 2-21, 2-42, 2-66, AL22: 2-37
- Classroom recording: https://canvas.kth.se/courses/31947/discussion_topics/229881
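A minimal unnormalized spectral clustering run in the spirit of this lecture (illustrative sketch; for k = 2 the k-means step on the embedding is replaced here by a simple sign split of the Fiedler vector):

```python
import numpy as np

# Two triangles {0,1,2} and {3,4,5} joined by a single weak edge (2,3)
W = np.zeros((6, 6))
for i, j in [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]:
    W[i, j] = W[j, i] = 1.0
L = np.diag(W.sum(axis=1)) - W        # unnormalized Laplacian

evals, V = np.linalg.eigh(L)          # eigenvalues in ascending order
fiedler = V[:, 1]                     # eigenvector of the 2nd-smallest eigenvalue
labels = (fiedler > 0).astype(int)    # sign split stands in for k-means when k = 2
```

The sign pattern of the Fiedler vector separates the two triangles, which is exactly the RatioCut-motivated relaxation discussed in the lecture.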
Activity 16: Asynchronous learning
- Reading: block2.pdf: §2.5
- Contents: Eigenvalue continuity / differentiability
- Activity 16a: Eigenvalue derivatives
- Recommended exercises: AL21 2-25
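The eigenvalue-derivative formula behind Activity 16a can be sanity-checked numerically (sketch; the matrices are chosen arbitrarily by me): for a symmetric family A(t) with a simple eigenvalue lambda(t) and unit eigenvector v(t), the derivative is lambda'(t) = v(t)ᵀ A'(t) v(t).

```python
import numpy as np

# Symmetric family A(t) = A0 + t*E, so A'(t) = E
A0 = np.diag([1.0, 2.0, 5.0])
E = np.array([[0.0, 1.0, 0.0],
              [1.0, 0.0, 1.0],
              [0.0, 1.0, 0.0]])
t, h = 0.3, 1e-6

def lam_max(t):
    return np.linalg.eigvalsh(A0 + t * E)[-1]   # largest eigenvalue

_, V = np.linalg.eigh(A0 + t * E)
v = V[:, -1]                     # unit eigenvector of the largest eigenvalue
deriv = v @ E @ v                # analytic derivative lambda'(t)
fd = (lam_max(t + h) - lam_max(t - h)) / (2 * h)   # finite-difference check
```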
Activity 17: Lecture 8
- Reading: block2.pdf: §2.4-§2.5 + UvL
- Contents: RatioCut (k>2). Perturbation viewpoint. Normalized Laplacians. NCut.
- Zoom recording: https://canvas.kth.se/courses/31947/discussion_topics/230446
- Recommended exercises: AL19:2-5, 2-20, 2-17, AL20: 2-27, 2-45, AL22: 2-45
Activity 18: Asynchronous learning
- Reading: block2.pdf: §2.4.2 (Rayleigh-Ritz theorem)
- Activity 18a: Proof of the Rayleigh-Ritz theorem
Activity 19: Lecture 9
- Reading: block3.pdf: §3.1
- Contents: Introduction to applications in signal processing. FFT, Cooley-Tukey.
- Recommended exercises: AL20: 3-2, 3-3, 3-7, 3-11, AL21: 3-1, 3-2, 3-8
- Zoom recording: https://canvas.kth.se/courses/31947/discussion_topics/231239
- Deadline for homework 2
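The Cooley-Tukey idea from this lecture in compact form (illustrative sketch; `fft_radix2` is my own name): split into even- and odd-indexed subproblems and combine with twiddle factors, giving O(n log n) instead of O(n²).

```python
import numpy as np

def fft_radix2(x):
    """Recursive radix-2 Cooley-Tukey FFT; len(x) must be a power of two."""
    x = np.asarray(x, dtype=complex)
    n = len(x)
    if n == 1:
        return x
    even = fft_radix2(x[0::2])      # DFT of even-indexed samples
    odd = fft_radix2(x[1::2])       # DFT of odd-indexed samples
    tw = np.exp(-2j * np.pi * np.arange(n // 2) / n) * odd  # twiddle factors
    return np.concatenate([even + tw, even - tw])
```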
Activity 20: Asynchronous learning
- Reading: block3.pdf: §3.2
- Activity 20a: How do Toeplitz matrices arise from convolution?
- Optional: A YouTube summary of Fast Algorithms for Convolutional Neural Networks by Andrew Lavin and Scott Gray, giving an example of how the FFT (Winograd version) is used in deep learning, in particular convolutional neural networks.
Activity 21: Lecture 10
- Reading: block3.pdf: §3.4
- Contents: Toeplitz-like matrices. Action and inverse action for circulant matrices via diagonalization and FFT
- Recommended exercises: AL19: 3-6, AL20: 3-21, 3-15, AL21: 3-19, 3-30
- Zoom recording: https://canvas.kth.se/courses/31947/discussion_topics/231567
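The circulant part of this lecture in one sketch (illustrative function names): circulant matrices are diagonalized by the DFT, C = F* diag(Fc) F with c the first column, so both the action Cx and the inverse action C⁻¹b cost O(n log n) via the FFT.

```python
import numpy as np

def circulant_matvec(c, x):
    """y = C x for the circulant matrix with first column c: circular
    convolution, computed in O(n log n) via the FFT."""
    return np.fft.ifft(np.fft.fft(c) * np.fft.fft(x))

def circulant_solve(c, b):
    """Solve C x = b by dividing in the Fourier domain
    (assumes no eigenvalue in fft(c) is zero)."""
    return np.fft.ifft(np.fft.fft(b) / np.fft.fft(c))
```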
Activity 22: Asynchronous learning
- Reading: block3.pdf: last part of §3.4
- Activity 22a: Extending a Toeplitz matrix to a circulant matrix
- Activity 22b: Intro to Levinson, Durbin, Trench
- Recommended exercises: AL19: 3-9, 3-10, AL20: 3-25
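The construction of Activity 22a in code (illustrative sketch; `toeplitz_matvec` is my own name): any n x n Toeplitz matrix embeds into a 2n x 2n circulant matrix, so a Toeplitz matrix-vector product also costs O(n log n).

```python
import numpy as np

def toeplitz_matvec(c, r, x):
    """y = T x for the n x n Toeplitz matrix with first column c and first
    row r (c[0] == r[0]), by embedding T into a 2n x 2n circulant matrix
    and multiplying a zero-padded x via the FFT."""
    n = len(c)
    # First column of the circulant embedding: [c_0..c_{n-1}, 0, r_{n-1}..r_1]
    col = np.concatenate([c, [0.0], r[:0:-1]])
    xpad = np.concatenate([x, np.zeros(n)])
    y = np.fft.ifft(np.fft.fft(col) * np.fft.fft(xpad))
    return y[:n]   # the first n entries give T x
```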
Activity 23: Lecture 11
- Reading: block3.pdf: §3.3
- Contents: Durbin + Levinson-Durbin (continued)
- Recommended exercises: AL19: 3-5, 3-11, AL20: 3-9, AL21: 3-14, 3-17
- Zoom recording: https://canvas.kth.se/courses/31947/discussion_topics/232663
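Durbin's recursion for the Yule-Walker system, as covered in this lecture, fits in a few lines (a sketch following the Golub and Van Loan formulation; variable names are mine):

```python
import numpy as np

def durbin(r):
    """Durbin's recursion for the Yule-Walker system T y = -r, where T is
    the n x n SPD Toeplitz matrix with first row [1, r[0], ..., r[n-2]].
    Costs O(n^2) flops instead of O(n^3) for a general solver."""
    n = len(r)
    y = np.zeros(n)
    y[0] = -r[0]
    alpha, beta = -r[0], 1.0
    for k in range(n - 1):
        beta *= 1.0 - alpha**2                           # update the scaling
        alpha = -(r[k + 1] + r[k::-1] @ y[:k + 1]) / beta  # new reflection coeff.
        y[:k + 1] += alpha * y[k::-1]                    # update previous solution
        y[k + 1] = alpha
    return y
```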
Activity 24: Asynchronous learning
- Reading: block3.pdf: §3.5
- Activity 24a: HSS matrices (and some software)
- Recommended exercises: AL22: 3-14a
Activity 25: Lecture 12
- Reading: block3.pdf: §3.3 + §3.5
- Contents: Trench + intro to semiseparable matrices
- Recommended exercises: AL20: 3-22, AL22: 3-22
- Video: https://canvas.kth.se/courses/31947/discussion_topics/232828
Activity 26: Asynchronous learning
- None.
Activity 27: Lecture 13
- Reading: block3.pdf: §3.5
- Contents: Semiseparable + HODLR matrix algorithms / reserve lecture
- Classroom recording: https://canvas.kth.se/courses/31947/discussion_topics/233934
- Recommended exercises: AL20: 3-56, 3-70
- Deadline for homework 3 (slightly postponed)
Activity 28: Asynchronous learning
- Summary (video): https://canvas.kth.se/courses/31947/discussion_topics/233934
Abbreviations:
t.b.a. = to be announced (will be added later)
AL22 = http://dhcp-41-104.math.kth.se/~jarl/active_learning/active_learning.php?id=6
AL21 = http://dhcp-41-104.math.kth.se/~jarl/active_learning/active_learning.php?id=2&contents=3
AL20 = Selected problems from Active learning workspace 2020
AL19 = Selected problems from Active learning workspace 2019