Course Plan
Course literature and supplementary reading
- D. Montgomery, E. Peck, G. Vining: Introduction to Linear Regression Analysis. Wiley-Interscience. Either the 6th Edition (2021), ISBN-13: 978-1-119-57872-7, 704 pages, or the 5th Edition (2012), ISBN-13: 978-1-118-62736-5, 645 pages. Acronym below: MPV (page references are to the 5th edition; MPV6th refers to the 6th edition).
The textbook MPV can be bought at THS Kårbokhandel, Drottning Kristinas väg 15-19. The book also has a solutions manual. MPV is digitally available via KTHB.
There are a number of other books that cover the topics of the course and that we will draw on during it. Here are some recommendations, all of which are freely available online:
- G. James, D. Witten, T. Hastie, R. Tibshirani: An Introduction to Statistical Learning. Springer.
- A. J. Izenman: Modern Multivariate Statistical Techniques: Regression, Classification, and Manifold Learning. Springer. Acronym below: Iz.
- T. Hastie, R. Tibshirani, J. Friedman: The Elements of Statistical Learning. Springer, 2nd Edition, 2017. Acronym below: HTF.
- T. Hastie, R. Tibshirani, M. Wainwright: Statistical Learning with Sparsity: The Lasso and Generalizations. Chapman and Hall, 2016. Acronym below: HTW.
- J. R. Faraway: Practical Regression with R (2002), with R code.
Preliminary plan of lectures and exercise sessions
- The order of lectures and exercise sessions is subject to change.
- Teaching staff: Timo Koski (TK), teaching assistants (TA), guest lecturers from If P&C Insurance (If), Mattias Villani (MV).
- Problems to be solved during the exercise sessions and recommended exercises to be solved on your own are found here.
Day | Date | Time | Hall | Topic | Lecturer |
---|---|---|---|---|---|
1. Tue | 16/1 | 15-17 | F1 | Lecture 1: Introduction to Course Work. Simple Linear Regression, Conditional Expectation. MPV parts of Ch. 2 (pp. 12-22, p. 26); MPV6th pp. 12-22 | TK |
2. Thu | 18/1 | 13-15 | E1 | Lecture 2: Centering matrix, idempotent matrices and other linear algebra, random vectors, expectation of random vectors, covariance matrix, multivariate normal distribution. | TK |
3. Fri | 19/1 | 08-10 | E1 | Exercise 1 | TA |
4. Mon | 22/1 | 08-10 | E1 | Lecture 3: Multiple Linear Regression, Part 1. Least Squares Estimate (LSE), projection geometry of LSE, MLE. MPV Ch. 3, pp. 67-83; MPV6th pp. 69-86 | TK |
5. Tue | 23/1 | 13-15 | E1 | Lecture 4: Multiple Regression, Part 2. Gauss-Markov Theorem (MPV pp. 587-588; MPV6th p. 615), prediction of new y (MPV pp. 33-34, p. 104; MPV6th p. 106), quadratic forms (MPV pp. 580-584; MPV6th pp. 608-612), Fundamental Variance Decomposition, Distribution of LSE Residuals | TK |
6. Thu | 25/1 | 10-12 | E1 | Lecture 5: Confidence Intervals for Multiple Regression, F-test for model. MPV Sections 2.3-2.4, 3.3-3.4; MPV6th Sections 2.3-2.4, 3.3-3.4 | TK |
7. Fri | 26/1 | 08-10 | E1 | Exercise 2 | TA |
8. Mon | 29/1 | 08-10 | E1 | Lecture 6: Model Diagnostics & Residual Analysis. MPV pp. 211-219, pp. 133-135, pp. 151-152; MPV6th pp. 217-225, pp. 139-144, pp. 156-157 | TK |
9. Tue | 30/1 | 13-15 | M1 | Exercise 3 | TA |
10. Thu | 1/2 | 13-15 | F2 | Lecture 7: Further Confidence Intervals using the centered model. MPV pp. 84-85, pp. 581-582; MPV6th pp. 86-87, pp. 609-611 | TK |
11. Fri | 2/2 | 08-10 | D1 | Exercise 4 | IR |
12. Wed | 7/2 | 13-15 | E1 | Lecture 8: Model Selection by F-test, Variable Selection, Model Selection by Akaike Information Criterion (AIC). MPV Ch. 10, pp. 327-328, pp. 334-337; MPV6th Ch. 10, pp. 342-343, pp. 349-352 | TK |
13. Thu | 8/2 | 13-15 | Q1 | Lecture 9: Logistic Regression. MPV Ch. 13.2; MPV6th Ch. 13.2 | TK |
14. Fri | 9/2 | 08-10 | D1 | Lecture 10: 1. Generalized inverses. 2. Bias-Variance Trade-Off, High-Dimensional Data, Lasso & Ridge Regression. MPV pp. 304-314; MPV6th pp. 312-320 | TK |
15. Mon | 12/2 | 08-10 | D1 | Lecture 11: 1. Gradient Descent (Lecture Notes on Canvas). 2. Big data: reduction of the number of predictors by Principal Component Regression. MPV pp. 313-315; MPV6th pp. 329-330 | TK |
16. Wed | 14/2 | 13-15 | E1 | Lecture 12: 1. Linear Regression and Causal Inference. 2. Bayesian Learning | TK |
17. Thu | 15/2 | 13-15 | Q1 | Lecture 13: Bayesian Regression | MV |
18. Fri | 16/2 | 08-10 | D1 | Exercise 5 | TA |
19. Mon | 19/2 | 08-10 | F1 | Lecture 14 | If |
20. Tue | 20/2 | 10-12 | Q1 | Lecture 15 | If |
21. Fri | 23/2 | 08-10 | D1 | Exercise 6 | If |
22. Mon | 26/2 | 08-10 | E1 | Exercise 7 | TA |
23. Wed | 28/2 | 13-15 | D1 | Lecture 16 | If |
24. Thu | 29/2 | 10-12 | E1 | Exercise | If |
Mon | 11/3 | 08-13 | | Exam | |
Tue | 4/6 | 08-13 | | Re-exam | |