FSF3961 VT22 Statistical Inference (61078)

SF3961 Statistical Inference, 15 credits 

Teachers: Henrik Hult, hult@kth.se; Pierre Nyquist, pierren@kth.se; Joakim Andén, janden@kth.se; and Jimmy Olsson, jimmyol@kth.se

Examiner: Henrik Hult

Course description: The purpose of this course is to cover important topics in the theory of statistics in a thorough and general fashion. The course spans classical inferential techniques, including hypothesis tests, point estimates, and confidence intervals, as well as the Bayesian paradigm, in which all unknown quantities are treated as random variables and a joint probability distribution is constructed for all of them. We also cover machine learning applications based on the theory of statistics. Fundamental concepts are presented from the classical and Bayesian viewpoints in parallel, for better comparison and understanding. Students will practice by studying applications and solving problems related to the theory.

Lectures: All meetings will take place in room 3418 in the Mathematics Department (KTH), Lindstedtsv. 25, 4th floor.

1. Introduction - Pierre

  • Key concepts: Classical and Bayesian paradigms, Exponential families, Conjugate
    priors, Location-scale families (see the conjugacy sketch below)
  • Theory meeting: March 11, 13.15-15.00, Room 3721
  • Homework 1 due: March 18
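
  A minimal sketch of conjugacy, for orientation only (not part of the lecture notes; the function name below is made up): the Beta prior is conjugate to the Binomial likelihood, so the posterior is again Beta and updating reduces to adding counts.

      # Beta-Binomial conjugacy: prior theta ~ Beta(a, b), data = k successes in n trials,
      # posterior theta | data ~ Beta(a + k, b + n - k).
      def beta_binomial_update(a, b, k, n):
          """Posterior Beta parameters after k successes in n Bernoulli trials."""
          return a + k, b + n - k

      # Example: flat Beta(1, 1) prior, 7 successes in 10 trials.
      a_post, b_post = beta_binomial_update(1.0, 1.0, k=7, n=10)
      print(a_post, b_post, a_post / (a_post + b_post))  # -> 8.0 4.0 0.666...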

2. Sufficient statistics - Pierre

  • Key concepts: Sufficient statistics, Minimal sufficient statistics, Complete
    statistics, Ancillary statistics (the factorization criterion is restated below)
  • Theory meeting: March 18, 13.15-15.00, Room 3721
  • Homework 2 due: March 25
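
  As a reminder (a standard statement, not quoted from the lecture notes), the Fisher-Neyman factorization criterion characterizes sufficiency:

      % Fisher--Neyman factorization criterion: T(X) is sufficient for \theta
      % if and only if the joint density factors as
      \[
        f_\theta(x) = g_\theta\bigl(T(x)\bigr)\, h(x),
      \]
      % where h does not depend on \theta. Example: for X_1,\dots,X_n iid N(\mu,1),
      % the statistic T(X) = \sum_{i=1}^n X_i is sufficient for \mu.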

3. Decision theory - Henrik

  • Key concepts: Bayes risk, Minimax theory, Neyman-Pearson lemma (restated below)
  • Theory meeting: March 25, 13.15-15.00, Room 3721
  • Homework 3 due: April 1
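
  For reference, the standard form of the Neyman-Pearson lemma (a reminder, not a quotation from the notes):

      % Neyman--Pearson lemma: for testing the simple hypotheses
      % H_0: \theta = \theta_0 against H_1: \theta = \theta_1, the most powerful
      % level-\alpha test rejects H_0 when the likelihood ratio is large,
      \[
        \frac{f_{\theta_1}(x)}{f_{\theta_0}(x)} > k,
      \]
      % with k chosen so that P_{\theta_0}(\text{reject } H_0) = \alpha.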

4. Decision theory in machine learning - Henrik

  • Key concepts: Decision theory for classification, regression, and clustering (see
    the classification sketch below)
  • Theory meeting: April 1, 13.15-15.00, Room 3721
  • Homework 4 due: April 8
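
  A minimal sketch of decision theory in the classification setting (illustrative only, not course code; the function name is made up): under 0-1 loss the Bayes rule selects the class with the largest posterior probability.

      import numpy as np

      def bayes_classify(posteriors):
          """Bayes rule under 0-1 loss: pick the class with maximal posterior probability.
          posteriors: array of shape (n_samples, n_classes), rows summing to 1."""
          return np.argmax(posteriors, axis=1)

      # Example: two observations, two classes.
      p = np.array([[0.3, 0.7],
                    [0.9, 0.1]])
      print(bayes_classify(p))  # -> [1 0]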

5. Point estimation - Joakim

  • Key concepts: Classical and Bayesian point estimation, UMVUE, Cramér-Rao bound
    (restated below), Rao-Blackwell theorem
  • Theory meeting: April 8, 13.15-15.00, Room 3721
  • Homework 5 due: April 22
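
  For reference, the standard form of the Cramér-Rao lower bound (a reminder under the usual regularity conditions, not a quotation from the notes):

      % Cram\'er--Rao lower bound: for an unbiased estimator \hat\theta of \theta,
      \[
        \operatorname{Var}_\theta(\hat\theta) \ge \frac{1}{I(\theta)},
        \qquad
        I(\theta) = \mathbb{E}_\theta\!\left[ \left( \frac{\partial}{\partial\theta}
          \log f_\theta(X) \right)^{\!2} \right].
      \]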

6. Hypothesis Testing - Joakim

  • Key concepts: Hypothesis testing for simple and interval hypotheses, Unbiased tests,
    Interval estimation, Location-scale pivotal quantities, Credible intervals
    (a pivotal-quantity sketch follows below)
  • Theory meeting: April 22, 13.15-15.00, Room 3721
  • Homework 6 due: May 13
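
  A minimal sketch of a location pivotal quantity (illustrative only, not course code; the function name is made up): for X_1, ..., X_n iid N(mu, sigma^2) with sigma known, Z = sqrt(n)(Xbar - mu)/sigma is N(0, 1) regardless of mu, which yields the usual confidence interval.

      import math

      def normal_mean_ci(xbar, sigma, n, z=1.96):
          """95% confidence interval for mu based on the pivot Z = sqrt(n)*(Xbar - mu)/sigma."""
          half_width = z * sigma / math.sqrt(n)
          return xbar - half_width, xbar + half_width

      # Example: sample mean 10.2, known sigma 2.0, n = 25 observations.
      print(normal_mean_ci(xbar=10.2, sigma=2.0, n=25))  # -> (9.416, 10.984)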

7. Classical-Bayesian Discussion - Pierre

  • Discussion meeting: May 6, 13.15-15.00, Room 3721
  • Homework 7 due: May 6 (note that this is the same day as the discussion meeting)

8. Computational Methods

  • Key concepts: EM algorithm, Bootstrap, MCMC (see the bootstrap sketch below)
  • Theory meeting: May 13, 13.15-15.00, Room 3721
  • Homework 8 due: May 20
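
  One small illustration of the listed methods (a sketch under simple assumptions, not course code; names are made up): the nonparametric bootstrap estimate of the standard error of a statistic, here the sample median.

      import numpy as np

      rng = np.random.default_rng(0)

      def bootstrap_se(data, statistic, n_boot=2000):
          """Resample with replacement and return the standard deviation of the statistic."""
          n = len(data)
          reps = [statistic(rng.choice(data, size=n, replace=True)) for _ in range(n_boot)]
          return np.std(reps, ddof=1)

      # Example: bootstrap standard error of the median of 100 standard normal draws.
      data = rng.normal(size=100)
      print(bootstrap_se(data, np.median))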

9. Reproducing kernel Hilbert spaces

  • Key concepts: Kernels, RKHS, Representer theorem, Mercer's theorem (see the kernel
    ridge regression sketch below)
  • Theory meeting: May 20, 13.15-15.00, Room 3721
  • Homework 9 due: June 1
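
  A small sketch connecting the listed concepts to practice (illustrative only, not course code; function names are made up): by the representer theorem the kernel ridge regression solution is a finite expansion f(x) = sum_i alpha_i k(x_i, x), with alpha obtained from a linear system.

      import numpy as np

      def gaussian_kernel(X, Z, bandwidth=1.0):
          """Gaussian (RBF) kernel matrix between the rows of X and Z."""
          sq_dists = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=-1)
          return np.exp(-sq_dists / (2.0 * bandwidth ** 2))

      def kernel_ridge_fit(X, y, lam=0.1, bandwidth=1.0):
          """Solve (K + lam * n * I) alpha = y for the expansion coefficients."""
          n = len(X)
          K = gaussian_kernel(X, X, bandwidth)
          return np.linalg.solve(K + lam * n * np.eye(n), y)

      def kernel_ridge_predict(alpha, X_train, X_new, bandwidth=1.0):
          """Evaluate f(x) = sum_i alpha_i k(x_i, x) at the rows of X_new."""
          return gaussian_kernel(X_new, X_train, bandwidth) @ alpha

      # Example: noisy sine curve; the prediction at x = 0 should be close to sin(0) = 0.
      rng = np.random.default_rng(1)
      X = rng.uniform(-3, 3, size=(50, 1))
      y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=50)
      alpha = kernel_ridge_fit(X, y)
      print(kernel_ridge_predict(alpha, X, np.array([[0.0]])))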

Prerequisites: The minimal requirement is a basic course in statistics, such as SF1901, and an advanced-level course in probability (SF2940); a graduate course in probability (SF3940) and teaching experience in statistics are recommended.

Format: The course will consist of two-week cycles with one theory lecture (45 min) the first week and one homework presentation meeting (90 min) the second week.

Literature: The course is based on lecture notes, some additional material, and the books

G. Casella and R. Berger, Statistical Inference, 2nd ed., Duxbury, 2001.

M. Schervish, Theory of Statistics, Springer, 1995.

Aim: After completing the course students are expected to

  • explain the classical and Bayesian paradigms and contrast the two
  • have a good understanding of sufficient statistics and related concepts
  • outline the foundations of statistical decision theory, both classical and Bayesian
  • relate concepts in statistical decision theory to machine learning
  • explain the notion of point estimation, the Cramér-Rao lower bound and the Rao-Blackwell theorem
  • explain the main results and applications of hypothesis testing
  • have thorough knowledge of computational methods in statistics, such as the EM algorithm, the bootstrap, and Markov chain Monte Carlo
  • explain the concept of reproducing kernel Hilbert spaces and their use in statistics
  • be able to solve problems and discuss research questions related to the theory

Examination: The examination consists of a combination of homework assignments and an oral exam.