Course information

Instructor: Jonathan Niles-Weed (jnw@cims.nyu.edu), Office hours: Monday 1:00-2:00 PM, CIWW 1111

Teaching Assistant: Shuyu Liu (sl7695@nyu.edu), Office hours: Thursday 4:45-5:45 PM, CIWW 1106

Lecture

Tuesday 12:00-1:40 PM, GCASL 361

Recitation

Thursday 3:45-4:35 PM, KIMM 803

Piazza

For announcements and questions, please sign up on Piazza.

Description

The goal of this course is to develop mathematical tools for analyzing statistical procedures.

Prerequisites

Probability, linear algebra, mathematical maturity (comfort with proofs)

Books

While there is no required textbook, a portion of the material for this class will be drawn from All of Statistics by Larry Wasserman. Lecture notes will be posted for each week’s lecture.

Though this course will be mathematical, we will gloss over several technical subtleties. A more rigorous treatment of some of the topics we cover is available in:

All three books are available for free via the links above with NYU credentials.

Lectures

We will follow the approximate schedule below.

Unit 1: Asymptotics and non-asymptotics

  • Week 1: Concentration inequalities
  • Week 2: Maximal inequalities and uniform convergence
  • Week 3: Asymptotics

Unit 2: Classical statistical tasks

  • Week 4: Statistical models and sufficiency
  • Week 5: Estimation
  • Week 6: Testing and confidence sets
  • Week 7: Midterm exam
  • Week 8: Linear and logistic regression

Unit 3: Applications and extensions

  • Week 9: Model selection and regularization
  • Week 10: Non-parametric statistics
  • Week 11: Monte Carlo methods
  • Week 12: Causal inference
  • Week 13: Bayesian statistics
  • Week 14: High-dimensional regression

Homeworks

There will be approximately 10 homework assignments over the course of the semester, drawn from the list of exercises at the end of each chapter of the lecture notes. Homeworks are due Mondays at 11:59 pm Eastern time via Gradescope (entry code: YDEZR5).

Late homeworks will not be accepted, but each student may request one 24-hour homework extension over the course of the semester, no excuse necessary. (Contact the TA.) Further requests will not be considered. The lowest homework score will be dropped.

You may work with other students; however, you must a) write your homework solutions yourself and b) list the names of the students you collaborated with. If you consult any other sources (printed or online), you must cite those in your homework as well. Any violation of these policies will be considered cheating.

  • HW 1 (due 9/11): Chapter 1, Exercises 1-5
  • HW 2 (due 9/18): Chapter 2, Exercises 1-4
  • HW 3 (due 9/25): Chapter 3, Exercises 1-6
  • HW 4 (due 10/2): Chapter 4, Exercises 1-5
  • HW 5 (due 10/16): Chapter 5, Exercises 1-4
  • HW 6 (due 11/6): Chapter 7, Exercises 1-5
  • HW 7 (due 11/13): Chapter 8, Exercises 1-4
  • HW 8 (due 11/20): Chapter 9, Exercises 1-3
  • HW 9 (due 11/27): Chapter 10, Exercises 1-3
  • HW 10 (due 12/11): Chapter 11, Exercises 1-3

Grading

40% Homework + 30% Midterm + 30% Final project
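
For concreteness, here is a minimal sketch (in Python) of how these weights combine with the drop-the-lowest homework policy described above; the function name and sample scores are illustrative only, not part of the course materials.

    def final_grade(homework_scores, midterm, project):
        """Weighted course grade: 40% homework (lowest score dropped),
        30% midterm, 30% final project. All scores on a 0-100 scale."""
        kept = sorted(homework_scores)[1:]  # drop the single lowest score
        hw_average = sum(kept) / len(kept)
        return 0.40 * hw_average + 0.30 * midterm + 0.30 * project

    # Example: ten homework scores; the weak score (60) is dropped.
    print(final_grade([90, 85, 88, 92, 60, 95, 87, 91, 84, 89], 82, 90))  # ≈ 87.2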

Exam

There will be an in-class midterm exam on October 24.

Final Project

In lieu of a final exam, this course will have a final project involving reading and summarizing a recent paper (or papers) of statistical interest. You will work in groups of at most two; your job is to a) summarize the main idea/question of the paper, connecting it to ideas we’ve discussed in this course, and b) carefully explain one part of the paper (by giving full details for one of the proofs, recreating an experiment, etc.).

Write-ups should be 5-10 pages and are due on the last day of the semester (Dec 15).

You may choose any paper you wish, but consider discussing it with me first to make sure it is of sufficient quality. The following example papers may be a good place to start:

  • “Convexity, Classification, and Risk Bounds” (Bartlett, Jordan, McAuliffe; JASA 2006)
  • “Robust Estimation of a Location Parameter” (Huber; Annals of Statistics 1964)
  • “Square-Root Lasso: Pivotal Recovery of Sparse Signals via Conic Programming” (Belloni, Chernozhukov, Wang; Biometrika 2011)
  • “Universal inference” (Wasserman, Ramdas, Balakrishnan; PNAS 2020)
  • “Controlling the False Discovery Rate: A Practical and Powerful Approach to Multiple Testing” (Benjamini, Hochberg; JRSS-B 1995)
  • “Controlling the false discovery rate via knockoffs” (Barber, Candes; Annals of Statistics 2015)
  • “Understanding Black-box Predictions via Influence Functions” (Koh, Liang; ICML 2017)
  • “Inference for Empirical Wasserstein Distances on Finite Spaces” (Sommerfeld, Munk; JRSS-B 2018)
  • “Estimation with Quadratic Loss” (James, Stein; Berkeley Symposium 1961)
  • “Reconciling modern machine-learning practice and the classical bias–variance trade-off” (Belkin et al.; PNAS 2019)