Consultation hours

Send me an email to schedule an appointment online.

Current courses
 
Topics in Learning Theory
Lecture 15:
Stochastic bandits
Notes: [pdf]
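For orientation only (not taken from the notes), here is a minimal Python/NumPy sketch of a UCB-type strategy for a K-armed stochastic bandit with Bernoulli rewards; the arm means, the exploration constant, and the exact algorithm analysed in the lecture are assumptions of this toy example and may differ from the notes.

import numpy as np

# Toy K-armed Bernoulli bandit played with a UCB-style index (illustrative sketch).
rng = np.random.default_rng(0)
means = np.array([0.3, 0.5, 0.7])       # unknown arm means (made up for the demo)
K, T = len(means), 10_000

counts = np.zeros(K)                    # number of pulls of each arm
sums = np.zeros(K)                      # cumulative reward of each arm

for t in range(1, T + 1):
    if t <= K:                          # pull every arm once to initialise
        arm = t - 1
    else:                               # pick the arm with the largest upper confidence bound
        ucb = sums / counts + np.sqrt(2 * np.log(t) / counts)
        arm = int(np.argmax(ucb))
    counts[arm] += 1
    sums[arm] += rng.binomial(1, means[arm])

pseudo_regret = T * means.max() - counts @ means
print("pseudo-regret after", T, "rounds:", pseudo_regret)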
 
Lecture 14: 
First-order methods for online convex optimization
Notes: [pdf]
Complement: The online Newton step algorithm
Notes: [pdf] 
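As a companion to this lecture (not taken from the notes), a minimal Python/NumPy sketch of projected online gradient descent on the Euclidean unit ball with linear losses; the dimension, horizon, loss sequence and step-size constant are assumptions of this toy example, and the online Newton step itself is not implemented here.

import numpy as np

# Projected online gradient descent (OGD) on the unit ball with linear losses (illustrative sketch).
rng = np.random.default_rng(0)
d, T = 5, 1000
x = np.zeros(d)
cumulative_loss, gradients = 0.0, []

for t in range(1, T + 1):
    g = rng.normal(size=d)              # loss revealed at round t: f_t(x) = <g, x>
    gradients.append(g)
    cumulative_loss += g @ x
    x = x - g / np.sqrt(t)              # gradient step with step size ~ 1/sqrt(t)
    norm = np.linalg.norm(x)
    if norm > 1.0:                      # project back onto the unit ball
        x /= norm

G = np.sum(gradients, axis=0)
best_in_hindsight = -np.linalg.norm(G)  # best fixed point of the ball for the total linear loss
print("regret:", cumulative_loss - best_in_hindsight)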
 
Lecture 13: 
The exponentially weighted average (EWA) forecaster
Notes: [pdf]
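For reference only (not taken from the notes), a minimal Python/NumPy sketch of the EWA forecaster over N experts with losses in [0, 1]; the number of experts, the horizon, the i.i.d. expert losses and the learning-rate tuning are assumptions of this toy example.

import numpy as np

# Exponentially weighted average forecaster over N experts (illustrative sketch).
rng = np.random.default_rng(0)
N, T = 10, 1000
eta = np.sqrt(8 * np.log(N) / T)        # one standard tuning for losses in [0, 1]
weights = np.ones(N)
forecaster_loss, expert_losses_cum = 0.0, np.zeros(N)

for t in range(T):
    p = weights / weights.sum()         # current mixture over the experts
    losses = rng.uniform(size=N)        # losses of the experts at round t
    forecaster_loss += p @ losses       # the forecaster suffers the mixture loss
    expert_losses_cum += losses
    weights *= np.exp(-eta * losses)    # exponential reweighting

print("regret against the best expert:", forecaster_loss - expert_losses_cum.min())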
Lecture 12: 
Introduction to online learning
Notes: [pdf]

Lecture 11:
Stochastic optimization
Notes: [pdf]
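As an illustration only (not taken from the notes), a minimal Python/NumPy sketch of stochastic gradient descent with Polyak-Ruppert averaging on a least-squares objective; the data distribution, noise level and step-size constant are assumptions of this toy example.

import numpy as np

# Stochastic gradient descent with iterate averaging on least squares (illustrative sketch).
rng = np.random.default_rng(0)
d, T = 10, 50_000
w_star = rng.normal(size=d)             # toy ground-truth parameter

w, w_avg = np.zeros(d), np.zeros(d)
for t in range(1, T + 1):
    x = rng.normal(size=d)              # one fresh sample (x, y) per step
    y = x @ w_star + 0.1 * rng.normal()
    grad = (w @ x - y) * x              # stochastic gradient of (1/2) * (<w, x> - y)^2
    w -= 0.1 / np.sqrt(t) * grad        # conservative 1/sqrt(t) step size
    w_avg += (w - w_avg) / t            # running (Polyak-Ruppert) average

print("distance of the averaged iterate to w_star:", np.linalg.norm(w_avg - w_star))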
Lectures 9 and 10:
Mirror descent
Notes: [pdf]
Complement: Pinsker's inequality
Notes: [pdf]
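As a companion to these lectures (not taken from the notes), a minimal Python/NumPy sketch of mirror descent on the probability simplex with the entropic mirror map, i.e. the exponentiated gradient update; the toy objective, its dimensions and the constant step size are assumptions of this example, and other mirror maps or step-size schedules may be used in the notes.

import numpy as np

# Mirror descent on the simplex with the entropic mirror map (exponentiated gradient), illustrative sketch.
rng = np.random.default_rng(0)
n, m, T = 20, 5, 500
A, b = rng.normal(size=(m, n)), rng.normal(size=m)

def f(x):                               # a toy smooth convex objective restricted to the simplex
    r = A @ x - b
    return 0.5 * r @ r

def grad_f(x):
    return A.T @ (A @ x - b)

x = np.full(n, 1.0 / n)                 # start from the uniform distribution
eta = 0.1                               # constant step size (assumption of the demo)
for _ in range(T):
    x = x * np.exp(-eta * grad_f(x))    # multiplicative (entropic) update
    x /= x.sum()                        # renormalise back onto the simplex

print("objective value after", T, "iterations:", f(x))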

Lecture 8: 
Gradient descent for smooth and strongly convex functions; Accelerated gradient descent.
Notes: [pdf]
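For illustration only (not taken from the notes), a minimal Python/NumPy sketch comparing plain gradient descent and Nesterov's accelerated gradient on a smooth, strongly convex quadratic; the quadratic, the number of iterations and the momentum schedule used here are assumptions of this toy example.

import numpy as np

# Plain vs. accelerated gradient descent on a strongly convex quadratic (illustrative sketch).
rng = np.random.default_rng(0)
d = 50
A = rng.normal(size=(d, d))
Q = A.T @ A / d + 0.1 * np.eye(d)       # positive definite Hessian of f(x) = 0.5 x'Qx - b'x
b = rng.normal(size=d)
L = np.linalg.eigvalsh(Q).max()         # smoothness constant

grad = lambda z: Q @ z - b
f = lambda z: 0.5 * z @ Q @ z - b @ z

x = y = np.zeros(d)                     # accelerated iterates
x_gd = np.zeros(d)                      # plain gradient descent iterate
for t in range(1, 201):
    x_new = y - grad(y) / L             # gradient step from the extrapolated point
    y = x_new + (t - 1) / (t + 2) * (x_new - x)   # Nesterov momentum
    x = x_new
    x_gd = x_gd - grad(x_gd) / L

x_star = np.linalg.solve(Q, b)
print("accelerated optimality gap:", f(x) - f(x_star))
print("plain GD optimality gap:   ", f(x_gd) - f(x_star))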
Lecture 7:
Gradient descent for convex functions.
Notes: [pdf]
Lecture 6:
Introduction to convex optimization.
Notes: [pdf]
Lecture 5:
Convex approach to binary classification.
Lecture 4:
Rademacher complexity and VC dimension.
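To make the quantity concrete (not taken from the notes), a small Python/NumPy Monte Carlo estimate of the empirical Rademacher complexity of a finite class of one-dimensional threshold classifiers; the sample, the class and the normalisation convention are assumptions of this toy example.

import numpy as np

# Monte Carlo estimate of the empirical Rademacher complexity of threshold classifiers (illustrative sketch).
rng = np.random.default_rng(0)
n = 200
X = rng.uniform(size=n)                          # toy one-dimensional sample
thresholds = np.linspace(0.0, 1.0, 21)           # finite class {x -> sign(x - s)}
preds = np.where(X[None, :] >= thresholds[:, None], 1.0, -1.0)

n_mc = 2000
total = 0.0
for _ in range(n_mc):
    sigma = rng.choice([-1.0, 1.0], size=n)      # Rademacher signs
    total += np.max(preds @ sigma) / n           # sup over the class of (1/n) * sum_i sigma_i f(X_i)

print("empirical Rademacher complexity estimate:", total / n_mc)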
Lecture 3:
Introduction to empirical risk minimization (ERM); Estimation-Approximation tradeoff; ERM with a finite class and a bounded loss; Noiseless case.
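As a toy illustration (not taken from the notes), a minimal Python/NumPy sketch of empirical risk minimisation with the 0-1 loss over a finite class of threshold classifiers; the data-generating distribution and the class are assumptions of this example.

import numpy as np

# ERM with the 0-1 loss over a finite class of threshold classifiers (illustrative sketch).
rng = np.random.default_rng(0)
n = 500
X = rng.uniform(size=n)
eta = np.clip(X, 0.1, 0.9)                       # P(Y = 1 | X = x), increasing in x (toy choice)
Y = np.where(rng.uniform(size=n) < eta, 1, -1)

thresholds = np.linspace(0.0, 1.0, 51)           # finite class {x -> sign(x - s)}
emp_risks = [np.mean(np.where(X >= s, 1, -1) != Y) for s in thresholds]
s_hat = thresholds[int(np.argmin(emp_risks))]    # empirical risk minimiser
print("ERM threshold:", s_hat, "with empirical 0-1 risk:", min(emp_risks))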
Lecture 2:
Conditional probabilities and expectation; Optimal predictors; Examples of the square and binary losses.
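For quick reference (standard statements, not copied from the notes; the notation there may differ), the optimal predictors for these two losses can be written in LaTeX as:

% Square loss: the regression function is the optimal predictor.
\[
  f^*(x) = \mathbb{E}[Y \mid X = x]
  \quad\text{minimises}\quad
  R(f) = \mathbb{E}\big[(f(X) - Y)^2\big].
\]
% Binary (0-1) loss, with Y in {0, 1} and \eta(x) = \mathbb{P}(Y = 1 \mid X = x):
\[
  f^*(x) = \mathbf{1}\{\eta(x) \ge 1/2\}
  \quad\text{minimises}\quad
  R(f) = \mathbb{P}\big(f(X) \neq Y\big).
\]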
Lecture 1:
Introduction to supervised learning; Learning sample; Loss functions, risk, and excess risk.
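For quick reference (standard definitions, not copied from the notes; the notation there may differ), risk and excess risk in LaTeX:

% Risk of a predictor f under a loss \ell, for (X, Y) drawn from the data distribution P,
% and the associated excess risk:
\[
  R(f) = \mathbb{E}_{(X, Y) \sim P}\big[\ell(f(X), Y)\big],
  \qquad
  R(f) - \inf_{g} R(g) \ \text{(excess risk of } f\text{)}.
\]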