Consultation hours
Send me an email to schedule an appointment online.
Course
Modern Methods of Decision Making
Master Program

Data Science
Meetings

Mondays, from 11:10 to 14:20, from January 11 to June 14, 2021
Format

Online

Zoom link: here
Main references for the course:

S. Shalev-Shwartz and S. Ben-David (2014). Understanding Machine Learning: From Theory to Algorithms. Cambridge University Press.

S. Bubeck (2015). Convex Optimization: Algorithms and Complexity. Foundations and Trends in Machine Learning. Vol. 8, No. 3-4.
Home assignments

Home assignment 1: here (due February 26)

Home assignment 2: coming soon
Previously:

Video recordings: here

Lecture 0: Introduction (January 11)

Lecture 1: Statistical vs. online learning (January 18)

Notes: here


Lecture 2: Tools from probability theory (January 25)

Lecture 3: Recap on linear algebra and differential calculus (February 1)

Lecture 4: Convexity (February 8)

Lecture 5: Empirical risk minimisation I (February 15)

Lecture 6: Empirical risk minimisation II (March 1)

Notes: here

No seminar

Next:

Lecture 7: Support vector machines (March 8)

Lecture 8: Boosting

Lecture 9: Gradient descent

Lecture 10: Mirror descent

Lecture 11: Stochastic optimisation

Lecture 12: Introduction to online learning

Lecture 13: Prediction with expert advice

Lecture 14: First-order methods for online convex optimisation

Lecture 15: Online Newton step algorithm

Lecture 16: Stochastic bandit algorithms

Lecture 17: Adversarial bandit algorithms

Lecture 18: Bandit convex optimisation
Course
High dimensional probability and statistics
Master Program

Statistical Learning Theory
Meetings

Wednesdays, from 13:00 to 16:00, from January 27 to March 17, 2021
Format

Online

Zoom link: here
Main references for the course:

R. Vershynin (2018). High-Dimensional Probability: An Introduction with Applications in Data Science. Cambridge University Press.

M. Wainwright (2019). High-Dimensional Statistics: A Non-Asymptotic Viewpoint. Cambridge University Press.
Home assignments

Home assignment 1: here (due February 24)
Previously:

Video recordings: here

Lecture 1: Concentration (January 27)

Notes: here


Lecture 2: Sums of independent random variables (February 3)

Notes: here


Lectures 3 & 4: Suprema (February 10 & 17)

Notes: here


Lecture 5: The Johnson-Lindenstrauss lemma (February 24)

Notes: here


Lecture 6: Covariance matrix estimation and PCA (March 3)

Notes: coming soon

Next:

Lecture 7: Concentration of random matrices (March 10)

Lecture 8: Community detection in random graphs (March 17)

Lecture 9: High-dimensional linear regression (March 24)
Course
Gradient flows in metric spaces
PhD Program
Meetings

Thursdays, from 18:10 to 21:00, from January 14 to March 18, 2021
Format

Online

Zoom link: here
References for the course

S. Daneri and G. Savaré (2014). Lecture notes on gradient flows and optimal transport. In Y. Ollivier, H. Pajot, & C. Villani (Eds.), Optimal Transport: Theory and Applications (London Mathematical Society Lecture Note Series, pp. 100-144). Cambridge: Cambridge University Press. (available on arXiv here)

L. Ambrosio, N. Gigli and G. Savaré (2005). Gradient Flows. Birkhäuser.

L. Ambrosio and N. Gigli (2009). A user's guide to optimal transport. (available here)
Previously:

Lecture 0: Introduction

Lecture 1: Review of gradient flows in Euclidean spaces

Notes: here


Lecture 2: Definitions of gradient flows in metric spaces: EDI, EDE and EVI

Notes: here


Lecture 3: Minimizing movement scheme and existence of EDI gradient flows

Notes: coming soon

Next:

Lecture 4