Winter 2021: Statistical Learning Theory

Course No: EECS 598-008
Credit Hours: 3 credits
Instructor: Clayton Scott
Prerequisites: EECS 501 and EECS 545 (advisory)

This course will cover statistical learning theory, including the following topics: concentration inequalities, consistency of learning algorithms, Vapnik-Chervonenkis theory and Rademacher complexity, reproducing kernel Hilbert spaces and kernel methods, surrogate losses, and deep learning. Unsupervised and online learning may also be covered as time permits.
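As a quick illustration of the first topic (this example is not part of the official course description), a canonical concentration inequality is Hoeffding's inequality: for independent random variables $X_1, \dots, X_n$ with $X_i \in [a_i, b_i]$,

\[
\Pr\!\left( \left| \frac{1}{n}\sum_{i=1}^{n} \bigl( X_i - \mathbb{E}[X_i] \bigr) \right| \ge t \right)
\le 2 \exp\!\left( -\frac{2 n^2 t^2}{\sum_{i=1}^{n} (b_i - a_i)^2} \right).
\]

Results of this form underpin generalization guarantees: they quantify how quickly an empirical average (such as training error) concentrates around its expectation (such as true risk).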

Students are expected to have (1) a strong background in probability at the level of EECS 501, (2) prior exposure to machine learning algorithms, as provided by a course such as EECS 545, Stat 601, or Stat 605, and (3) some experience writing formal mathematical proofs, as might be acquired in an upper-level undergraduate mathematics course.

Grading will be based on occasional homework assignments and an individual end-of-semester report on a topic of the student’s choosing. There may also be a participation component to the grade. There will be no exams.

One desired outcome for students taking this course is the ability to read research articles in the field of machine learning and appreciate the significance of the theoretical performance guarantees described in those articles. Students developing new algorithms as part of their research can also expect to learn techniques that will help them analyze those algorithms.

More info (pdf)