Introduction to Computational Learning Theory (COMP SCI 639)
This course will focus on developing the core concepts and techniques of computational learning theory. We will examine the inherent abilities and limitations of learning algorithms in well-defined learning models. Specifically, the course will focus on algorithmic problems in supervised learning. The goal of supervised learning is to infer a function from a set of labeled observations. We will study algorithms for learning Boolean functions from labeled examples in a number of models (online learning, PAC learning, SQ learning, learning with noise, etc.).
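As a taste of the algorithmic questions above, here is a minimal sketch (purely illustrative, not part of the course material) of the classical elimination algorithm for learning a monotone disjunction from labeled Boolean examples; the function names and toy dataset are our own assumptions.

```python
def learn_monotone_disjunction(examples, n):
    """Elimination algorithm: start with the disjunction of all n
    variables and delete every variable that is set to 1 in some
    negative example (such a variable cannot occur in the target)."""
    relevant = set(range(n))
    for x, label in examples:
        if label == 0:  # negative example rules out its 1-variables
            relevant -= {i for i in range(n) if x[i] == 1}
    return relevant

def predict(relevant, x):
    """The hypothesis fires iff some surviving variable is 1."""
    return int(any(x[i] for i in relevant))

# Toy data consistent with the target disjunction x0 OR x2 (n = 4).
data = [((1, 0, 0, 0), 1), ((0, 1, 0, 0), 0),
        ((0, 0, 1, 1), 1), ((0, 0, 0, 1), 0)]
h = learn_monotone_disjunction(data, 4)
```

The hypothesis never errs on a negative example, and each mistake on a positive example is impossible once all non-target variables are eliminated, which is the standard mistake-bound argument for this class.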
- Instructor: Ilias Diakonikolas
Office Hours: Tuesday/Thursday, 12-1pm.
- Teaching Assistant: Nikos Zarifis (email@example.com)
Office Hours: Monday/Friday, 11:30-12:30, CS 4384
- Lectures: Tuesday, Thursday 1:00-2:15, COMP SCI 1325.
Prerequisites: mathematical maturity and a background in undergraduate algorithms.
Here is an outline of the course material:
- Online Learning: Winnow, Best Experts, Weighted Majority
- PAC Learning, Relation to Online Learning, Occam's Razor
- VC Dimension and Sample Complexity
- Learning Decision Trees and DNFs
- Learning with Noise: Classification Noise, Malicious Noise
- Statistical Query Learning
- Distribution Dependent Learning, Fourier Transform
- Computational Hardness of Learning
- Learning with Membership and Equivalence Queries
- Other Models of Learning: Semi-supervised Learning, Active Learning
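Some of the topics above admit very short implementations. As one illustrative sketch (the expert predictions, halving factor, and tie-breaking rule below are assumptions for the example), the deterministic Weighted Majority algorithm maintains a weight per expert, predicts by weighted vote, and halves the weight of every expert that errs:

```python
def weighted_majority(expert_preds, outcomes, beta=0.5):
    """Deterministic Weighted Majority over binary predictions.
    expert_preds[t] is the tuple of expert predictions in round t;
    outcomes[t] is the true label. Returns (mistakes, final weights)."""
    n = len(expert_preds[0])
    w = [1.0] * n
    mistakes = 0
    for preds, y in zip(expert_preds, outcomes):
        vote_1 = sum(wi for wi, p in zip(w, preds) if p == 1)
        vote_0 = sum(wi for wi, p in zip(w, preds) if p == 0)
        guess = 1 if vote_1 >= vote_0 else 0  # ties broken toward 1
        if guess != y:
            mistakes += 1
        # Penalize every expert that was wrong this round.
        w = [wi * beta if p != y else wi for wi, p in zip(w, preds)]
    return mistakes, w

# Toy run with 3 experts over 3 rounds; expert 0 is always correct.
m, w = weighted_majority([(1, 0, 0), (0, 1, 1), (1, 0, 1)], [1, 0, 1])
```

The point of the analysis covered in lecture is that the algorithm's mistake count is within a constant factor (plus a log of the number of experts) of the best expert's.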
- Lecture 1 (January 21) Introduction to computational learning theory.
- Lecture 2 (January 23) Introduction to Online Learning.
- Lecture 3 (January 28) Online Learning of Disjunctions and Decision Lists.
- Lecture 4 (January 30) Winnow and Perceptron Algorithms.
- Lecture 5 (February 4) More on Winnow and Perceptron. Introduction to VC dimension.
- Lecture 6 (February 6) VC dimension and lower bound on online learning. Weighted Majority Algorithm.
- Lecture 7 (February 11) Analysis of Weighted Majority. Randomized Weighted Majority and its Analysis.
- Lecture 8 (February 13) Introduction to PAC Learning.
- Lecture 9 (February 18) PAC Learning continued. Learning Intervals
and Rectangles. Reduction from Online Learning to PAC Learning.
- Lecture 10 (February 20) Finding a Consistent Hypothesis and Occam’s Razor.
Cardinality version of Occam’s Razor.
- Lecture 11 (February 25) Greedy Set Cover Heuristic. Using cardinality version
of Occam’s razor to PAC learn sparse disjunctions with near-optimal sample complexity.
- Lecture 12 (February 27) Hypothesis Testing. Basic Concentration Inequalities.
Proper vs Non-proper PAC Learning.
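To give a flavor of the algorithms covered in Lectures 4-5, here is a minimal sketch of the classic Perceptron update rule; the toy dataset and the number of passes are illustrative assumptions, and the sketch omits the margin-based mistake bound proved in lecture.

```python
def perceptron(examples, dim, passes=10):
    """Classic Perceptron: on each labeled example (x, y) with
    y in {-1, +1}, if y * (w . x) <= 0 (a mistake), update
    w <- w + y * x. Converges when the data are linearly separable."""
    w = [0.0] * dim
    for _ in range(passes):
        updated = False
        for x, y in examples:
            if y * sum(wi * xi for wi, xi in zip(w, x)) <= 0:
                w = [wi + y * xi for wi, xi in zip(w, x)]
                updated = True
        if not updated:  # no mistakes in a full pass: done
            break
    return w

# Toy linearly separable data in 2 dimensions.
w = perceptron([((1, 2), 1), ((1, -1), -1)], 2)
```

The returned weight vector separates the toy data; the Perceptron mistake bound discussed in class controls how many updates this loop can make in terms of the margin of the data.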
Homework Assignments: There will be 4 homework assignments that will count for 60% of the grade.
The assignments will be proof-based and are intended to be challenging.
Collaboration and discussion among students is allowed, though students must write up their solutions independently.
Course Project: Part of the grade (25%) is based on an independent project
on a topic related to learning theory. Projects can be completed individually or in groups of two students.
The goal of the project is to become an expert in an area related to the class material, and potentially contribute to the state of the art.
There are two aspects to the course project. The first is a literature review: Students must decide on a topic and a list of papers,
understand these papers in depth, and write a survey of the results in their own words.
The second aspect is to identify a research problem/direction on the chosen topic, think about it,
and describe the progress they make.
Students must consult with the instructor during the first half of the course for
help in forming project teams, selecting a suitable project topic, and selecting a suitable set of research papers.
Students will be graded on the project proposal (5%), the progress report (5%), and the final report (15%).
The remaining part of the grade will be based on class participation (15%).
The textbook for this course is:
An additional textbook (available online) we will use is: