| # | Day | Topic/Slides | Reading |
| --- | --- | --- | --- |
| 1 | M 8/26 | Introduction | FCML Ch 1, ISLR Ch 1 |
| 2 | W 8/28 | The Linear Model and Least Mean Squares | |
| | M 9/2 | Labor Day (no class) | |
| 3 | W 9/4 | Higher Dimensions | |
| 4 | M 9/9 | Geometry of LLMS, Nonlinear Response | |
| 5 | W 9/11 | Cross Validation, Model Selection, Regularization | ISLR Ch 5 |
| 6 | M 9/16 | Probability and Expectation | FCML Ch 2, ISLR Ch 2 |
| 7 | W 9/18 | More Probability | |
| 8 | M 9/23 | Linear Gaussian Model | |
| 9 | W 9/25 | Maximum Likelihood | |
| 10 | M 9/30 | Properties of Linear Gaussian Model I | |
| 11 | W 10/2 | Properties of Linear Gaussian Model II | |
| 12 | M 10/7 | Introduction to Bayesian Modeling | FCML Ch 3 |
| 13 | W 10/9 | Priors and Marginal Likelihood | |
| 14 | M 10/14 | Bayesian Linear Gaussian Model | |
| 15 | W 10/16 | Marginal Likelihood Model Selection | |
| 16 | M 10/21 | Review | |
| 17 | W 10/23 | Midterm Exam | |
| 18 | M 10/28 | Logistic Regression | FCML Ch 4, ISLR Ch 4 |
| 19 | W 10/30 | Estimation I - Gradient Methods | |
| 20 | M 11/4 | Estimation II - Laplace Approximation | |
| 21 | W 11/6 | Estimation III - Sampling, Metropolis-Hastings | |
| 22 | M 11/11 | Veterans Day (no class) | |
| 23 | W 11/13 | Classification - Bayesian Classifier | FCML Ch 5 |
| 24 | M 11/18 | Classification - Nearest Neighbors, Classifier Evaluation | |
| 25 | W 11/20 | Classification - SVMs I - Maximum Margin | |
| 26 | M 11/25 | Classification - SVMs II - Kernels | |
| 27 | W 11/27 | Neural Networks I - Perceptron and Backpropagation | TBA |
| 28 | M 12/2 | Neural Networks II - Autoencoders | |
| 29 | W 12/4 | Clustering - K-means and Mixture Models | FCML Ch 6 |
| 30 | M 12/9 | Clustering - Gaussian Mixture Model and EM | |
| 31 | W 12/11 | Principal Components Analysis | FCML Ch 7 |
| | F 12/13 | Final Assignment Due (no in-class exam) | |