Course outline
See this page for help on Anaconda, Jupyter, and Python.
Part One
Linear Regression
Goals:
References
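The least-squares fit at the heart of this unit can be sketched in a few lines; the toy data (true intercept 2, slope 3), noise level, and variable names below are illustrative choices, not taken from the course notes.

```python
import numpy as np

# Fit y = w0 + w1*x by least squares via the normal equations.
# The synthetic data here is an invented example.
rng = np.random.default_rng(0)
x = np.linspace(0, 1, 50)
y = 2.0 + 3.0 * x + rng.normal(scale=0.1, size=x.shape)

X = np.column_stack([np.ones_like(x), x])  # design matrix with intercept column
w = np.linalg.solve(X.T @ X, X.T @ y)      # solves (X^T X) w = X^T y
```

With 50 points and small noise, the recovered coefficients land close to the true intercept and slope.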
Gradient Descent
Goals:
References
- Gradient Descent lab - zip
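The core idea of this unit fits in a few lines: repeatedly step against the gradient. The function being minimized, the step size, and the iteration count below are our own illustrative choices.

```python
def gradient_descent(grad, x0, lr=0.1, steps=100):
    """Minimize a function given its gradient; lr and steps are illustrative defaults."""
    x = x0
    for _ in range(steps):
        x = x - lr * grad(x)  # step against the gradient
    return x

# Minimize f(x) = (x - 3)^2, whose gradient is 2*(x - 3); the minimum is at x = 3.
x_min = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
```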
Probability
Goals:
References
- Probability Notes html pdf
- Naive Bayes Notes html pdf
- Naive Bayes Lab - includes datafiles and ipynb file. zip tgz
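A minimal sketch of the naive Bayes idea from this unit, for binary (Bernoulli) features with Laplace smoothing; the four-row dataset and function names are invented for illustration and are not from the lab.

```python
import numpy as np

# Toy binary-feature data: two examples per class, invented for illustration.
X = np.array([[1, 1, 0], [1, 0, 0], [0, 1, 1], [0, 0, 1]])
y = np.array([0, 0, 1, 1])

def fit_bernoulli_nb(X, y, alpha=1.0):
    """Estimate class priors and smoothed per-class feature probabilities."""
    classes = np.unique(y)
    priors = np.array([(y == c).mean() for c in classes])
    # Laplace-smoothed P(feature = 1 | class)
    probs = np.array([(X[y == c].sum(0) + alpha) / ((y == c).sum() + 2 * alpha)
                      for c in classes])
    return classes, np.log(priors), probs

def predict(x, classes, log_priors, probs):
    """Pick the class maximizing log prior + log likelihood of the features."""
    loglik = (np.log(probs) * x + np.log(1 - probs) * (1 - x)).sum(1)
    return classes[np.argmax(log_priors + loglik)]
```

The "naive" assumption is visible in `loglik`: feature log-probabilities are simply summed, i.e. features are treated as independent given the class.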
Logistic Regression
Goals:
- Understand the statistical model underlying logistic regression
- See how the ideas of likelihood and gradient descent combine to solve the logistic regression problem
- Do some sample computations to see logistic regression in action
- Generalize binary logistic regression to multi-class logistic regression
References
- Logistic Regression lab - zip
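The combination of likelihood and gradient descent named in the goals can be sketched directly: descend on the negative log-likelihood of the sigmoid model. The synthetic data, step size, and iteration count below are our own choices, not the lab's.

```python
import numpy as np

# Binary logistic regression fit by gradient descent on the mean
# negative log-likelihood; toy data generated from known weights.
rng = np.random.default_rng(1)
X = np.column_stack([np.ones(200), rng.normal(size=200)])  # intercept + one feature
true_w = np.array([-1.0, 2.0])
p = 1 / (1 + np.exp(-X @ true_w))
y = (rng.random(200) < p).astype(float)

w = np.zeros(2)
for _ in range(2000):
    preds = 1 / (1 + np.exp(-X @ w))     # sigmoid of the linear score
    grad = X.T @ (preds - y) / len(y)    # gradient of the mean NLL
    w -= 0.5 * grad
```

The gradient `X.T @ (preds - y) / n` is exactly the derivative of the log-likelihood of the Bernoulli model, which is what links the two ideas in the goals.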
Principal Component Analysis
Goals:
References
- Principal Component Analysis html pdf
- PCA Lab – includes notebook(s) and data zip tgz
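The central computation of this unit, eigendecomposition of the sample covariance, can be sketched on toy data; the variances (9 along one axis, 0.25 along the other) are invented so the first principal axis is obvious.

```python
import numpy as np

# PCA via eigendecomposition of the sample covariance matrix.
rng = np.random.default_rng(2)
data = rng.normal(size=(500, 2)) @ np.array([[3.0, 0.0], [0.0, 0.5]])

centered = data - data.mean(0)
cov = centered.T @ centered / (len(data) - 1)
eigvals, eigvecs = np.linalg.eigh(cov)     # eigh returns ascending eigenvalues
order = np.argsort(eigvals)[::-1]
components = eigvecs[:, order]             # principal axes, largest variance first
projected = centered @ components[:, :1]   # project onto the first component
```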
Bayesian Regression
Goals:
- Learn the process of Bayesian inference (see also the notes on probability above)
- Understand over-fitting in linear regression
- Study Bayesian linear regression
- Understand the ideas of linear discriminant analysis
References
- Bayesian Regression Lab – includes notebook(s) and data zip tgz
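A minimal sketch of the Bayesian linear regression posterior, assuming a zero-mean Gaussian prior on the weights with precision alpha and Gaussian noise with precision beta; the toy data and both precision values are our own choices.

```python
import numpy as np

# Posterior for Bayesian linear regression with prior w ~ N(0, alpha^{-1} I)
# and observation noise precision beta; data generated from known weights.
rng = np.random.default_rng(3)
x = np.linspace(-1, 1, 30)
y = 0.5 + 1.5 * x + rng.normal(scale=0.2, size=x.shape)
Phi = np.column_stack([np.ones_like(x), x])  # design matrix

alpha, beta = 1.0, 25.0                      # prior and noise precisions (assumed)
S_inv = alpha * np.eye(2) + beta * Phi.T @ Phi
S = np.linalg.inv(S_inv)                     # posterior covariance
m = beta * S @ Phi.T @ y                     # posterior mean
```

The prior term `alpha * np.eye(2)` shrinks the estimate toward zero, which is the mechanism that controls the over-fitting discussed in the goals.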
Support Vector Machines
Goals:
- Learn the ideas behind support vector machine classifiers
- Understand the relationship between convex hulls, supporting hyperplanes, and support vector machines
- Formulate the convex optimization problem yielding the optimal margin classifier
- Learn the sequential minimal optimization (SMO) algorithm
- Further ideas
References
- Notes on support vector machines html pdf
- Support Vector Machines Lab (includes jupyter notebook and data files) zip tgz
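The hinge-loss objective behind the optimal margin classifier can be illustrated with a simpler solver than SMO: subgradient descent on the regularized hinge loss (a Pegasos-style update). The toy clusters, the regularization constant, and the step-size schedule are all invented for illustration.

```python
import numpy as np

# Linear soft-margin SVM trained by subgradient descent on the
# regularized hinge loss: lam/2 ||w||^2 + mean(max(0, 1 - y * (w . x))).
rng = np.random.default_rng(4)
X = np.vstack([rng.normal(loc=[-2, -2], size=(50, 2)),
               rng.normal(loc=[2, 2], size=(50, 2))])
y = np.array([-1.0] * 50 + [1.0] * 50)

w, lam = np.zeros(2), 0.01
for t in range(1, 2001):
    lr = 1 / (lam * t)                        # decreasing step size
    viol = y * (X @ w) < 1                    # margin violators
    grad = lam * w - (y[viol, None] * X[viol]).sum(0) / len(X)
    w -= lr * grad
```

Only the margin violators contribute to the subgradient, which mirrors the fact that the optimal hyperplane is determined by the support vectors alone.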
Neural Networks
Goals:
- Learn the ideas of neural networks and understand forward-propagation and back-propagation
- Derive the back-propagation formula
- Learn the mechanisms of basic convolutional neural networks
References
- Neural Networks Lab – includes notebook(s) and data zip tgz
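Forward- and back-propagation from the goals above can be sketched on the classic XOR problem with a hand-derived backward pass; the architecture (8 tanh hidden units), learning rate, and iteration count are illustrative choices, not prescriptions from the course.

```python
import numpy as np

# A tiny two-layer network trained by hand-derived back-propagation on XOR.
rng = np.random.default_rng(5)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1, b1 = rng.normal(size=(2, 8)), np.zeros(8)
W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)
sigmoid = lambda z: 1 / (1 + np.exp(-z))

lr = 1.0
for _ in range(5000):
    # forward pass
    h = np.tanh(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # backward pass: gradients of the mean cross-entropy loss
    d_out = (out - y) / len(X)            # dL/d(pre-sigmoid output)
    d_h = (d_out @ W2.T) * (1 - h ** 2)   # back-propagate through tanh
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(0)
```

The two lines computing `d_out` and `d_h` are the whole of back-propagation here: each is the chain rule applied once per layer, which is the derivation the goals ask for.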