CAP 6610: Machine Learning
Schedule: MWF 7th Period
Location: CSE E122
Texts:
- Required: Pattern Recognition and Machine Learning, Christopher M. Bishop, Publisher: Springer, 2007.
- Additional: Statistical Learning Theory, Vladimir N. Vapnik, Publisher: John Wiley and Sons, New York, 1998.
- Other Material: Notes and papers from the research literature.
Instructor: Prof. Anand Rangarajan, CSE E352, email: anand@cise.ufl.edu
Teaching Assistant: Hang Guan, CSE E309, email: hguan@cise.ufl.edu, Phone: 352 226 3435
Office hours:
- Anand: MWF 8th period or by appointment.
- Hang: T 8th period and R 8th and 9th periods in CSE E309.
Grading:
- Homeworks: 20%.
- Two Midterms: 20% each.
- Project: 40%.
Notes:
- Prerequisites: Familiarity with basic concepts in calculus, linear algebra, and probability theory. A partial list of basic requirements follows. Calculus: differentiation, the chain rule, integration. Linear algebra: vector spaces, the inverse, the pseudo-inverse. Probability theory: conditional probability, Bayes rule, conditional expectations. A basic understanding of Hilbert spaces (from Math for Intelligent Systems - COT5615 or equivalent) is also expected. While AI is listed as a prerequisite, if any aspect of AI turns out to be required, it will be taught in class in order to make the course self-contained.
- Homeworks/programs will be assigned bi-weekly. If you do not have any prior numerical computing experience, I suggest you use MATLAB for the programs.
- The first midterm will probably be held on Wednesday, February 26th, 2014 (from 8:20-10:10PM) and the second on Monday, April 21st, 2014 (from 8:20-10:10PM). Each midterm will be one hour and fifty minutes long.
- The project is due at the end of the semester.
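As a quick self-test of the probability-theory prerequisite listed above, the following sketch (in Python rather than MATLAB; the numbers are illustrative, not course material) applies Bayes rule, P(A|B) = P(B|A) P(A) / P(B), with P(B) obtained from the law of total probability:

```python
def bayes_posterior(p_b_given_a, p_a, p_b):
    """Posterior P(A|B) via Bayes rule."""
    return p_b_given_a * p_a / p_b

# Hypothetical diagnostic-test example:
p_a = 0.01    # P(disease): prevalence
sens = 0.9    # P(positive | disease): sensitivity
fpr = 0.05    # P(positive | no disease): false-positive rate

# Total probability of a positive test:
p_b = sens * p_a + fpr * (1 - p_a)

posterior = bayes_posterior(sens, p_a, p_b)
print(round(posterior, 4))  # → 0.1538
```

If working through a computation like this feels unfamiliar, reviewing the probability prerequisites before the course begins is advisable.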
Syllabus:
- Introduction to supervised and unsupervised learning.
- Fisher discriminants, linear regression and classification.
- Introduction to convex optimization.
- Kernel methods and support vector machines (SVMs).
- Regression methods and sparse approximations.
- Convexity, maximum likelihood principle.
- Mixture models and Expectation-Maximization (EM) methods in clustering, K-means.
- Component analysis (PCA and ICA).
- Introduction to manifold learning (if time permits).