CAP 6610: Machine Learning
Schedule: T 7th Period, R 7-8th Periods
Location: CSE E107
Texts:
- Required: Pattern Recognition and Machine Learning, Christopher M. Bishop, Publisher: Springer, 2007.
- Recommended: Pattern Classification, Richard O. Duda, Peter E. Hart and David G. Stork, Publisher: Wiley Interscience, second edition, 2000.
- Additional: Statistical Learning Theory, Vladimir N. Vapnik, Publisher: John Wiley and Sons, New York, 1998.
- Other Material: Notes and papers from the research literature.
Instructor: Prof. Anand Rangarajan, CSE E352. Phone: 352 392 1507, Fax: 352 392 1220, email: anand@cise.ufl.edu
Office hours: Anand, T 8-9th Periods and F 7th Period, or by appointment.
Grading:
- Homeworks: 20%.
- Two Midterms: 30% each.
- Project: 20%.
Homeworks, Projects and other Announcements
Notes:
- Prerequisites: Familiarity with basic concepts in calculus, linear algebra, and probability theory. A partial list of basic requirements follows. Calculus: differentiation, chain rule, integration. Linear algebra: matrix multiplication, inverse, pseudo-inverse. Probability theory: conditional probability, Bayes rule, conditional expectations. While AI is listed as a prerequisite, if any aspect of AI turns out to be required, it will be taught in class in order to make the course self-contained.
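As a quick self-check on two of the listed prerequisites (the pseudo-inverse and Bayes rule), here is a small sketch. The course suggests MATLAB for assignments; this Python/NumPy version is illustrative only, and all the numbers in it are made up:

```python
import numpy as np

# Linear algebra: the pseudo-inverse gives the least-squares solution of
# Ax = b even when A is not square and has no ordinary inverse.
A = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])  # 3x2 matrix
b = np.array([1.0, 2.0, 3.0])
x = np.linalg.pinv(A) @ b  # minimum-norm least-squares solution

# Probability: Bayes rule, P(H|D) = P(D|H) P(H) / P(D), where P(D) is
# obtained by summing over the two hypotheses H and not-H.
p_h = 0.3                   # prior P(H)      (made-up value)
p_d_given_h = 0.9           # likelihood P(D|H)
p_d_given_not_h = 0.2       # likelihood P(D|not H)
p_d = p_d_given_h * p_h + p_d_given_not_h * (1 - p_h)
p_h_given_d = p_d_given_h * p_h / p_d  # posterior P(H|D)
```

If computations like these look unfamiliar, reviewing the corresponding background material before the first homework is a good idea.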
- Homeworks/programs will be assigned bi-weekly. If you do not have any prior numerical computing experience, I suggest you use MATLAB for the programs.
- The first midterm will be held on Wednesday, March 5th, 2008 from 6 PM to midnight, and the second will be held on Wednesday, April 23rd, 2008 from 6 PM to midnight.
- The project is due at the end of the semester. Depending on the number of students, the project will be done either in teams of two or individually.
- A set of informal notes, which will evolve with the course, can be found here.
Syllabus
- Probability, decision and information theory review.
- Linear regression and classification.
- Multi-layer perceptrons and neural networks.
- Kernel methods.
- Graphical models.
- Mixture models and Expectation-Maximization (EM).
- Sampling methods and Markov Chain Monte Carlo (MCMC).
- Special topics such as hidden Markov models, product of experts, etc.