CIS 6930: Special Topics in Machine Learning
Schedule: MWF, 7th Period
Location: CSE E107
Texts:
- Required:
Information Theory, Inference and Learning Algorithms, David J. C. MacKay, Cambridge University Press; 1st edition, 2002.
- Recommended:
Statistical Learning Theory, Vladimir N. Vapnik, John Wiley and Sons, New York, 1998.
- Other Material:
Notes and papers from the journal Neural Computation.
Instructor:
Prof. Anand Rangarajan, CSE E352. Phone: 352 392 1507, Fax: 352 392 1220, email: anand@cise.ufl.edu
Office hours: MWF 3-4PM or by appointment.
Grading:
- Homeworks: 25%.
- Two Midterms: 25% each.
- Project: 25%.
Notes:
- Prerequisites: Machine Learning (CAP 6610) or Neural Networks for Computing (CAP 6615). Familiarity with basic concepts in calculus, linear algebra, and probability theory is assumed. A partial list of basic requirements follows. Calculus: differentiation, the chain rule, integration. Linear algebra: matrix multiplication, inverses, pseudo-inverses. Probability theory: conditional probability, Bayes rule, conditional expectations. (Bayes rule is restated after this list as a quick self-check.)
- Homeworks/programs will be assigned biweekly. If you do not have any prior numerical computing experience, I suggest you use MATLAB for the programs; a minimal example appears after this list.
- A set of informal notes, which will evolve with the course, can be found here.
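
As a quick self-check on the probability prerequisite, Bayes rule is the standard identity

P(A \mid B) = \frac{P(B \mid A)\, P(A)}{P(B)}

and conditional expectation is built analogously from conditional probabilities.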
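For those new to numerical computing, here is a minimal MATLAB sketch of the flavor of computation involved; the distribution p is illustrative only, not an assigned problem:

% Shannon entropy (in bits) of a discrete distribution -- illustrative only
p = [0.5 0.25 0.125 0.125];   % probabilities; must sum to 1
H = -sum(p .* log2(p));       % evaluates to 1.75 bits for this p
fprintf('Entropy: %.2f bits\n', H);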
Syllabus
- Probability, Entropy and Inference: entropy, basics of information theory, statistical mechanics and information geometry, information divergence measures, Fisher information, Riemannian metrics, manifold learning and nonlinear dimensionality reduction (standard definitions of entropy and information divergence appear after this list)
- Expected and Empirical risk: law of large numbers and statistical inequalities, learning and generalization, kernel methods
- Bayesian networks: belief propagation, Bethe and Kikuchi approximate inference, Markov Chain Monte Carlo (MCMC)
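
For reference, two quantities central to the first unit, in their standard forms (these are the usual textbook definitions, as in MacKay, not course-specific notation):

H(X) = -\sum_x p(x) \log_2 p(x)   (entropy)

D_{\mathrm{KL}}(p \,\|\, q) = \sum_x p(x) \log_2 \frac{p(x)}{q(x)}   (information divergence)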