CAP 6615: Neural Networks for Computing
Schedule: MWF, 8th Period
Location: TUR2334
Texts:
- Required: Neural Networks for Pattern Recognition, Chris
Bishop, Publisher: Oxford University Press.
- Recommended: Neural Networks: A Comprehensive Foundation,
Simon Haykin, Publisher: Macmillan.
- Recommended: Pattern Classification, Duda, Hart and
Stork, Publisher: Wiley Interscience.
- Additional: Statistical Learning Theory, Vladimir N.
Vapnik, John Wiley and Sons, New York, 1998.
- Other Material: Notes and papers from the following:
Neural Computation, IEEE Trans. Neural Networks, Neural Networks
Instructor: Prof. Anand Rangarajan, CSE E352.
Phone: (352) 392-1507, Fax: (352) 392-1220, email: anand@cise.ufl.edu
Office hours: MWF 4-5pm or by appointment.
Grading:
- Homeworks: 20%.
- Two Midterms: 25% each.
- Project: 30%.
Notes:
- Prerequisites: Familiarity with basic concepts in calculus,
linear algebra, and probability theory. A partial list of basic requirements
follows. Calculus: differentiation, the chain rule, integration. Linear algebra:
matrix multiplication, inverses, pseudo-inverses. Probability theory: conditional
probability, Bayes' rule, conditional expectations.
- Homeworks and programs will be assigned biweekly. If you do not
have prior numerical computing experience, I suggest using MATLAB
for the programs.
- The first midterm will be given around the middle of the
semester; the second will be given in the last week of classes.
- The project will be the same for all students. A project demonstration
is due at the end of the semester and will be graded competitively. Depending
on the number of students, the project will be done either in teams of two
or individually.
- A set of informal notes, which will evolve with the course, can
be found here.
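As a quick self-check of the probability prerequisite above, here is a small worked example of Bayes' rule (the numbers are invented purely for illustration):

```python
# Bayes' rule: P(A|B) = P(B|A) P(A) / P(B), with P(B) expanded by
# the law of total probability. All numbers below are illustrative.
p_a = 0.01            # prior P(A)
p_b_given_a = 0.9     # likelihood P(B|A)
p_b_given_not_a = 0.05

p_b = p_b_given_a * p_a + p_b_given_not_a * (1 - p_a)
p_a_given_b = p_b_given_a * p_a / p_b   # posterior P(A|B)
print(round(p_a_given_b, 4))            # → 0.1538
```

Note how a seemingly accurate likelihood still yields a modest posterior when the prior is small; homework problems will exercise this kind of reasoning.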
Syllabus
- Supervised Learning: linear discriminants, the perceptron,
backpropagation, multi-layer perceptrons, radial basis functions, learning
and generalization theory, support vector machines.
- Density Estimation: finite mixtures, the expectation-maximization
(EM) algorithm, Bayesian networks.
- Unsupervised Learning: competitive networks, clustering,
Kohonen self-organizing feature maps, Hebbian learning, principal and independent
component analysis (PCA and ICA), kernel methods, local linear embeddings.
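To give the flavor of the supervised-learning unit, here is a minimal sketch of the perceptron learning rule on a linearly separable toy problem (logical AND); the data, learning rate, and epoch count are illustrative choices, not part of any assignment:

```python
# Perceptron learning rule on the logical AND of two binary inputs.
# Weights are updated only on misclassified examples; the perceptron
# convergence theorem guarantees termination on separable data.
data = [((0, 0), -1), ((0, 1), -1), ((1, 0), -1), ((1, 1), 1)]
w = [0.0, 0.0]  # weights
b = 0.0         # bias
lr = 1.0        # learning rate

for epoch in range(10):
    for (x1, x2), y in data:
        pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else -1
        if pred != y:  # misclassified: nudge the boundary toward y
            w[0] += lr * y * x1
            w[1] += lr * y * x2
            b += lr * y
# after convergence, every training example is classified correctly
```

The same update, applied through a differentiable loss, is the conceptual ancestor of backpropagation in multi-layer perceptrons.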
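For the density-estimation unit, the following is a toy sketch of the EM algorithm for a one-dimensional mixture of two Gaussians with a fixed, shared variance; the data and initial values are made up for illustration:

```python
# EM for a 1-D two-component Gaussian mixture (fixed common variance).
# E-step: compute posterior responsibilities; M-step: re-estimate
# the means and mixing proportions from those responsibilities.
import math

data = [-2.1, -1.9, -2.0, 1.8, 2.2, 2.0]
mu = [-1.0, 1.0]   # initial component means
mix = [0.5, 0.5]   # mixing proportions
sigma2 = 1.0       # fixed common variance

def gauss(x, m):
    return math.exp(-(x - m) ** 2 / (2 * sigma2)) / math.sqrt(2 * math.pi * sigma2)

for _ in range(20):
    # E-step: responsibility of each component for each point
    resp = []
    for x in data:
        p = [mix[k] * gauss(x, mu[k]) for k in range(2)]
        s = sum(p)
        resp.append([pk / s for pk in p])
    # M-step: responsibility-weighted re-estimation
    for k in range(2):
        nk = sum(r[k] for r in resp)
        mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
        mix[k] = nk / len(data)
```

With these made-up data, the means converge to roughly -2 and 2, the centers of the two clumps; each EM iteration is guaranteed not to decrease the likelihood.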
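Finally, from the unsupervised-learning unit, a minimal PCA sketch using NumPy: center the data, form the sample covariance, and take the top eigenvector as the first principal component. The synthetic 2-D data (points near the line y = 2x) are an assumption for illustration:

```python
# PCA via eigendecomposition of the sample covariance matrix.
import numpy as np

rng = np.random.default_rng(0)
t = rng.normal(size=200)
# synthetic data concentrated along the direction (1, 2)
X = np.column_stack([t, 2 * t + 0.1 * rng.normal(size=200)])

Xc = X - X.mean(axis=0)               # center the data
C = Xc.T @ Xc / (len(X) - 1)          # sample covariance matrix
eigvals, eigvecs = np.linalg.eigh(C)  # eigh returns ascending eigenvalues
pc1 = eigvecs[:, -1]                  # first principal component (unit norm)
```

Here `pc1` recovers (up to sign) the direction (1, 2)/sqrt(5) along which the data vary most; ICA, kernel methods, and local linear embeddings generalize this idea in different ways.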