CAP 6615: Neural Networks for Computing
Schedule: MWF, 8th Period
Location: CSE E122
Texts:
- Recommended: Neural Networks for Pattern Recognition, Chris Bishop, Publisher: Oxford University Press, 1995.
- Recommended: Neural Networks: A Comprehensive Foundation, Simon Haykin, Publisher: Prentice Hall, second edition, 1998.
- Recommended: Pattern Classification, Duda, Hart and Stork, Publisher: Wiley Interscience, second edition, 2000.
- Additional: Statistical Learning Theory, Vladimir N. Vapnik, Publisher: John Wiley and Sons, New York, 1998.
- Other Material: Notes and papers from the following: Neural Computation, IEEE Trans. Neural Networks, Neural Networks.
Instructor: Prof. Anand Rangarajan, CSE E352. Phone: 352 392 1507, Fax: 352 392 1220, email: anand@cise.ufl.edu
Teaching Assistant: Mingxi Wu, Office location: TBA, email: mwu@cise.ufl.edu
Office hours: Anand, MW 4-5PM and F 2-3PM, or by appointment. Mingxi, TR 4-5PM, F 1-2PM.
Grading:
- Homeworks: 20%.
- Two Midterms: 20% each.
- Two Projects: 20% each.
Notes:
- Prerequisites: A familiarity with basic concepts in calculus, linear algebra, and probability theory. A partial list of basic requirements follows. Calculus: differentiation, the chain rule, integration. Linear algebra: matrix multiplication, inverse, pseudo-inverse. Probability theory: conditional probability, Bayes rule, conditional expectations. (The identities sketched after these notes indicate the expected level.) While AI is listed as a prerequisite, if any aspect of AI turns out to be required, it will be taught in class in order to make the course self-contained.
- Homeworks/programs will be assigned bi-weekly. If you do not have any prior numerical computing experience, I suggest you use MATLAB for the programs. (A small example of the kind of program involved is sketched after these notes.)
- The first midterm will be held on October 17th, 2005 from 6PM-12 midnight in CSE E404, and the second will be held on December 7th, 2005 from 6PM-12 midnight.
- The two projects will be the same for all students. The first project is due November 2nd, 2005 and the second is due December 2nd, 2005. Depending on the number of students, the projects will be done either in teams of two or individually.
- A set of informal notes, which will evolve with the course, can be found here.
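For a rough sense of the prerequisite level referred to in the notes above, these are the kinds of standard identities assumed (written here in LaTeX):

  \[ \frac{d}{dx} f\big(g(x)\big) = f'\big(g(x)\big)\, g'(x) \qquad \text{(chain rule)} \]
  \[ A^{+} = (A^{\top} A)^{-1} A^{\top} \qquad \text{(pseudo-inverse of a full-column-rank } A\text{)} \]
  \[ P(A \mid B) = \frac{P(B \mid A)\, P(A)}{P(B)} \qquad \text{(Bayes rule)} \]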
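And for a rough sense of the kind of MATLAB program the homeworks call for, here is a minimal sketch (made-up synthetic data, not an actual assignment) that fits a linear discriminant by least squares via the pseudo-inverse:

  % Least-squares linear discriminant via the pseudo-inverse (illustrative only).
  X = randn(100, 2);                            % 100 two-dimensional patterns
  t = sign(X * [1; -1] + 0.1 * randn(100, 1));  % noisy +/-1 targets
  Xa = [X, ones(100, 1)];                       % augment inputs with a bias column
  w = pinv(Xa) * t;                             % least-squares weight vector
  yhat = sign(Xa * w);                          % predicted labels
  accuracy = mean(yhat == t)                    % fraction correctly classified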
Syllabus:
- Supervised Learning: linear discriminants, the perceptron, backpropagation, multi-layer perceptrons, radial basis functions, learning and generalization theory, support vector machines (a perceptron sketch appears after this list).
- Density Estimation: finite Gaussian mixtures, the expectation-maximization (EM) algorithm (an EM sketch appears after this list).
- Unsupervised Learning: competitive networks, clustering, Kohonen self-organizing feature maps, principal and independent component analysis (PCA and ICA), locally linear embedding (LLE), ISOMAP (a PCA sketch appears after this list).
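The three sketches below informally illustrate topics from the list above; they are minimal MATLAB examples with made-up function and variable names, not course code. First, the perceptron learning rule for binary labels in {-1, +1}:

  % Perceptron learning rule (illustrative sketch).
  % X: N-by-d input patterns; t: N-by-1 labels in {-1,+1}.
  function w = perceptron(X, t, epochs)
    [N, d] = size(X);
    Xa = [X, ones(N, 1)];                 % augment with a bias term
    w = zeros(d + 1, 1);                  % initial weights
    for epoch = 1:epochs
      for n = 1:N
        if sign(Xa(n, :) * w) ~= t(n)     % pattern misclassified
          w = w + t(n) * Xa(n, :)';       % move weights toward the pattern
        end
      end
    end
  end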
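Next, a minimal EM sketch for a two-component one-dimensional Gaussian mixture; the updates are the standard E- and M-steps, and all names are illustrative:

  % EM for a two-component 1-D Gaussian mixture (illustrative sketch).
  % x: column vector of data; iters: number of EM iterations.
  function [mu, s2, p1] = gmm_em(x, iters)
    gauss = @(x, m, v) exp(-(x - m).^2 ./ (2 * v)) ./ sqrt(2 * pi * v);
    mu = [min(x); max(x)]; s2 = [var(x); var(x)]; p1 = 0.5;
    for it = 1:iters
      % E-step: responsibility of component 1 for each data point
      a = p1 * gauss(x, mu(1), s2(1));
      b = (1 - p1) * gauss(x, mu(2), s2(2));
      r = a ./ (a + b);
      % M-step: re-estimate means, variances, and mixing proportion
      mu(1) = sum(r .* x) / sum(r);
      mu(2) = sum((1 - r) .* x) / sum(1 - r);
      s2(1) = sum(r .* (x - mu(1)).^2) / sum(r);
      s2(2) = sum((1 - r) .* (x - mu(2)).^2) / sum(1 - r);
      p1 = mean(r);
    end
  end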
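Finally, a minimal PCA sketch that projects data onto the top k eigenvectors of the sample covariance matrix; again, names are illustrative:

  % PCA via eigendecomposition of the sample covariance (illustrative sketch).
  % X: N-by-d data matrix; k: number of principal components to keep.
  function [Y, V] = pca_project(X, k)
    N = size(X, 1);
    Xc = X - repmat(mean(X, 1), N, 1);       % center the data
    C = (Xc' * Xc) / (N - 1);                % sample covariance matrix
    [V, D] = eig(C);
    [evals, idx] = sort(diag(D), 'descend'); % order directions by eigenvalue
    V = V(:, idx(1:k));                      % top-k principal directions
    Y = Xc * V;                              % projected coordinates
  end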