Welcome to CMSC498V, Advanced Topics in Machine Learning (Fall '18)!
## Description

Machine learning studies automatic methods for learning to make accurate predictions or useful decisions based on past observations. This course introduces theoretical machine learning, including mathematical models of machine learning, and the design and rigorous analysis of learning algorithms. Likely topics include: bounds on the number of random examples needed to learn; how to boost the accuracy of a weak learning algorithm; method of moments for latent variable models via spectral methods; generalization of deep neural nets.

Here is a tentative list of topics. (Bullets do not correspond precisely to lectures.)

- General introduction; consistency model

- Basic probability

- PAC model; Occam's razor; Chernoff bounds

- Geometric concepts; VC-dimension; upper and lower bounds on sample complexity; Rademacher complexity

- Boosting and margins theory

- Latent variable models

- Method of moments and matrix/tensor decomposition

- Generalization of deep neural networks

- (Tentative) Reinforcement learning
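As a small taste of the concentration tools listed above (Chernoff/Hoeffding bounds underpin the PAC sample-complexity results), here is a quick numerical sanity check. This is an illustrative sketch using NumPy; the deviation `eps` and sample sizes are arbitrary choices, not values from the course:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hoeffding's inequality for the mean of n i.i.d. Bernoulli(p) samples:
#   P(|mean - p| >= eps) <= 2 * exp(-2 * n * eps^2)
p, eps = 0.5, 0.1
for n in [50, 200, 800]:
    # Empirical frequency of large deviations, estimated over many repeated trials.
    means = rng.binomial(n, p, size=20000) / n
    empirical = np.mean(np.abs(means - p) >= eps)
    bound = 2 * np.exp(-2 * n * eps**2)
    print(f"n={n:4d}  empirical={empirical:.4f}  Hoeffding bound={bound:.4f}")
```

The printed table shows the bound holding at every `n`, and both the empirical deviation probability and the bound shrinking exponentially as `n` grows.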

## Prerequisite

Minimum grade of C- in CMSC422 or CMSC498M; permission of the CMNS-Computer Science department and the instructor.

- Basic machine learning concepts

  * Supervised/unsupervised/reinforcement learning

  * Classification, regression, cross-validation, overfitting, generalization

  * Deep neural networks

- Basic calculus and linear algebra

  * Compute (by hand) gradients of multivariate functions

  * Conceptualize dot products and matrix multiplications as projections

  * Solve multivariate equations using, e.g., matrix inversion

  * Understand basic matrix factorization

- Basic optimization

  * Use the technique of Lagrange multipliers for constrained optimization problems

  * Understand and be able to use convexity

- Basic probability and statistics

  * Understand random variables, expectation, and variance

  * Use the chain rule, marginalization, and Bayes' rule

  * Make use of conditional independence, and understand "explaining away"

  * Compute maximum-likelihood solutions for Bernoulli and Gaussian distributions
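To illustrate a few of the skills above, here is a minimal sketch (assuming NumPy; the data are synthetic and the numbers are arbitrary) that solves a small linear system and computes the closed-form maximum-likelihood estimates for Bernoulli and Gaussian samples:

```python
import numpy as np

rng = np.random.default_rng(1)

# Solve the multivariate linear system A x = b
# (np.linalg.solve is preferred over forming the explicit inverse).
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([9.0, 8.0])
x = np.linalg.solve(A, b)  # x = [2, 3]

# MLE for Bernoulli(p): p_hat is the sample mean of the 0/1 outcomes.
flips = rng.binomial(1, 0.3, size=10000)
p_hat = flips.mean()

# MLE for Gaussian(mu, sigma^2): sample mean and (biased) sample variance.
data = rng.normal(loc=2.0, scale=1.5, size=10000)
mu_hat = data.mean()
sigma2_hat = ((data - mu_hat) ** 2).mean()

print(x, p_hat, mu_hat, sigma2_hat)
```

With 10,000 samples each, `p_hat`, `mu_hat`, and `sigma2_hat` land close to the true parameters 0.3, 2.0, and 2.25.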

## Grading

- Lecture scribing (30% + 10%)

- Homeworks (30%)

- Participation (10%)

- Course project (30%)

### When & where

Tuesday/Thursday, 3:30--4:45 PM

CSI 1121

### Instructors

Furong Huang

Office hours: Monday 5:00--6:00 PM, AVW 3251

### Teaching Assistants

Jingling Li

Office hours: Friday 11:00 AM--12:00 PM, AVW 3212