This course is an elementary introduction to a machine learning technique called deep learning, as well as its applications to a variety of domains. Along the way, the course also provides an intuitive introduction to machine learning concepts such as simple models, learning paradigms, optimization, overfitting, the importance of data, and training caveats. The assignments explore key concepts and simple applications, and the final project allows an in-depth exploration of a particular application area. By the end of the course, you will have an overview of the deep learning landscape and its applications. You will also have a working knowledge of several types of neural networks, be able to implement and train them, and have a basic understanding of their inner workings.
ESJ 2204
Tuesday, Thursday 2:00pm - 3:15pm
Abhinav Shrivastava
abhinav@cs.umd.edu
Office hours: Tuesday, 4:00pm - 5:00pm (or by email)
4238 IRB
Pulkit Kumar
pulkit@cs.umd.edu
Office hours: Thursday, 4:00pm - 5:00pm (or by email)
4232 IRB
Mara Levy (Tentative)
mlevy@umd.edu
Date  Topic  Slides  Notes & Assignments  

January 28, January 30
Course Introduction
Motivation; Goals; Syllabus; Assignment Policies
slides 

Machine Learning Basics I


January 30 
Introduction to Statistical Learning
Simple models; Paradigms of learning
slides  
Neural Networks Basics I


February 4
Introduction to Neural Networks
Terminology; Simple Neural Networks; Nonlinearities
slides  Assignment 1 out  
February 6 
Problem setup: labels and losses
Types of problems and labels; Loss functions
slides  
Computation Resources Overview


February 11
Computational Infrastructure (TA Lecture)
Quick walkthrough: Colaboratory, Google Cloud Platform
Introduction to PyTorch

iPython  Assignment 1 due  
February 13 
Neural Network Runthrough (TA Lecture)
Handling data (images, text); Structuring a neural network and machine learning codebase; Introduction to class challenges 

Machine Learning Basics II


February 18, February 20
Optimization basics
Loss function derivatives, minima; Gradient descent; Stochastic gradient descent 
slides  
Neural Network Basics II


February 25, February 27
Training Neural Networks
Initialization; Backpropagation; Optimization & hyperparameters 
slides (1) slides (2) 

February 27
Training Caveats (Neural Networks and ML models)
Overfitting; Bias/Variance tradeoff; Optimization & hyperparameters 
(see above)  
March 3 
Improving Performance of Neural Networks
Optimization tips and tricks; Best practices 
(see above)  
March 5  No Class
ECCV deadline (Wish us luck!)


March 10  Midterm exam
In class


Convolutional Neural Networks (ConvNets)


March 12, March 31
Introduction to ConvNets
Convolutions; Pooling 
slides  
March 17, March 19, March 24, March 26
Extended Spring Break due to COVID-19  
March 31, April 2, April 7
ConvNet Architectures
Popular architectures (primarily images, brief overview of videos); Intuitions and key insights; Design principles 
(see above)  
Inverting ConvNets; Saliency maps; Visualizing neurons 

Applications of ConvNets


April 7, April 9
Application I: Object Detection  slides  
April 9, April 14
Application II: Dense Prediction  slides  
The schedule below is tentative and will evolve as we move to the virtual setup.


Recurrent Neural Networks (RNNs)


April 14, April 16
Introduction to Recurrent Networks
RNNs, GRUs, LSTMs; Text/Language Applications; Language Modelling 
slides  
April 28, April 30
Introduction to Self-attention and Transformers
Self-attention; Transformers 
slides  Bonus Assignment out  
Advanced Topics


April 30, May 5
Vision + Language (models, tasks, training)  (see above)  
May 5, May 7, May 12
Image Generative Models
Autoregressive models; GANs, pix2pix, CycleGAN, etc.; VAEs; Teasers: Text Generation, Self-supervised Learning 
slides  
Self-study  (A brief) Introduction to (Deep) Reinforcement Learning  slides  
Self-study, May 12
Ethics and Bias; Epilogue  slides  
Monday, May 18
Final Exams 
Minimum grade of C in CMSC330 and CMSC351; and 1 course with a minimum grade of C from (MATH240, MATH461); and permission of the CMNS-Computer Science department.
We will work extensively with probability, statistics, mathematical functions such as logarithms and differentiation, and linear algebra concepts such as vectors and matrices. You should be comfortable manipulating these concepts.
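As a quick self-check (a minimal sketch, not part of any assignment), you should be comfortable reading and verifying code like the following, which checks the analytical derivative of the logarithm against a numerical finite difference and performs a basic matrix-vector product with NumPy:

```python
import numpy as np

# The derivative of log(x) is 1/x; verify it with a central finite difference.
x = 2.0
h = 1e-6
numerical = (np.log(x + h) - np.log(x - h)) / (2 * h)
analytical = 1.0 / x
assert abs(numerical - analytical) < 1e-6

# Basic linear algebra: a matrix-vector product.
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
v = np.array([1.0, -1.0])
print(A @ v)  # [-1. -1.]
```

If both the derivative check and the matrix-vector product make sense to you, you have the level of mathematical comfort the course assumes.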
We will use the Python programming language. It is assumed that you know, or will quickly learn, how to program in Python. The programming assignments will be oriented toward Unix-like operating systems. While it is possible to complete the course using other operating systems, you will be solely responsible for troubleshooting any issues you encounter.
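Before the first assignment, it is worth confirming that your environment works. The sketch below is a hypothetical self-check, not an official setup script; the Python 3.6 cutoff is an assumption, and PyTorch (introduced in the TA lectures) may not be installed yet outside of Colaboratory:

```python
import sys

# The assignments assume a reasonably recent Python 3 interpreter.
assert sys.version_info >= (3, 6), "Please use Python 3.6 or newer"

# PyTorch is introduced later in the course; on Colaboratory it comes
# preinstalled, so this import failing locally is not a problem yet.
try:
    import torch
    print("PyTorch", torch.__version__, "is available")
except ImportError:
    print("PyTorch is not installed yet")
```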
If you are unsure that you have the required prerequisites, consult with the instructor.
Here's how you will be graded:
Collaboration: Students are expected to finish the homework on their own, but discussion of the assignments is allowed (and encouraged). Anyone you discussed an assignment with must be clearly acknowledged: before the solution to each question, list all the people you discussed that particular question with. In addition, each student should submit their own code and mention anyone they collaborated with.
Details will be announced in class.
Syllabus subject to change.
The course should be self-contained, but if you need additional reading material, you can consult the following:
If you need reference/additional readings for statistical learning, you can consult the following:
Note that academic dishonesty includes not only cheating, fabrication, and plagiarism, but also includes helping other students commit acts of academic dishonesty by allowing them to obtain copies of your work. In short, all submitted work must be your own. Cases of academic dishonesty will be pursued to the fullest extent possible as stipulated by the Office of Student Conduct. It is very important for you to be aware of the consequences of cheating, fabrication, facilitation, and plagiarism. For more information on the Code of Academic Integrity or the Student Honor Council, please visit http://www.shc.umd.edu.
Any student who needs to be excused for an absence from a single lecture, recitation, or lab due to a medically necessitated absence shall:
Any student who needs to be excused for a Major Scheduled Grading Event must provide written documentation of the illness from the Health Center or from an outside health care provider. This documentation must verify the dates of treatment and indicate the time frame during which the student was unable to meet academic responsibilities. No diagnostic information shall be given. The Major Scheduled Grading Events for this course include the midterm and the final exam. For class presentations, the instructor will help the student swap their presentation slot with other students.
It is also the student's responsibility to inform the instructor, in advance, of any intended absences from exams and class presentations for religious observances. Notice should be provided as soon as possible, but no later than the Monday prior to the midterm exam, the class presentation date, or the final exam.
Any student eligible for and requesting reasonable academic accommodations due to a disability is requested to provide a letter of accommodation from the Office of Disability Support Services within the first three weeks of the semester.
You can find the university’s course policies here.