CMSC 426
Image Processing (Computer Vision)
David Jacobs

Spring 2017

 


Announcements

General Information

 

 

Class Time: Tuesday, Thursday 9:30-10:45

Room: CSI 1121

Text

There is no assigned text, but you will receive course notes, and may want to consult three optional books:

  • A draft of Richard Szeliski's new computer vision book is available online. (Nice book, though aimed at a more advanced audience. I highly recommend you download it - it's free!)
  • Introductory Techniques for 3-D Computer Vision by Trucco and Verri. Very nice introductory text with clear descriptions, although it's a bit old and doesn't cover all the material in our class.
  • Computer Vision: A Modern Approach by Forsyth and Ponce. More of a graduate text, but widely used.

      Graduate students taking this course may, in particular, want to purchase Forsyth and Ponce.

Personnel

 

 

Instructor: David Jacobs
  Email: djacobs (at) cs
  Office: AVW 4421
  Office hours: Tuesday 2-3

Teaching Assistant: Xitong Yang
  Email: xtyang35 (at) gmail
  Office: AVW 4103
  Office hours: Wed. 2-3, M: 3-5

If you cannot make these office hours, please feel free to send email to arrange another time to meet. 

 

 

Course Policies

 

 

 

 

 

Course work, late policies, and grading

There will be (probably) eight problem sets assigned during the semester. Six of these will include programming assignments to be done in Matlab. Matlab is a free download for everyone at the university; students can download and install it through
http://www.it.umd.edu/techsavings/software.html

Some of the programming problem sets will also include pencil-and-paper exercises, and two problem sets will be purely pencil-and-paper. Students will have one or two weeks for each problem set, depending on its size. We will also have some in-class workshops, in which students work in groups on problems related to recently discussed material. Attendance is required. In addition, there will be a computer vision workshop at the University of Maryland in February. Students are required to attend at least one talk and write a one-paragraph summary of it. Students unable to attend the workshop may instead write a summary of a published computer vision research paper approved by Prof. Jacobs.

 

Homework is due at the start of class. Problems may be turned in late, with a penalty of 10% for each day they are late, but may not be turned in after the start of the next class. For example, if a problem set is due on Tuesday, it may be turned in before Wednesday at 9:30 with a 10% penalty, or before Thursday at 9:30 with a 20% penalty, but no later than Thursday at 9:30.
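The late-penalty arithmetic above, for the Tuesday example, can be sketched as follows (an illustrative Python snippet, not course code, which must be in Matlab; the function name and the two-day cutoff for a Tuesday due date are my own framing of the policy):

```python
def late_penalty(days_late):
    """10% penalty per day late; for a Tuesday due date, homework is no
    longer accepted after two days (the start of Thursday's class)."""
    if days_late <= 0:
        return 0.0      # on time: no penalty
    if days_late > 2:
        return None     # after the start of the next class: not accepted
    return 0.10 * days_late

# The Tuesday example from the policy:
print(late_penalty(1))  # before Wednesday 9:30 -> 0.1
print(late_penalty(2))  # before Thursday 9:30 -> 0.2
print(late_penalty(3))  # after Thursday 9:30 -> None
```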

There will be three exams. Early in the semester, there will be a short, in-class midterm that will take about half the class. Later, there will be a midterm that will be in class for a complete class period. The final will be during the normal final exam time. Each exam will be cumulative, covering all material learned to that point in the class, with a greater emphasis on material learned since the previous exam. Exams and problem sets will all be based on material discussed in class. Readings are available to help students understand this material, but students will not be expected to master any material not discussed in class.

These will be weighted for the final grade as follows: homework 35%; talk summary and in-class workshops 4%; first midterm 9%; second midterm 18%; final exam 34%.
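As a sanity check, the weights above sum to 100%. A weighted final score would be computed as below (illustrative Python; the component scores are made up for the example and are not from the course):

```python
# Course grade weights from the syllabus (percent).
weights = {
    "homework": 35,
    "talk summary and workshops": 4,
    "first midterm": 9,
    "second midterm": 18,
    "final exam": 34,
}
assert sum(weights.values()) == 100  # the weights cover the whole grade

# Hypothetical component scores on a 0-100 scale (made up for illustration).
scores = {
    "homework": 90,
    "talk summary and workshops": 100,
    "first midterm": 80,
    "second midterm": 85,
    "final exam": 88,
}

# Weighted average: each score contributes in proportion to its weight.
final = sum(weights[k] * scores[k] for k in weights) / 100
print(final)  # 87.92
```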

Homework assignments are to be written up neatly and clearly, and programming assignments must be clear and well-documented. Programs should be written in Matlab. A full paper copy of all of the homework must be turned in. In addition, we will ask you to email a copy of all code to the TA.

Some homeworks and projects may have a special challenge problem. Points from the challenge problems are extra credit. This means that I do not consider these points until after the final course grade cutoffs have been set. Students participating in class discussion or asking good questions will also receive extra credit.

 

 

Academic Honesty

All class work is to be done independently. You are allowed to discuss class material, homework problems, and general solution strategies with your classmates. When it comes to formulating/writing/programming solutions you must work alone. If you make use of other sources in coming up with your answers you must cite these sources clearly (papers or books in the literature, friends or classmates, information downloaded from the web, whatever).

It is best to try to solve problems on your own, since problem solving is an important component of the course. But I will not deduct points if you make use of outside help, provided that you cite your sources clearly. Representing other people's work as your own, however, is plagiarism and is in violation of university policies. Instances of academic dishonesty will be dealt with harshly, and usually result in a hearing in front of a student honor council, and a grade of XF. (Note, this and other course policies are taken from those of Prof. David Mount).



Absences

Any student who needs to be excused for an absence from a single lecture, recitation, or lab due to a medically necessitated absence shall:
a) Make a reasonable attempt to inform the instructor of his/her illness prior to the class.
b) Upon returning to the class, present their instructor with a self-signed note attesting to the date of their illness. Each note must contain an acknowledgment by the student that the information provided is true and correct. Providing false information to University officials is prohibited under Part 9(h) of the Code of Student Conduct (V-1.00(B) University of Maryland Code of Student Conduct) and may result in disciplinary action.

Self-documentation may not be used for the Major Scheduled Grading Events defined below, and may be used for only 1 class meeting during the semester. Any student who needs to be excused for a prolonged absence (2 or more consecutive class meetings), or for a Major Scheduled Grading Event, must provide written documentation of the illness from the Health Center or from an outside health care provider. This documentation must verify dates of treatment and indicate the timeframe during which the student was unable to meet academic responsibilities. In addition, it must contain the name and phone number of the medical service provider, to be used if verification is needed. No diagnostic information will ever be requested.

The Major Scheduled Grading Events for this course include:
a) Midterm - October 9, during the lecture period
b) Final exam, as given in the University schedule.

Academic Accommodations

Any student eligible for and requesting reasonable academic accommodations due to a disability is requested to provide, to the instructor in office hours, a letter of accommodation from the Office of Disability Support Services (DSS) within the first two weeks of the semester.

 

Below is a tentative schedule for problem sets and lectures.

Problem Sets

  • Problem Set 2: Edge Detection (assigned Jan 31, due Feb 7). Files: interpolate_gradients.m, test_smooth_image.m, test_image_gradient.m, test_gradient_magnitude_direction.m, swanbw.jpg, swanedges.jpg, swanedges_h.jpg
  • Practice Midterm (not to be turned in) (Feb 7 - Feb 21). Practice Midterm and Workshop Questions Solutions
  • Problem Set 3: Interactive segmentation (assigned Feb 23). Files: Swan Image, Segmented Swan, Dog Image, Segmented Dog, GUI
  • Problem Set 4: Corner detection (assigned March 2, due March 14). Files: Swan Image, display_best_corners.m, Swan Corners
  • Problem Set 5: 3D Geometry (assigned March 14, due March 28)
  • Practice Midterm (not to be turned in) (March 28 - April 4). Practice Midterm With Solutions
  • Problem Set 6: Structure-from-motion (assigned March 30). Main code
  • Problem Set 7: Deep learning (assigned April 11, due April 25). Files: CelebA, svm_gender.m, LBP.m, getmapping.m, cnn_gender_init.m, cnn_gender.m, cnn_gender_setup_data.m, cnn_test.m, cnn_gender_deploy.m
  • Problem Set 8: Backpropagation and gradient descent (assigned April 25, due May 9). Problems will be on the review for the final
  • Review for Final (May 4 - May 11). Practice Final, Practice Final with Answers

 

Class slides will typically be posted here shortly before class begins.

 

 

Class Slides

Class 1: Introduction (Jan 26)
  Slides: Introduction

Class 2: Background Review and Intro to Matlab (Jan 31)
  Slides: Background Notes; Sample Matlab Code
  Notes: Matlab download

Class 3: Filtering and Correlation (Feb 2)
  Notes: Correlation and Convolution Notes; Tutorial from Sussex on Convolution
  Readings: Szeliski, Section 3.2
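To preview what the correlation and convolution notes cover: cross-correlation slides a filter over a signal and takes dot products, while convolution does the same after flipping the filter, so the two agree only for symmetric filters. A minimal 1D sketch in Python (illustration only; course assignments use Matlab, and these function names are mine):

```python
def correlate1d(signal, kernel):
    """Valid cross-correlation: slide the kernel over the signal,
    taking a dot product at each position."""
    k = len(kernel)
    return [sum(signal[i + j] * kernel[j] for j in range(k))
            for i in range(len(signal) - k + 1)]

def convolve1d(signal, kernel):
    """Convolution is correlation with the flipped kernel."""
    return correlate1d(signal, kernel[::-1])

sig = [0, 0, 1, 2, 3, 0, 0]
deriv = [1, 0, -1]               # asymmetric kernel: the two differ
print(correlate1d(sig, deriv))   # [-1, -2, -2, 2, 3]
print(convolve1d(sig, deriv))    # [1, 2, 2, -2, -3]
```

For this antisymmetric derivative kernel, convolution is just the negated correlation; for a symmetric kernel such as a box filter they would be identical.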

Class 4: Correlation Continued, 1D Edge Detection (Feb 7)

Class 5: 2D Edge Detection (Feb 9)
  Notes: Gradient Notes; Tutorial from Sussex on Gaussian filtering and edge detection; Canny Edge Detector, from CVonline
  Readings: Szeliski, Section 4.2
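The gradient notes above come down to this: finite differences estimate the partial derivatives of the image in x and y, and the gradient magnitude is large where intensity changes sharply, i.e. at edges. A small pure-Python sketch on a synthetic step edge (illustration only, not the course's Matlab code):

```python
import math

def gradient_magnitude(img):
    """Central-difference gradient magnitude |grad I| at each interior
    pixel of a 2D image given as a list of equal-length rows."""
    h, w = len(img), len(img[0])
    out = []
    for y in range(1, h - 1):
        row = []
        for x in range(1, w - 1):
            gx = (img[y][x + 1] - img[y][x - 1]) / 2.0  # dI/dx
            gy = (img[y + 1][x] - img[y - 1][x]) / 2.0  # dI/dy
            row.append(math.hypot(gx, gy))
        out.append(row)
    return out

# A vertical step edge: dark (0) on the left, bright (10) on the right.
img = [[0, 0, 10, 10]] * 4
mags = gradient_magnitude(img)
print(mags)  # [[5.0, 5.0], [5.0, 5.0]]
```

Every interior pixel here straddles the step, so all responses are equal; on a real image the large responses would trace out the edge contours.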

Class 6: Fourier Analysis; Gradient and Correlation Workshop (Feb 14)
  Readings: Szeliski, Section 3.4

 

 

Class 7: Human Perceptual Grouping (Feb 16)
  Slides: Slides
  Readings: "Laws of organization in perceptual forms"

Class 8: Interactive Segmentation with Graph Cuts (Feb 21)
  Slides: Slides
  Readings: Interactive Graph Cuts for Optimal Boundary & Region Segmentation of Objects in N-D Images

 

Class 9: Texture and Color (Feb 23)
  Slides: Color Slides; Texture Slides
  Readings: The Foundations of Color Measurement and Color Perception, by Brian Wandell

 

Class 10: Short Midterm and Review of Results (Feb 28)

 

 

 

Class 11: Optical flow and corner detection (March 2)
  Notes: Optical Flow
  Readings: Szeliski, Section 4.1 and Section 8.4 (up to 8.4.1)

Class 12: Perspective, cameras (March 7)
  Notes: Perspective Projection and Cameras

Class 13: Previous topics cont'd (March 9)
  Notes: Geometry Notes
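The perspective and camera material in Classes 12-13 centers on the pinhole model: a 3D point (X, Y, Z) projects to image coordinates (f*X/Z, f*Y/Z), which is why farther objects appear smaller. A tiny illustrative sketch (Python, not course code; the function name and numbers are mine):

```python
def project(point, f=1.0):
    """Pinhole perspective projection of a 3D point (X, Y, Z), Z > 0,
    with focal length f: (X, Y, Z) -> (f*X/Z, f*Y/Z)."""
    X, Y, Z = point
    return (f * X / Z, f * Y / Z)

# Doubling the depth halves the image coordinates: farther looks smaller.
print(project((2.0, 4.0, 2.0)))  # (1.0, 2.0)
print(project((2.0, 4.0, 4.0)))  # (0.5, 1.0)
```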

 

Class 14: SNOW DAY (March 14)

 

  

 

Class 15: Stereo (March 16)
  Slides: Slides

Class 16: Stereo cont'd (March 28)

Class 17: Geometric transformations, Geometry Workshop (March 30)
  Readings: Dave Mount's Lecture Notes for Computer Graphics (Chapters 6, 7, and 8 might be helpful)

Class 18: Two-frame structure-from-motion (April 4)
  Slides: Slides
  Readings: Szeliski, Section 7.2
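A key relation underlying the stereo classes above: for a rectified camera pair with focal length f (pixels) and baseline B, depth is inversely proportional to disparity, Z = f*B/d. A hedged sketch (Python; the numbers are invented for illustration, not from the course):

```python
def depth_from_disparity(d, f, baseline):
    """Rectified two-camera stereo: depth Z = f * B / d,
    with disparity d in pixels, focal length f in pixels, baseline B in meters."""
    if d <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return f * baseline / d

# Illustrative numbers: f = 500 px, B = 0.1 m.
print(depth_from_disparity(10.0, 500.0, 0.1))  # distant point: 5.0 m
print(depth_from_disparity(50.0, 500.0, 0.1))  # larger disparity, closer: 1.0 m
```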

 

Class 19: Midterm (April 6)

 

 

 

Class 20: Midterm post-mortem (April 11)

 

 

 

Class 21: Intro to machine learning (April 13)
  Readings: A Course in Machine Learning, chapters 1 and 2

 

Class 22: Linear classification (April 18)
  Readings: A Course in Machine Learning, chapter 4

Class 23: CNNs, intro to matconvnet (April 20)
  Readings: Notes from Stanford class

Class 24: Applications of CNNs: classification, detection, semantic segmentation (April 25)

Class 25: Applications of CNNs: Faces (April 27)
  Slides: Slides

Class 26: Gradient Descent (May 2)
  Notes: Notes
  Readings: Notes from Stanford class

Class 27: Backpropagation (May 4)
  Notes: Notes
  Readings: Notes from Stanford class
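The gradient descent and backpropagation classes above both rest on the same update rule: repeatedly step against the gradient, x <- x - eta * grad f(x). A tiny illustration minimizing f(x) = (x - 3)^2 (my own example in Python, not course material):

```python
def grad_descent(grad, x0, lr=0.1, steps=100):
    """Plain gradient descent: step against the gradient `steps` times
    with learning rate `lr`, starting from x0."""
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)
    return x

# f(x) = (x - 3)^2 has gradient 2*(x - 3) and its minimum at x = 3.
x_min = grad_descent(lambda x: 2.0 * (x - 3.0), x0=0.0)
print(round(x_min, 6))  # 3.0
```

Each step multiplies the error (x - 3) by 0.8, so the iterate converges geometrically to the minimum; backpropagation is just an efficient way to compute the gradient for this same update in a multi-layer network.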

 

Class 28: Workshop (May 9)

Class 29: Conclusions (May 11)

FINAL: May 15, 8-10