Class Information

Instructional Team


Lectures: TuThu 12:30pm-1:45pm

Location: IRB 2207

Prerequisites: CMSC330 and CMSC351; CMSC425 or CMSC427 (optional)

Class Description

AR, VR, and MR, collectively referred to as XR, are becoming ubiquitous for human-computer interaction, with limitless applications and potential uses. This course examines advances in real-time multi-modal XR systems in which the user is 'immersed' in and interacts with a simulated 3D environment. Topics include display, modeling, 3D graphics, haptics, audio, locomotion, animation, applications, immersion, and presence, and how these components interact to create convincing virtual environments. We'll explore these fields along with current and future research directions.

By the end of the class, students will understand a wide range of research problems in XR, as well as the future of XR as a link between physical and virtual worlds as described in the "Metaverse" visions.

The class assignments will involve interacting with the above topics in the context of a real XR device, with the goal of building the skills to develop powerful multi-modal XR applications. Assignments may be completed in Unity or Unreal Engine 4, but the TAs can only provide technical support for Unity.

Prior knowledge or experience with game development or XR is not necessary to succeed in the course! Unity scripting uses object-oriented C#, and the technical learning in this class focuses on APIs and application design rather than raw programming skill, so intermediate experience with Java, C#, C++, or a similar language is sufficient.
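As a flavor of what this looks like in practice, a typical Unity script is simply a C# class extending the engine's MonoBehaviour base class; the class name and rotation-speed field below are illustrative examples, not part of any assignment:

```csharp
using UnityEngine;

// Illustrative sketch: a minimal Unity behaviour that slowly spins the
// object it is attached to. Anyone comfortable with classes in Java,
// C#, or C++ will recognize the structure; the learning curve is the
// engine API (Transform, Time, etc.), not the language itself.
public class Spinner : MonoBehaviour
{
    // Shown in the Unity Inspector and tunable without recompiling.
    public float degreesPerSecond = 45f;

    // Update() is called by the engine once per rendered frame.
    void Update()
    {
        // Rotate around the y-axis, scaled by frame time so the speed
        // is consistent regardless of frame rate.
        transform.Rotate(0f, degreesPerSecond * Time.deltaTime, 0f);
    }
}
```

Attaching this script to any GameObject in a Unity scene is all that is needed to run it; the engine instantiates the class and calls its lifecycle methods automatically.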

Students will have access to Oculus Quest VR headsets (see image below) for this course, which they will be expected to use to complete most of the assignments and final project. Details on the HMD rental system will be announced later.

  • Oculus Quest

The final class project will have you apply the skills developed in the assignments by exploring, implementing, and/or analyzing a recent XR research topic.

The undergraduate & graduate sections contain the same content and assignments, although the graduate final project is expected to be more substantial and to push further beyond state-of-the-art research than the undergraduate version. There will also be some differences in assignment requirements and exams.

Qualifying Areas

Undergraduate: Areas 2 (Info Processing) & 3 (Software Engineering/Programming Languages)
Graduate: Software Engineering/Programming Languages/HCI

Grade Breakdown

  • Midterm & Final Exams (35%)
  • 4+1 Homework/Programming Assignments (30%)
  • Final Course Project (35%)

Planned Lectures & Topics

Planned Assignments

The assignments will likely build on one another so that, taken together, they produce a complete multimodal XR application. Each assignment will include specific rubric requirements, but there will also be creative elements, so most students will end up with interesting & unique XR apps rather than identical environments. The resulting system may serve as a baseline application for the final project.

  1. Introduction to XR and game engines
  2. Interaction with virtual environments
  3. 3D audio for virtual environments
  4. Natural virtual locomotion
  5. Virtual avatars and agents
  6. Introduction to augmented reality

Midterm & Final Exams

The exams will contain high-level critical thinking questions about problems throughout the subfields of XR. They will generally be short answer questions asking about how you may approach a particular problem.

Final Project

The final project will consist of exploring, implementing, and/or evaluating a research prototype in XR. Projects can explore applications of XR (e.g. focusing on human factors and user interaction) or investigate more fundamental XR problems, like body/eye tracking, computer vision, multimodal display and rendering, sensors, etc. Students may work in small groups or alone; larger teams are expected to make correspondingly more substantial contributions.
Please see more details here.