Snow

Welcome!


Class Information

Instructional Team

Time/Date

TuThu 3:30pm-4:45pm

Room

IRB 2207

Prerequisites

CMSC 330 and CMSC 351; CMSC 427 (preferred)

Class Description

AR, VR, and MR, collectively referred to as XR, are becoming ubiquitous for human-computer interaction, with limitless applications and potential uses. This course examines advances in real-time, multi-modal XR systems in which the user is 'immersed' in and interacts with a simulated 3D environment. Topics include display, modeling, 3D graphics, haptics, audio, locomotion, animation, applications, immersion, and presence, and how they interact to create convincing virtual environments. We'll explore these fields and their current and future research directions.


By the end of the class, students will understand a wide range of research problems in XR, as well as be able to reason about the future of XR as a link between physical and virtual worlds as described in the "Metaverse" visions.

The class assignments will involve interacting with the above topics in the context of a real XR device, with the goal of building the skills to develop powerful multi-modal XR applications. Assignments may be completed in Unity or Unreal Engine 4, with our full technical support for either one.

Prior knowledge or experience with game development or XR is not necessary to succeed in the course! Unity and UE4 both use object-oriented programming, and the technical learning in this class focuses on APIs and application design rather than programming skill, so intermediate experience with Java, C#, C++, or a similar language is sufficient.

Most students will work with the headset we provide (shown below), which allows for development on one's own mobile phone. It includes an iOS/Android-compatible remote controller, and its open back exposes the smartphone camera for AR and vision-app development. Students may also use their own devices, such as an Oculus Quest (or any other HMD that provides hand tracking and controller input), which should not affect the assignment difficulty.

  • Mobile HMD we provide

  • Oculus Quest

The final class project will have you apply the skills developed in the assignments by exploring, implementing, and/or analyzing a recent XR research topic.

The undergraduate & graduate sections contain the same content and assignments, although the graduate final project is expected to be more substantial and to push further beyond state-of-the-art research than the undergraduate version. There will also be some differences in assignment requirements and exams.


Qualifying Areas

Undergraduate: Areas 2 (Info Processing) & 3 (Software Engineering/Programming Languages)
Graduate: Software Engineering/Programming Languages/HCI

Grade Breakdown

  • Midterm & Final Exams (35%)
  • 4-5 Homework/Programming Assignments (30%)
  • Final Course Project (35%)

Planned Lectures & Topics

This is the tentative lecture plan. Please see the "Calendar" tab for interesting visuals of the topics that will be covered.

Planned Assignments

The assignments will likely build on each other, so that together they result in a complete multimodal XR application. Each assignment will include specific rubric requirements, but there will also be creative elements, and most students will end up with interesting & unique XR apps rather than identical environments. The resulting system may be useful as a baseline application for the final project.

  1. Setting up the game engine for XR, making sure you can build to the headset, and basic game development.
  2. Interacting with virtual objects and scenes (picking objects up, teleporting with a navigation mesh, basic VR math).
  3. Basic 3D modelling and importing models into your scene.
  4. Hand tracking & hand interaction with your virtual environment.
  5. Natural virtual locomotion (implementing redirected walking & translational gain).
  6. Adding sound propagation to your environment.
  7. Basic inverse kinematics (allowing you to commandeer a virtual avatar).
  8. Virtual agents (adding autonomous virtual humans to the environment).
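To give a flavor of the locomotion assignment, translational gain scales the user's physical head displacement before applying it to the virtual camera, letting users cover more virtual ground than their physical play space allows. The sketch below is a minimal, engine-agnostic illustration in plain Python; the actual assignments use the Unity or UE4 APIs, and all names here are illustrative.

```python
# Minimal sketch of translational gain for virtual locomotion.
# The virtual camera advances by the user's physical head displacement
# multiplied by a gain; a gain of 1.0 is a one-to-one mapping, and gains
# modestly above 1.0 stretch the virtual travel distance.
# Names and values are illustrative, not the course's required API.

def apply_translational_gain(prev_phys, curr_phys, virt_pos, gain=1.4):
    """Return the new virtual position given a physical head movement.

    Positions are (x, y, z) tuples in metres.
    """
    dx = curr_phys[0] - prev_phys[0]
    dy = curr_phys[1] - prev_phys[1]
    dz = curr_phys[2] - prev_phys[2]
    return (virt_pos[0] + gain * dx,
            virt_pos[1] + dy,          # vertical motion is typically unscaled
            virt_pos[2] + gain * dz)

# A 1 m physical step forward becomes 1.4 m of virtual travel:
virt = apply_translational_gain((0.0, 0.0, 0.0), (0.0, 0.0, 1.0),
                                (0.0, 0.0, 0.0), gain=1.4)
```

Redirected walking extends the same idea to rotations, imperceptibly steering users along curved physical paths while they walk straight virtually.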

Midterm & Final Exams

The exams will contain high-level critical-thinking questions about problems throughout the subfields of XR. They will generally be short-answer questions about how you might approach a particular problem.

Final Project

The final project will consist of exploring, implementing, and/or evaluating a research prototype in XR. Projects can explore applications of XR (e.g., focusing on human factors and user interaction) or investigate more fundamental XR problems, such as body/eye tracking, computer vision, multimodal display and rendering, or sensors. Students may work in small groups or alone, with teams expected to make more substantial contributions.