This New Algorithm Lets You Explore Virtual Reality by Walking Naturally
University of Maryland researchers are developing a novel algorithm that significantly improves the virtual reality (VR) experience by redirecting users to avoid colliding with physical objects.
The project is led by third-year computer science doctoral student Niall Luke Williams, a researcher in the Geometric Algorithms for Modeling, Motion, and Animation (GAMMA) lab. His advisers are Distinguished University Professor Dinesh Manocha and assistant research professor Aniket Bera, both of whom hold appointments in the University of Maryland Institute for Advanced Computer Studies.
“Exploration is a key component of creating immersive experiences in VR,” says Williams. “There is a need to develop different ways for users to easily and naturally explore these expansive virtual environments.”
Commonly used techniques include teleportation—where the user selects a spot to be transported to within the virtual world (like navigating Google Maps in street view)—as well as walking in place, joystick controls, and omnidirectional treadmills. These methods all have their advantages and disadvantages when it comes to the user’s sense of presence and ability to easily explore the virtual world, says Williams.
However, he prefers redirected walking—a technique that allows the user to explore the virtual world by walking naturally—since it’s the most intuitive and comfortable. Yet it’s technically challenging to implement due to users’ varied physical environments.
To create a safer and more immersive virtual experience using redirected walking, Williams designed the alignment-based redirection controller (ARC), an algorithm that steers users along the path where their physical and virtual surroundings are best aligned.
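The idea of alignment can be made concrete with a simple proximity-based measure. The sketch below is an illustrative assumption rather than the exact metric used in ARC: it compares the user’s distance to the nearest obstacle in a few directions in the physical and virtual environments, and treats smaller differences as better alignment. All names and the 30-degree cone are placeholders chosen for clarity.

```python
import math

MAX_SENSE_RANGE = 10.0  # meters; cap so "no obstacle" cases stay comparable

def nearest_obstacle_distance(position, heading, obstacles, angle_offset):
    """Distance from `position` to the closest obstacle roughly along the ray
    at `heading + angle_offset`. Obstacles are point samples here purely for
    illustration; a real environment would use walls or polygons."""
    direction = heading + angle_offset
    best = MAX_SENSE_RANGE
    for ox, oy in obstacles:
        dx, dy = ox - position[0], oy - position[1]
        dist = math.hypot(dx, dy)
        angle_to_obstacle = math.atan2(dy, dx)
        # Keep only obstacles within ~30 degrees of the ray direction.
        diff = abs((angle_to_obstacle - direction + math.pi) % (2 * math.pi) - math.pi)
        if diff < math.radians(30):
            best = min(best, dist)
    return best

def alignment_error(phys_state, virt_state, phys_obstacles, virt_obstacles):
    """Sum of differences in obstacle proximity (front, left, right).
    Lower values mean the physical and virtual surroundings are better aligned."""
    error = 0.0
    for offset in (0.0, math.pi / 2, -math.pi / 2):  # front, left, right
        d_phys = nearest_obstacle_distance(
            phys_state["pos"], phys_state["heading"], phys_obstacles, offset)
        d_virt = nearest_obstacle_distance(
            virt_state["pos"], virt_state["heading"], virt_obstacles, offset)
        error += abs(d_phys - d_virt)
    return error
```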
Using techniques from robot motion planning, ARC lays out the best route for the user to avoid obstacles. By slightly rotating the virtual environment, it encourages the user to subconsciously alter their trajectory, steering them around physical objects and onto the selected path.
If the rotations are small and slow enough, the user doesn’t notice them. But if the rotations are too large or too rapid, they have the opposite effect and degrade the user experience.
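In practice, redirection like this is often implemented with gains that subtly scale the user’s real rotation or inject a slow drift while they walk, clamped so the manipulation stays below perceptual detection thresholds. The sketch below is a generic illustration of that idea under assumed threshold values and made-up function names; it is not the ARC implementation.

```python
# Illustrative rotation-gain redirection step. The constants are assumptions,
# roughly in line with commonly cited detection thresholds, not ARC's values.
MAX_ROTATION_GAIN = 1.24     # amplify real head turns by at most ~24%
MIN_ROTATION_GAIN = 0.67     # dampen real head turns by at most ~33%
MAX_CURVATURE_RATE = 0.045   # radians of injected drift per meter walked

def redirect_step(delta_heading, delta_distance, toward_target):
    """Compute the extra virtual rotation to apply this frame.

    delta_heading: user's real head rotation this frame (radians)
    delta_distance: distance the user walked this frame (meters)
    toward_target: +1 to steer the user left, -1 to steer right
    """
    # Scale the user's own rotation when it already turns them toward the target.
    gain = MAX_ROTATION_GAIN if delta_heading * toward_target > 0 else MIN_ROTATION_GAIN
    rotation_from_gain = delta_heading * (gain - 1.0)

    # Inject a small curvature while the user walks, capped per meter traveled.
    rotation_from_curvature = toward_target * MAX_CURVATURE_RATE * delta_distance

    # The sum is applied to the virtual camera; kept small so it goes unnoticed.
    return rotation_from_gain + rotation_from_curvature
```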
Williams tested ARC in three different environments with two types of controllers. He found that it performed considerably better than existing state-of-the-art algorithms in every context.
“This is significant because it goes against what we used to think is the case,” he says. “By leveraging this idea of alignment, measuring the differences between the physical and virtual worlds, and incorporating that into our algorithms, we can achieve significant benefits for both the algorithm performance as well as the user experience.”
Earlier this year at the IEEE Conference on Virtual Reality and 3D User Interfaces, Williams received Honorable Mention for the first accepted paper toward his dissertation, “ARC: Alignment-based Redirection Controller for Redirected Walking in Complex Environments.”
In August, he presented the same paper at SIGGRAPH VR 2021—the premier conference for computer graphics and interactive techniques worldwide. A video of his full presentation is available here.
In October, Williams will present a related paper titled "Redirected Walking in Static and Dynamic Scenes Using Visibility Polygons" at the IEEE International Symposium on Mixed and Augmented Reality. All three conferences are virtual due to the ongoing COVID-19 pandemic.
—Story by Maria Herd
The Department welcomes comments, suggestions and corrections. Send email to editor [-at-] cs [dot] umd [dot] edu.