PhD Proposal: Foveated rendering in virtual environments

Talk
Xiaoxu Meng
Time: 05.06.2019, 16:30 to 18:30
Location: IRB 3256

Foveated rendering coupled with eye tracking has the potential to dramatically accelerate interactive 3D graphics with minimal loss of perceptual detail. Eye-tracking-guided kernel foveated rendering can reconcile the competing goals of interactive rendering and perceptual realism.

First, I present a new foveated rendering technique, Kernel Foveated Rendering (KFR), which parameterizes foveated rendering by embedding polynomial kernel functions in log-polar space. This GPU-driven technique uses parameterized foveation that mimics the distribution of photoreceptors in the human retina. I present a two-pass kernel foveated rendering pipeline that maps well onto modern GPUs. I have carried out user studies to empirically identify the KFR parameters and have observed a 2.8x-3.2x speedup in rendering on 4K displays.

Second, I explore rendering acceleration through foveation for 4D light fields, which capture both spatial and angular information of light rays, thus enabling free-viewpoint rendering and custom selection of the focal plane. I optimize the KFR algorithm by adjusting the weight of each slice in the light field, so that it automatically selects the optimal foveation parameters for different images according to the gaze position. I have validated this approach on the rendering of light fields through both quantitative experiments and user studies, achieving speedups of 3.47x-7.28x for different levels of foveation and different rendering resolutions.

This proposal will also explore a variety of other features of the human visual system and how they can be combined with foveated rendering to further accelerate rendering for virtual environments. I plan to validate these techniques with real-world applications of virtual environments in surgery and education.
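As a rough illustration of the kernel log-polar idea behind KFR, the Python sketch below maps a full-resolution screen pixel into a reduced-resolution log-polar buffer and back. It is a minimal sketch under stated assumptions, not the pipeline presented in the talk: the single-term power kernel K(x) = x**alpha, the parameter alpha, and the buffer sizes are illustrative stand-ins for the polynomial kernels and parameters KFR actually uses.

# Minimal sketch of a kernel log-polar mapping in the spirit of KFR.
# Illustrative only: the power kernel K(x) = x**alpha and the parameter
# names (alpha, fovea, lp) are assumptions, not the talk's exact formulation.
import math

def to_kernel_logpolar(x, y, fovea, screen, lp, alpha):
    """Map a full-resolution pixel (x, y) to reduced log-polar buffer coords."""
    dx, dy = x - fovea[0], y - fovea[1]
    max_r = math.hypot(screen[0], screen[1])   # upper bound on distance from the fovea to any pixel
    r = max(math.hypot(dx, dy), 1e-6)
    # Normalized log radius in [0, 1], then reshaped by the inverse kernel
    # so that more log-polar samples land near the fovea (for alpha > 1).
    s = max(math.log(r) / math.log(max_r), 0.0)
    u = (s ** (1.0 / alpha)) * lp[0]
    v = ((math.atan2(dy, dx) + math.pi) / (2.0 * math.pi)) * lp[1]
    return u, v

def from_kernel_logpolar(u, v, fovea, screen, lp, alpha):
    """Inverse map from log-polar buffer coords back to screen space."""
    max_r = math.hypot(screen[0], screen[1])
    r = max_r ** ((u / lp[0]) ** alpha)        # apply the kernel K(x) = x**alpha
    theta = (v / lp[1]) * 2.0 * math.pi - math.pi
    return fovea[0] + r * math.cos(theta), fovea[1] + r * math.sin(theta)

if __name__ == "__main__":
    screen, lp, fovea, alpha = (3840, 2160), (1280, 720), (1920, 1080), 4.0
    u, v = to_kernel_logpolar(2000, 1100, fovea, screen, lp, alpha)
    print((u, v), from_kernel_logpolar(u, v, fovea, screen, lp, alpha))

In a GPU implementation, the first pass would render into the smaller log-polar buffer using the forward mapping, and the second pass would apply the inverse mapping to reconstruct the full-resolution image; the round-trip in the demo above only checks that the two mappings are consistent.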
Examining Committee:

Chair: Dr. Amitabh Varshney
Dept. rep: Dr. Joseph F. JaJa
Members: Dr. Matthias Zwicker