Human-Computer Interaction

Improved Filtering for Rendering Virtual Reality Photos (VR180, VR360)


This project is already assigned.

Motivation

With affordable high-quality standalone head-mounted displays (HMDs), virtual reality is ready to move into the mass consumer market. With modern VR cameras, a single shot can capture an image covering the full 360°x180° sphere around the camera (VR360). Other cameras have two wide-angle, forward-facing lenses and capture stereoscopic content covering a 180°x180° half-sphere (VR180). The two most common storage formats are fisheye and equirectangular.


Figure 1: Example of two 190° fisheye shots for stereo (each covering more than the required 180°)
Figure 2: Example of a VR180 stereo image with each half stored in an equirectangular format covering 180°x180° of content.


When these images are displayed in virtual reality, the content, e.g., in the equirectangular format, is rendered onto a virtual sphere for VR360 or a half-sphere for VR180. The equirectangular format, comparable to a world map projected onto a rectangle, introduces various distortions.
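To make this mapping concrete, here is a minimal C++ sketch of how a normalized view direction could be converted into equirectangular texture coordinates. The conventions used (forward along -z, up along +y, longitude via atan2, latitude via asin) are assumptions for illustration only and may differ between cameras and players.

```cpp
#include <cmath>

// Illustrative sketch: map a normalized view direction (x, y, z) to
// equirectangular texture coordinates in [0, 1]^2 for a VR360 image.
// The coordinate conventions (forward = -z, up = +y) are assumptions.
struct UV { float u; float v; };

UV directionToEquirectUV(float x, float y, float z) {
    const float kPi = 3.14159265358979f;
    float longitude = std::atan2(x, -z);      // [-pi, pi], 360° horizontal span
    float latitude  = std::asin(y);           // [-pi/2, pi/2], 180° vertical span
    return { longitude / (2.0f * kPi) + 0.5f,
             latitude  / kPi          + 0.5f };
}
```

For VR180, only the front half-sphere (longitude within ±90°) would be mapped onto one half of the stereo image.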


Figure 3: Tissot’s Indicatrices of Distortion. (Source: https://en.wikipedia.org/wiki/File:Plate_Carr%C3%A9e_with_Tissot%27s_Indicatrices_of_Distortion.svg, image can be freely used with the link to the license Creative Commons Attribution-Share Alike 4.0 International)


To avoid aliasing during rendering, the techniques of mipmapping and trilinear filtering are often used.

Figure 4: Mipmaps of a checkerboard texture (Source: https://textureingraphics.wordpress.com/what-is-texture-mapping/anti-aliasing-problem-and-mipmapping/)


Mipmapping and trilinear filtering can reduce rendering artifacts caused by undersampling, such as the moiré patterns shown in Figure 5.
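As a rough sketch of what trilinear filtering does (simplified here to a single isotropic level-of-detail value; real GPUs derive the screen-space derivatives per pixel quad), the mip level follows from how many texels one screen pixel covers:

```cpp
#include <algorithm>
#include <cmath>

// Simplified, illustrative mip-level (LOD) selection: estimate how many
// texels a screen pixel covers from the screen-space derivatives of the
// texture coordinates (given here in texel units) and take log2 of it.
// Trilinear filtering then blends between levels floor(lod) and floor(lod) + 1.
float computeLod(float dudx, float dvdx, float dudy, float dvdy) {
    float footprintX = std::sqrt(dudx * dudx + dvdx * dvdx);
    float footprintY = std::sqrt(dudy * dudy + dvdy * dvdy);
    float texelsPerPixel = std::max(footprintX, footprintY);
    return std::max(0.0f, std::log2(texelsPerPixel)); // 0 = finest mip level
}
```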


Figure 5: Moiré patterns on a rendering of a checkerboard floor (Source: https://textureingraphics.wordpress.com/what-is-texture-mapping/anti-aliasing-problem-and-mipmapping/)


However, mipmapping assumes a texture with a uniform amount of detail across all areas. As the Tissot indicatrices in Figure 3 show, this is NOT the case for equirectangular images. Standard methods for rendering equirectangular (and fisheye) content in VR therefore often over- and undersample the content, leading to undesired blur or high-frequency flickering and moiré effects.
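One way to quantify the mismatch (an illustrative sketch, not the method to be developed in this project): in an equirectangular image every row stores the same number of texels, but toward the poles those texels cover a much smaller arc on the sphere, so the effective horizontal sampling density grows with 1/cos(latitude) while the vertical density stays constant.

```cpp
#include <cmath>

// Illustrative sketch: effective horizontal sampling density of an
// equirectangular image (width spanning 360° of longitude) as a function
// of latitude. The 1/cos(latitude) growth toward the poles (diverging at
// exactly ±90°) is the anisotropy that standard isotropic mipmapping
// cannot represent, leading to blur or aliasing.
float horizontalTexelsPerDegreeOfArc(float widthTexels, float latitudeRadians) {
    float texelsPerDegreeOfLongitude = widthTexels / 360.0f;
    return texelsPerDegreeOfLongitude / std::cos(latitudeRadians);
}
```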

Tasks

During this project, an improved rendering method shall be developed that renders VR180 and VR360 images in an ideally sampled way, most likely implemented as a shader program. The final version should run on a modern virtual reality head-mounted display and automatically take into account the input image resolution and the HMD display resolution, including the internal oversampled rendering resolution, for an ideal display of such media.
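To illustrate the resolution matching mentioned above, a hedged sketch (all parameter names and the simple per-axis model are assumptions, not part of the project specification) comparing the angular resolution of the source image with that of the HMD eye buffer; a ratio above 1 indicates minification and thus the need for careful filtering, a ratio below 1 magnification:

```cpp
// Illustrative sketch: ratio of source texels per degree to display pixels
// per degree of viewing angle.
float sourceToDisplayResolutionRatio(float sourceWidthTexels,        // equirectangular width covering 360°
                                     float eyeBufferWidthPixels,     // render-target width incl. oversampling
                                     float displayHorizontalFovDeg)  // horizontal field of view of the HMD
{
    float sourceTexelsPerDegree  = sourceWidthTexels / 360.0f;
    float displayPixelsPerDegree = eyeBufferWidthPixels / displayHorizontalFovDeg;
    return sourceTexelsPerDegree / displayPixelsPerDegree;
}
```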

Prerequisites

• Excellent skills in C++ or C#, and in Unity

• Math knowledge for understanding the fisheye and equirectangular projection formats

• Familiarity with Virtual Reality devices

• Interest in VR photography is a plus

Execution

The project would be carried out together with the VR startup “immerVR GmbH” near Erlangen. It could take the form of an industry cooperation for a Master’s thesis or an internship (“Pflichtpraktikum”), starting now.



Contact Persons at the University of Würzburg

Prof. Dr. Sebastian von Mammen (Primary Contact Person)
Mensch-Computer-Interaktion, Universität Würzburg
sebastian.von.mammen@uni-wuerzburg.de

Daniel Pohl, CEO immerVR GmbH (Primary Contact Person)
Contact through website
