Virtual Reality Rendering Framework

This framework is based on liblava and was extended to include the features necessary for developing and evaluating new stereo rendering strategies. A special focus of the framework is remote rendering for standalone consumer HMDs over WiFi.

In order to support multiple APIs as well as local and remote rendering, a Headset interface was introduced with multiple implementations.
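
As a rough illustration, such an abstraction could look like the sketch below. All type and member names here are assumptions chosen for illustration, not the framework's actual API.

```cpp
// Hypothetical sketch of a headset abstraction; names and signatures are
// illustrative only and do not reflect the framework's real interface.
#include <array>
#include <cstdint>

struct eye_matrices {
    std::array<float, 16> view;        // per-eye view matrix
    std::array<float, 16> projection;  // per-eye projection matrix
};

class headset {
public:
    virtual ~headset() = default;

    virtual bool setup() = 0;                               // connect to the runtime or remote client
    virtual std::array<eye_matrices, 2> query_views() = 0;  // current left/right eye transforms
    virtual void submit_frame(uint64_t left_image,
                              uint64_t right_image) = 0;    // present locally or encode and stream
    virtual bool is_remote() const = 0;                     // local HMD vs. client reached over WiFi
};
```

The value of such an interface is that the rendering code does not need to know whether the target is a locally attached HMD or a standalone device streamed to over WiFi.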

Rendering

This framework is intended to investigate different stereo rendering strategies. Thus, a stereo strategy interface was created to easily switch between different strategies and compare them; the Depth Peeling Reprojection described below is one of the implemented strategies.
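
A hedged sketch of what such a strategy interface might look like follows; the class and method names are assumptions, not the framework's actual declarations.

```cpp
// Illustrative stereo strategy interface; all names are assumptions.
#include <memory>
#include <string>

class stereo_strategy {
public:
    virtual ~stereo_strategy() = default;

    virtual std::string name() const = 0;  // label used when comparing strategies
    virtual bool create() = 0;             // allocate render targets and pipelines
    virtual void render() = 0;             // produce the images for both eyes
    virtual void destroy() = 0;            // release resources
};

// Switching strategies then amounts to swapping the active instance,
// e.g. active = make_strategy("depth_peeling_reprojection");
std::unique_ptr<stereo_strategy> make_strategy(std::string const& name);
```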

In general, the framework supports shadow mapping and approximate global illumination using light propagation volumes. To evaluate the performance and quality of the different rendering techniques, the framework provides utility functions for measuring GPU times and capturing images for external comparison against a ground truth.
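
For GPU timings, the standard Vulkan mechanism is timestamp queries. The sketch below shows that mechanism in isolation; the function names are illustrative and not necessarily how the framework's own utilities are structured.

```cpp
// Measuring GPU time with Vulkan timestamp queries (illustrative sketch).
#include <vulkan/vulkan.h>
#include <cstdint>

VkQueryPool create_timestamp_pool(VkDevice device, uint32_t query_count) {
    VkQueryPoolCreateInfo info{};
    info.sType = VK_STRUCTURE_TYPE_QUERY_POOL_CREATE_INFO;
    info.queryType = VK_QUERY_TYPE_TIMESTAMP;
    info.queryCount = query_count;
    VkQueryPool pool = VK_NULL_HANDLE;
    vkCreateQueryPool(device, &info, nullptr, &pool);
    return pool;
}

// Inside a command buffer: bracket the work to be measured with two timestamps.
void record_gpu_timer(VkCommandBuffer cmd, VkQueryPool pool) {
    vkCmdResetQueryPool(cmd, pool, 0, 2);
    vkCmdWriteTimestamp(cmd, VK_PIPELINE_STAGE_TOP_OF_PIPE_BIT, pool, 0);
    // ... record the draw calls to be measured ...
    vkCmdWriteTimestamp(cmd, VK_PIPELINE_STAGE_BOTTOM_OF_PIPE_BIT, pool, 1);
}

// After the command buffer has finished on the GPU, read back the elapsed time.
// timestamp_period_ns comes from VkPhysicalDeviceLimits::timestampPeriod.
double read_gpu_time_ms(VkDevice device, VkQueryPool pool, float timestamp_period_ns) {
    uint64_t timestamps[2] = {};
    vkGetQueryPoolResults(device, pool, 0, 2, sizeof(timestamps), timestamps,
                          sizeof(uint64_t),
                          VK_QUERY_RESULT_64_BIT | VK_QUERY_RESULT_WAIT_BIT);
    return double(timestamps[1] - timestamps[0]) * timestamp_period_ns * 1e-6;
}
```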

Depth Peeling Reprojection

Depth Peeling Reprojection is a rendering technique that aims to reduce the duplicate shading that occurs when rendering separate images for the left and right eye in virtual reality applications. Instead of rendering the scene from two perspectives, it renders the first two depth layers from a single perspective, similar to Mara et al., Deep G-Buffers for Stable Global Illumination Approximation. The goal of this approach is to have more information available when reprojecting the resulting images and, thus, to produce fewer artifacts in disoccluded regions. Especially when streaming the result wirelessly to remote clients, it is critical to have reprojection strategies that handle lost or delayed frames gracefully.
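
The core of the second peeling pass is a per-fragment test against the first layer's depth buffer: only fragments strictly behind the already captured front surface are kept. Below is a minimal C++ sketch of that test, assuming a single front-layer depth buffer; in the framework this logic would live in a shader, and all names here are illustrative.

```cpp
// Per-fragment peel test for extracting the second depth layer (illustrative sketch).
#include <cstddef>
#include <vector>

struct depth_buffer {
    std::size_t width = 0;
    std::vector<float> depth;  // normalized depth of the first (front-most) layer

    float at(std::size_t x, std::size_t y) const { return depth[y * width + x]; }
};

// A fragment belongs to the second layer only if it lies strictly behind the
// first layer's depth at the same pixel; the epsilon avoids self-peeling due
// to limited depth precision.
bool keep_for_second_layer(float fragment_depth,
                           std::size_t x, std::size_t y,
                           depth_buffer const& first_layer,
                           float epsilon = 1e-5f) {
    return fragment_depth > first_layer.at(x, y) + epsilon;
}
```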