PhD

My PhD mainly focuses on view interpolation and light field displays. A light field is a representation of the light that passes through a scene, while a light field display is a system that reconstructs that light field in the real world. Light field displays can be thought of as advanced 3D displays that simultaneously project a multitude of images into slightly different directions. By doing so, these displays not only stimulate stereo depth perception, but also other depth perception cues, such as eye accommodation (focusing at different depths) and motion parallax (the observer can view the scene from different directions).
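A common way to make this concrete is the two-plane parameterization, which stores the light field as a 4D array indexed by view position and pixel position. The snippet below is a toy sketch of that idea, with made-up array sizes and a synthetic gradient standing in for real image data; it only illustrates that each slice of the 4D array is one of the views the display emits in a particular direction.

```python
import numpy as np

# Toy 4D light field: L[s, t, u, v] with view indices (s, t) and
# pixel coordinates (u, v). Sizes here are illustrative, not realistic.
n_views, height, width = 5, 4, 6
light_field = np.zeros((n_views, n_views, height, width))

# Fill each view with a horizontal gradient shifted by the view index,
# loosely mimicking the parallax between neighbouring views.
for s in range(n_views):
    for t in range(n_views):
        light_field[s, t] = np.arange(width)[None, :] + t

# The view seen from one particular direction is a simple 2D slice:
center_view = light_field[n_views // 2, n_views // 2]
print(center_view.shape)  # one (height, width) image per direction
```

A multi-view display has to produce every such slice at once, which is why the methods below are needed to synthesize the many views from only a few input images.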

The work in my PhD is two-fold. First, I worked on depth extraction and view interpolation methods that generate the data light field displays require from a limited set of input images. These methods work for so-called multi-view light field displays as well as for integral imaging displays; the two types require differently structured input. Later, during my internship at the National Institute of Information and Communications Technology (NICT), Japan, I also worked on the creation of a transparent augmented reality light field display.

Depth Estimation and View Interpolation for Light Field Displays

View interpolation is the process of creating new images of a scene at positions different from those of the input views. The process extracts geometric information about the scene from the input images and, using that geometry together with the color information of the input images, generates images at new viewpoints.

Within my PhD, I worked on techniques to increase the baseline between the input cameras and on in-scene view synthesis algorithms suited to integral imaging light field displays.

The videos below give an example of view synthesis output that can serve as input for multi-view light field displays. The first video shows the input images, while the second shows the synthesized sequence. The camera movement is much smoother in the second video because of the large number of generated images.

Input Frames: 6 frames, from left to right. Dataset from Kim, Changil, et al. “Scene reconstruction from high spatio-angular resolution light fields.” ACM Trans. Graph. 32.4 (2013): 73-1.
Interpolated output: 6 frames interpolated to 400 frames.

The following image shows an integral image generated from a subset of Stanford’s “Bulldozer” light field dataset. The small elemental images are generated by placing the virtual camera on the display plane, which is located inside the scene. The full image should be displayed on a compatible integral imaging light field display to reconstruct the light field and observe the parallax effect. This in-scene view synthesis approach was developed during my internship.

Integral image generated from Stanford’s “Bulldozer” dataset. The display plane was set halfway through the bulldozer.
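The sampling pattern behind such an integral image can be sketched as a rearrangement of the 4D light field: each lenslet gets one small elemental image, and the pixels inside it store the rays leaving that scene point in the different directions. The code below shows only this interleaving with toy array sizes; it is an illustrative assumption, not the actual in-scene rendering pipeline, which additionally renders the views from virtual cameras on the display plane.

```python
import numpy as np

def to_integral_image(light_field):
    """Rearrange a 4D light field L[s, t, u, v] into an integral image.

    One elemental image per lenslet position (u, v); each direction
    sample (s, t) lands at a fixed offset inside every elemental image.
    Minimal sketch of the sampling pattern only.
    """
    n_s, n_t, h, w = light_field.shape
    out = np.zeros((h * n_s, w * n_t))
    for s in range(n_s):
        for t in range(n_t):
            # Strided assignment places sample (s, t) of every
            # elemental image in one vectorized step.
            out[s::n_s, t::n_t] = light_field[s, t]
    return out

toy_lf = np.random.default_rng(0).random((3, 3, 4, 5))
integral = to_integral_image(toy_lf)  # (12, 15) interleaved mosaic
```

When shown behind a matching lenslet array (or, below, a micro-mirror array), each elemental image is optically redirected so that every direction sample reaches the corresponding viewing angle.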

Augmented Reality Light Field Display

This work was achieved during my internships at the National Institute of Information and Communications Technology (NICT), Japan.

The display is a projection-based integral imaging light field display consisting of a consumer-grade 4K projector and a Digitally Designed Holographic Optical Element (DDHOE) that acts as the projection screen. The DDHOE resembles a grid of concave micro-mirrors that reflect the light of the different views into the correct directions.

To achieve a correct reconstruction of the presented scene, a highly accurate calibration is required to ensure that the projection aligns well with the micro-mirrors. Without such a calibration, the reconstruction would contain cracks, or it would be distorted by the projector lens and by perspective distortion.
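The geometric core of such a calibration is estimating a mapping from projector pixels to observed micro-mirror positions from a set of point correspondences. As a hedged illustration, the sketch below fits a plane-to-plane homography with the standard direct linear transform; this is a stand-in for the idea only, since the actual calibration also has to model projector lens distortion, which a plain homography cannot capture.

```python
import numpy as np

def fit_homography(src, dst):
    """Estimate the 3x3 homography H with dst ~ H @ src (DLT + SVD).

    `src`: projector pixel coordinates, `dst`: observed positions of the
    corresponding features (e.g. micro-mirror centres). At least four
    non-degenerate correspondences are required.
    """
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The homography is the null-space vector of A: last row of V^T.
    _, _, vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]
```

With the fitted `H`, each view's pixels can be pre-warped before projection so they land on the intended micro-mirrors, removing the perspective component of the misalignment.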

Recording of the designed AR light field display illustrating parallax. Supplementary material from our paper “Digitally designed holographic optical element for light field displays”, 2018
Recording of the designed AR light field display. Supplementary material from our paper “Digitally designed holographic optical element for light field displays”, 2018
Recording of the designed AR light field display illustrating eye accommodation at different depth layers. Supplementary material from our paper “Digitally designed holographic optical element for light field displays”, 2018