We propose a hogel-free approach to high-quality true 3D holographic display with accurate depth- and view-dependent effects, without sacrificing spatio-angular resolution. Existing approaches compute sub-holograms, so-called hogels, to achieve such effects. However, the hogel size is typically scene-dependent and imposes a trade-off between the angular and spatial resolution of the holographic imagery. We lift these limitations by formulating a holographic forward model and phase retrieval method that takes RGB-D light fields as input and directly optimizes the target phase, without spatial segmentation into hogels or heuristic phase encoding. The proposed method achieves high-quality 3D holograms with accurate parallax and depth focus effects.
Holography is a promising avenue for high-quality displays without requiring bulky, complex optical systems. While recent work has demonstrated accurate hologram generation of 2D scenes, high-quality holographic projections of 3D scenes have been out of reach until now. Existing multiplane 3D holography approaches fail to model wavefronts in the presence of partial occlusion, while holographic stereogram methods must make a fundamental trade-off between spatial and angular resolution. In addition, existing 3D holographic display methods rely on heuristic encoding of complex amplitude into phase-only pixels, which results in holograms with severe artifacts. Fundamental limitations of the input representation, wavefront modeling, and optimization methods prohibit artifact-free 3D holographic projections in today's displays. To lift these limitations, we introduce hogel-free holography, which optimizes for true 3D holograms, supporting both depth- and view-dependent effects for the first time. Our approach overcomes the fundamental spatio-angular resolution trade-off typical of stereogram approaches. Moreover, it avoids heuristic encoding schemes to achieve high image fidelity over a 3D volume. We validate that the proposed method achieves a 10 dB PSNR improvement on simulated holographic reconstructions. We also validate our approach on an experimental prototype with accurate parallax and depth focus effects.
Praneeth Chakravarthula, Ethan Tseng, Henry Fuchs, Felix Heide
Light Fields and Wavefronts
The angular rays of a light field are a coarse sampling of a continuous wavefront. The rays reaching the eye can therefore be regarded as a single complex wavefront traveling from the scene, sampled at the eye pupil before being focused onto the retina to form the image.
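This correspondence can be sketched numerically: each angular ray maps to a tilted plane wave, and summing the tilted plane waves weighted by the per-view light-field amplitudes yields one complex wavefront. All parameters and the `wavefront_from_light_field` helper below are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Illustrative parameters (assumed, not from the paper)
wavelength = 532e-9              # green laser, meters
k = 2 * np.pi / wavelength       # wavenumber
N = 128                          # spatial samples per axis
pitch = 8e-6                     # sample pitch, meters

x = (np.arange(N) - N // 2) * pitch
X, Y = np.meshgrid(x, x)

# Coarse angular sampling of the light field, in radians
angles = np.linspace(-0.01, 0.01, 5)

def wavefront_from_light_field(view_amplitudes):
    """Sum one tilted plane wave per angular ray into a single complex wavefront."""
    field = np.zeros((N, N), dtype=complex)
    for i, tx in enumerate(angles):
        for j, ty in enumerate(angles):
            # Each ray direction corresponds to a linear phase ramp (a plane wave)
            tilt = np.exp(1j * k * (np.sin(tx) * X + np.sin(ty) * Y))
            field += view_amplitudes[i, j] * tilt
    return field

views = np.random.default_rng(0).random((5, 5))  # stand-in per-view amplitudes
wf = wavefront_from_light_field(views)
```

The coarse angular grid here is what a stereogram would quantize into hogels; summing the plane waves over the full aperture instead keeps the wavefront continuous.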
Inverting Light Field via Continuous Volume Optimization
We optimize the underlying wave field of the target light field via continuous volume optimization: wavefronts inverted from the light field over an arbitrary continuous volume are matched against the wavefronts modulated by a phase-only SLM. The phase of a wavefront manifests as amplitude variations as the wave propagates through a continuous volume. Matching the wave evolution over such a volume is therefore equivalent to solving for the full complex wavefront.
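A minimal numerical sketch of this idea, assuming angular-spectrum propagation and a plain Wirtinger-style gradient-descent loop (in the spirit of the authors' Wirtinger Holography work; the paper's actual forward model and optimizer may differ): the SLM phase is optimized directly so that the propagated field matches target amplitudes at several planes sampling the volume. All parameters, depths, and target amplitudes are illustrative.

```python
import numpy as np

# Illustrative setup (wavelength, pitch, resolution, and depths are assumptions)
wl, pitch, N = 532e-9, 8e-6, 64
fx = np.fft.fftfreq(N, d=pitch)
FX, FY = np.meshgrid(fx, fx)
kz = 2 * np.pi * np.sqrt(np.maximum(0.0, (1 / wl) ** 2 - FX ** 2 - FY ** 2))

def prop(u, z):
    """Angular-spectrum propagation of field u by distance z."""
    return np.fft.ifft2(np.fft.fft2(u) * np.exp(1j * kz * z))

def prop_adjoint(u, z):
    """Adjoint of prop (conjugate transfer function), for backpropagating gradients."""
    return np.fft.ifft2(np.fft.fft2(u) * np.exp(-1j * kz * z))

depths = [10e-3, 12e-3, 15e-3]                  # planes sampling the target volume
rng = np.random.default_rng(0)
targets = [rng.random((N, N)) for _ in depths]  # stand-in target amplitudes

phi = rng.uniform(0, 2 * np.pi, (N, N))         # phase-only SLM pattern to optimize
lr, losses = 1e-2, []
for _ in range(50):
    u_in = np.exp(1j * phi)
    grad, loss = np.zeros((N, N)), 0.0
    for z, a in zip(depths, targets):
        u = prop(u_in, z)
        amp = np.abs(u)
        loss += np.sum((amp - a) ** 2)
        g = (amp - a) * u / (amp + 1e-12)            # Wirtinger gradient w.r.t. conj(u)
        g_in = prop_adjoint(g, z)                    # backpropagate to the SLM plane
        grad += -2 * np.imag(np.conj(g_in) * u_in)   # chain rule through u_in = exp(i*phi)
    losses.append(loss)
    phi -= lr * grad                                 # plain gradient descent on the phase
```

Because the loss is defined jointly over all depth planes, the optimization solves for one phase pattern whose wave evolution matches the target throughout the volume, with no per-hogel segmentation and no heuristic complex-to-phase encoding step.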
Hogel-free Holography vs Tensor Holography
The state-of-the-art 3D Tensor Holography [Shi et al. 2021] does not model physically accurate occlusion. As a result, its holographic projections exhibit visible light leakage from the background into the foreground, as well as ringing artifacts at depth discontinuities and occlusion edges, such as the blades of grass in this scene. This physically inaccurate modeling also produces incorrect defocus effects. Our method eliminates these artifacts by accurately modeling wave propagation at depth discontinuities and occlusion edges, yielding correct parallax and defocus effects.
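One common way to model occlusion in layered wave propagation is silhouette masking, shown below only as an illustrative stand-in for the paper's occlusion handling: the field is propagated back to front, and at each layer the occluded portion of the incoming field is replaced by that layer's own emission. All layers, masks, and parameters are synthetic assumptions.

```python
import numpy as np

wl, pitch, N = 532e-9, 8e-6, 64          # illustrative parameters
fx = np.fft.fftfreq(N, d=pitch)
FX, FY = np.meshgrid(fx, fx)
kz = 2 * np.pi * np.sqrt(np.maximum(0.0, (1 / wl) ** 2 - FX ** 2 - FY ** 2))

def prop(u, z):
    """Angular-spectrum propagation of field u by distance z."""
    return np.fft.ifft2(np.fft.fft2(u) * np.exp(1j * kz * z))

rng = np.random.default_rng(1)
# Synthetic layers ordered far to near; each has an amplitude and an opacity mask
layers = []
for _ in range(3):
    amp = rng.random((N, N))
    mask = (rng.random((N, N)) > 0.7).astype(float)  # opaque where mask == 1
    layers.append((amp, mask))

dz = 1e-3                                # spacing between adjacent layers
field = np.zeros((N, N), dtype=complex)
for amp, mask in layers:                 # accumulate back to front
    field = prop(field, dz)                       # light arriving from farther layers
    field = (1 - mask) * field + mask * amp       # occluded light is blocked, not summed
```

Without the masking step, background light would pass through foreground objects unattenuated, producing exactly the leakage and ringing artifacts described above.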
Experimental Validation of Image Quality and Continuous Focus
We demonstrate high-quality 3D holograms on our experimental prototype, validating that the proposed method achieves observable improvements over previous methods.
Experimental Validation of Parallax Cues
Our hogel-free holography method produces true 3D holograms with accurate parallax effects. For this forest scene, we observe a change in the position of the background trees relative to the foreground grass (orange inset) and a change in the occlusion of the blades of grass (red inset). Please see the Supplementary Video for further visualization of the parallax effects.
Praneeth Chakravarthula, Yifan Peng, Joel Kollin, Henry Fuchs and Felix Heide. Learned Hardware-in-the-loop Phase Retrieval for Holographic Near-Eye Displays. ACM Transactions on Graphics (TOG), 2020.
Praneeth Chakravarthula, Yifan Peng, Joel Kollin, Henry Fuchs and Felix Heide. Wirtinger Holography for Near-Eye Displays. ACM Transactions on Graphics (TOG), 2019.