Dual Exposure Stereo for Extended Dynamic Range 3D Imaging

  • Juhyung Choi

  • Jinnyeong Kim

  • Jinwoo Lee

  • Samuel Brucker

  • Mario Bijelic

  • Felix Heide

  • Seung-Hwan Baek

CVPR 2025

We introduce dual-exposure stereo, a method for extended dynamic range (DR) 3D imaging. In contrast to the single-exposure capture shown in (a), we synchronously control dual exposures on the stereo camera to expand the effective DR of 3D imaging. From the captured dual-exposure stereo images in (b), we estimate a disparity map in (d) that preserves details in both the under- and over-exposed images—details that cannot be faithfully reconstructed in the one-exposure results shown in (c). LiDAR ground truth is shown in (e).

Achieving robust stereo 3D imaging under diverse illumination conditions is challenging due to the limited dynamic range of conventional cameras, causing existing stereo depth estimation methods to suffer from under- or over-exposed images. In this paper, we propose dual-exposure stereo that combines auto-exposure control and dual-exposure bracketing to achieve stereo 3D imaging with extended dynamic range. Specifically, we capture stereo image pairs with alternating dual exposures, which automatically adapt to scene illumination and effectively distribute the scene dynamic range across the dual-exposure frames. We then estimate stereo depth from these dual-exposure stereo images by compensating for motion between consecutive frames. To validate our approach, we develop a robotic vision system, acquire real-world HDR stereo video datasets, and generate additional synthetic datasets. Experimental results demonstrate that our method outperforms existing exposure control methods.


Synthetic Dataset

Synthetic HDR stereo dataset generated using the CARLA simulator. The video shows a sequence of left images and corresponding ground-truth depth maps.

Real Dataset

Real-world stereo-LiDAR dataset captured across diverse scenes. Includes tone-mapped stereo images and sparse LiDAR ground-truth, recorded with a calibrated mobile camera system.

Auto Dual Exposure Control

The video showcases our Auto Dual Exposure Control (ADEC). ADEC analyzes the histogram and skewness of the two captured images to adaptively adjust exposure values based on scene characteristics. If the scene's dynamic range exceeds that of the camera, ADEC widens the exposure gap to capture complementary highlight and shadow details. If the scene's dynamic range fits within the camera's limits, or the case is uncertain, ADEC adjusts each exposure toward a balanced state by minimizing skewness. This dynamic adjustment optimizes the exposure distribution, improving performance in downstream tasks such as depth estimation in extreme-dynamic-range scenes.
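The control loop above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the helper names (`skewness`, `adec_step`), the clipping thresholds, and the fixed EV step size are all assumptions; the actual ADEC decision rule and update magnitudes may differ.

```python
import numpy as np

def skewness(img):
    """Sample skewness of the intensity distribution (hypothetical helper)."""
    x = img.ravel().astype(np.float64)
    mu, sigma = x.mean(), x.std() + 1e-8
    return float(np.mean(((x - mu) / sigma) ** 3))

def adec_step(img_lo, img_hi, ev_lo, ev_hi, clip_frac=0.02, step=0.5):
    """One hypothetical ADEC update on two 8-bit captures.

    img_lo / ev_lo: the shorter (darker) exposure, meant for highlights.
    img_hi / ev_hi: the longer (brighter) exposure, meant for shadows.
    Thresholds and step size are illustrative assumptions.
    """
    over = np.mean(img_lo >= 253)   # highlights still clipped in the short exposure
    under = np.mean(img_hi <= 2)    # shadows still crushed in the long exposure
    if over > clip_frac and under > clip_frac:
        # Scene DR exceeds camera DR: widen the exposure gap.
        return ev_lo - step, ev_hi + step
    # Otherwise drive each frame toward a balanced (zero-skew) histogram:
    # positive skew means mass piled at the dark end, so raise that exposure.
    ev_lo += step * np.sign(skewness(img_lo))
    ev_hi += step * np.sign(skewness(img_hi))
    return ev_lo, ev_hi
```

A saturated short exposure paired with a crushed long exposure widens the gap; otherwise each exposure drifts independently toward a balanced histogram.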

Dual-exposure Stereo Estimation

This video illustrates our dual-exposure stereo disparity estimation. We use two stereo image pairs with different exposures to handle challenging lighting conditions. To compensate for object and camera motion between frames, we estimate optical flow and temporally align features. By fusing features from both exposures based on pixel intensity, we construct disparity volumes that capture both highlight and shadow details. This enables robust depth estimation even under extreme dynamic range.
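The flow-alignment and intensity-based fusion steps can be sketched as below. This is a simplified stand-in for the learned pipeline: the Gaussian well-exposedness weight (as in classic exposure fusion), the nearest-neighbor `warp`, and the function names are all assumptions; the paper fuses learned features into disparity volumes rather than raw arrays.

```python
import numpy as np

def exposure_weight(img, mid=0.5, sigma=0.2):
    """Well-exposedness weight: favors pixels near mid-gray.
    (Assumed Gaussian weighting; the actual fusion is learned.)"""
    return np.exp(-((img - mid) ** 2) / (2 * sigma ** 2))

def warp(feat, flow):
    """Backward-warp features with an estimated optical flow
    (nearest-neighbor lookup for brevity; bilinear in practice)."""
    h, w = flow.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    x2 = np.clip(np.round(xs + flow[..., 0]).astype(int), 0, w - 1)
    y2 = np.clip(np.round(ys + flow[..., 1]).astype(int), 0, h - 1)
    return feat[y2, x2]

def fuse_features(feat_lo, feat_hi, img_lo, img_hi):
    """Blend H x W x C features from the two exposures, weighting each
    pixel by how well-exposed it is in that frame (hypothetical fusion)."""
    w_lo = exposure_weight(img_lo)[..., None]
    w_hi = exposure_weight(img_hi)[..., None]
    return (w_lo * feat_lo + w_hi * feat_hi) / (w_lo + w_hi + 1e-8)
```

Per-pixel weighting lets well-exposed regions of each frame dominate the fused representation, so highlight detail comes from the short exposure and shadow detail from the long one.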

Extended Dynamic Range 3D Imaging

The video showcases a side-by-side comparison of image capture and depth estimation results using AverageAE and our proposed ADEC method:
(a) AverageAE-captured video: mimics the behavior of conventional auto-exposure algorithms used in standard cameras.
(b) ADEC-captured video: generated by our proposed dual-exposure control method that adaptively adjusts exposure to better cover the scene’s dynamic range.
(c) Disparity map estimated from the AverageAE input.
(d) Disparity map estimated from the ADEC input.
(e) Pixel intensity histogram of ADEC exposures (orange and purple), overlaid on the scene radiance computed from the raw HDR. This visualization demonstrates how ADEC captures complementary brightness regions, improving depth estimation.

Related Publications

[1] Stefanie Walz, Mario Bijelic, Andrea Ramazzina, Fahim Mannan, Felix Heide. Gated Stereo: Joint Depth Estimation from Gated and Wide-Baseline Active Stereo Cues. CVPR 2023.

[2] Samuel Brucker, Stefanie Walz, Mario Bijelic, Felix Heide. Cross-Spectral Depth Estimation from RGB and NIR-Gated Stereo Images. CVPR 2024.

[3] Anush Kumar, Fahim Mannan, Omid Hosseini Jafari, Shile Li, Felix Heide. Flow-Guided Online Stereo Rectification for Wide Baseline Stereo. CVPR 2024.