HDR-NeRF: High Dynamic Range Neural Radiance Fields

CVPR 2022


Xin Huang1, Qi Zhang2, Ying Feng2, Hongdong Li3, Xuan Wang2, Qing Wang1

1Northwestern Polytechnical University    2Tencent AI Lab     3Australian National University

Abstract



We present High Dynamic Range Neural Radiance Fields (HDR-NeRF) to recover an HDR radiance field from a set of low dynamic range (LDR) views captured with different exposures. With HDR-NeRF, we are able to generate both novel HDR views and novel LDR views under different exposures. The key to our method is to model the physical imaging process, in which the radiance of a scene point is transformed into a pixel value in the LDR image by two implicit functions: a radiance field and a tone mapper. The radiance field encodes the scene radiance (values ranging from 0 to +infinity) and outputs the density and radiance of a ray given the corresponding ray origin and ray direction. The tone mapper models the process by which a ray hitting the camera sensor becomes a pixel value; the color of the ray is predicted by feeding the radiance and the corresponding exposure time into the tone mapper. We use the classic volume rendering technique to project the output radiance, colors, and densities into HDR and LDR images, while only the input LDR images are used as supervision. We collect a new forward-facing HDR dataset to evaluate the proposed method. Experimental results on synthetic and real-world scenes validate that our method can not only accurately control the exposures of synthesized views but also render views with a high dynamic range.
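To make the two implicit functions concrete, the sketch below shows one possible PyTorch parameterization: an MLP that maps a ray sample to density and HDR radiance, and a small MLP tone mapper that maps log-radiance plus exposure time to an LDR color. The layer sizes, the shared-per-channel tone mapper, and the log-domain exposure product are illustrative assumptions, not the paper's exact architecture.

```python
# Hedged sketch of the two implicit functions: an HDR radiance field and a
# tone mapper. Hidden sizes and activations are assumptions for illustration.
import torch
import torch.nn as nn

class HDRRadianceField(nn.Module):
    """Maps a ray sample (position, direction) to density and HDR radiance."""
    def __init__(self, hidden=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(3 + 3, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1 + 3),            # 1 density + 3 radiance channels
        )

    def forward(self, xyz, viewdir):
        out = self.net(torch.cat([xyz, viewdir], dim=-1))
        sigma = torch.relu(out[..., :1])         # density >= 0
        radiance = torch.exp(out[..., 1:])       # radiance in (0, +inf)
        return sigma, radiance

class ToneMapper(nn.Module):
    """Maps (radiance, exposure time) to an LDR color, emulating the CRF."""
    def __init__(self, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(1, hidden), nn.ReLU(),
            nn.Linear(hidden, 1), nn.Sigmoid(),  # LDR value in [0, 1]
        )

    def forward(self, radiance, exposure_t):
        # Radiance scaled by exposure time, processed per channel in the log
        # domain (a common CRF parameterization; an assumption here).
        log_exposure = torch.log(radiance * exposure_t + 1e-8)
        return self.net(log_exposure.unsqueeze(-1)).squeeze(-1)
```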


Pipeline Overview


Architecture

The pipeline of HDR-NeRF models the physical imaging process. Our method consists of two modules: an HDR radiance field that models the target scene's radiance and densities, and a tone mapper that models the camera response function (CRF) to produce colors.
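The two modules are composed along each ray with classic volume rendering, so that one pass yields both an HDR pixel (from the composited radiance) and an LDR pixel (from the composited tone-mapped colors). The sketch below, which reuses the hypothetical modules from the previous snippet, shows this composition; the sampling strategy and bounds are illustrative assumptions.

```python
# Hedged sketch: composite HDR radiance and tone-mapped LDR colors along a
# ray with standard volume rendering. Sampling and bounds are assumptions.
import torch

def render_ray(radiance_field, tone_mapper, origin, direction, exposure_t,
               near=2.0, far=6.0, n_samples=64):
    t = torch.linspace(near, far, n_samples)                   # sample depths
    pts = origin + t[:, None] * direction                      # [N, 3]
    dirs = direction.expand(n_samples, 3)

    sigma, radiance = radiance_field(pts, dirs)                 # [N, 1], [N, 3]
    ldr = tone_mapper(radiance, exposure_t)                     # [N, 3]

    # Standard alpha-compositing weights: w_i = T_i * (1 - exp(-sigma_i * delta_i))
    delta = torch.cat([t[1:] - t[:-1], torch.tensor([1e10])])   # [N]
    alpha = 1.0 - torch.exp(-sigma.squeeze(-1) * delta)
    trans = torch.cumprod(
        torch.cat([torch.ones(1), 1.0 - alpha + 1e-10])[:-1], dim=0)
    weights = alpha * trans                                      # [N]

    hdr_pixel = (weights[:, None] * radiance).sum(dim=0)         # HDR output
    ldr_pixel = (weights[:, None] * ldr).sum(dim=0)              # LDR output
    return hdr_pixel, ldr_pixel
```

Only the LDR pixel is compared against the captured LDR images during training; the HDR pixel is a by-product of the same rendering pass.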


Novel LDR views


HDR-NeRF produces high-fidelity LDR views with varying exposures.


Novel HDR views (Tone-mapped)


Our HDR-NeRF is able to render novel HDR views. The tone-mapped HDR views reveal details in over-exposed and under-exposed areas.


Comparisons of novel LDR/HDR views



Results and Comparisons



Citation


@inproceedings{huang2022hdr,
  title={{HDR-NeRF}: High Dynamic Range Neural Radiance Fields},
  author={Huang, Xin and Zhang, Qi and Feng, Ying and Li, Hongdong and Wang, Xuan and Wang, Qing},
  booktitle={Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition},
  pages={18398--18408},
  year={2022}
}

More


Xin Huang et al. Inverting the Imaging Process by Learning an Implicit Camera Model. CVPR 2023. [Project Page]
Xin Huang et al. Local Implicit Ray Function for Generalizable Radiance Field Representation. CVPR 2023. [Project Page]