Relightable 3D Gaussian: Real-time Point Cloud Relighting with BRDF Decomposition and Ray Tracing


Jian Gao1*, Chun Gu2*, Youtian Lin1, Hao Zhu1, Xun Cao1, Li Zhang2, Yao Yao1
1 Nanjing University, 2 Fudan University
*Equal contribution.

Paper | Code

Teaser: point-based PBR with a decomposed BRDF, and ambient occlusion from point-based ray tracing.

Abstract



We present a novel differentiable point-based rendering framework for material and lighting decomposition from multi-view images, enabling editing, ray tracing, and real-time relighting of the 3D point cloud. Specifically, a 3D scene is represented as a set of relightable 3D Gaussian points, where each point is additionally associated with a normal direction, BRDF parameters, and incident light from different directions. To achieve robust lighting estimation, we further divide the incident light at each point into global and local components, together with view-dependent visibilities. The 3D scene is optimized through the 3D Gaussian Splatting technique, while the BRDF and lighting are decomposed by physically based differentiable rendering. Moreover, we introduce an innovative point-based ray-tracing approach based on the bounding volume hierarchy for efficient visibility baking, enabling real-time rendering and relighting of 3D Gaussian points with accurate shadow effects. Extensive experiments demonstrate improved BRDF estimation and novel view rendering results compared to state-of-the-art material estimation approaches. Our framework showcases the potential to revolutionize the mesh-based graphics pipeline with a relightable, traceable, and editable rendering pipeline based solely on point clouds.
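To make the representation concrete, the sketch below lays out one plausible per-point attribute set matching the description above. All field names and dimensions here are our own assumptions for illustration, not the exact parameterization of the released code:

```python
from dataclasses import dataclass
import torch

@dataclass
class RelightableGaussians:
    """Hypothetical attribute layout for N relightable 3D Gaussian points.

    Field names and shapes are illustrative assumptions; `d` stands for the
    number of spherical-harmonics (SH) coefficients per color channel.
    """
    xyz:        torch.Tensor  # (N, 3) point positions
    scale:      torch.Tensor  # (N, 3) per-axis Gaussian scales
    rotation:   torch.Tensor  # (N, 4) orientation quaternions
    opacity:    torch.Tensor  # (N, 1) alpha used during splatting
    normal:     torch.Tensor  # (N, 3) per-point normal direction
    base_color: torch.Tensor  # (N, 3) BRDF albedo
    roughness:  torch.Tensor  # (N, 1) BRDF roughness
    # Incident light, divided for robust lighting estimation:
    global_sh:  torch.Tensor  # (d, 3)    shared global environment light (SH)
    local_sh:   torch.Tensor  # (N, d, 3) per-point local incident light (SH)
    visibility: torch.Tensor  # (N, d)    baked view-dependent visibility (SH)
```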




Method Overview


The proposed differentiable rendering pipeline. Starting from a collection of 3D Gaussians carrying geometry, material, and lighting attributes, we first evaluate the rendering equation at the Gaussian level to obtain the outgoing radiance toward a designated viewpoint. We then render the corresponding feature maps via rasterization with alpha blending, producing the vanilla color map, the physically based color map, the depth map, the normal map, etc. To optimize the relightable 3D Gaussians, we use the ground-truth image and a pseudo normal map derived from the rendered depth map as supervision.
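For reference, the per-Gaussian shading step described above can be written (in our own notation, not necessarily the paper's exact formulation) as the rendering equation with the incident light split into a visibility-masked global term and a local term, followed by standard alpha blending over the depth-sorted Gaussians:

```latex
L_o(\omega_o) = \int_{\Omega} f(\omega_i, \omega_o)\, L_i(\omega_i)\,
                (\omega_i \cdot \mathbf{n})\, \mathrm{d}\omega_i,
\qquad
L_i(\omega_i) = V(\omega_i)\, L_{\mathrm{global}}(\omega_i)
              + L_{\mathrm{local}}(\omega_i),
\qquad
C = \sum_{k} L_o^{(k)}\, \alpha_k \prod_{j<k} \bigl(1-\alpha_j\bigr).
```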






Novel View Synthesis


Qualitative results of novel view synthesis on the NeRF synthetic dataset and the DTU dataset, compared with baseline methods.




Material, Visibility and Normal


Qualitative results of BRDF estimation. We also visualize the rendered average visibility (ambient occlusion).
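Here, "average visibility" can be read as the usual ambient-occlusion integral over the upper hemisphere (our notation):

```latex
AO(\mathbf{x}) = \frac{1}{\pi} \int_{\Omega} V(\mathbf{x}, \omega)\,
                 (\omega \cdot \mathbf{n})\, \mathrm{d}\omega .
```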




Visibility (Ambient Occlusion) through Ray Tracing


Image sliders comparing physically based renderings (PBR) against the corresponding ray-traced ambient occlusion.
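The sketch below illustrates the idea of point-based ray tracing for visibility baking: Gaussians are approximated by bounding spheres, a median-split BVH accelerates ray queries, and visibility is the accumulated transmittance along the ray. This is a simplified CPU-side sketch under our own assumptions (peak opacity per Gaussian, unit-length ray directions); the actual implementation runs the BVH traversal on the GPU:

```python
import numpy as np

class BVHNode:
    __slots__ = ("lo", "hi", "left", "right", "idx")
    def __init__(self, lo, hi, left=None, right=None, idx=None):
        self.lo, self.hi = lo, hi          # AABB corners of this node
        self.left, self.right = left, right
        self.idx = idx                     # Gaussian indices (leaves only)

def build_bvh(centers, radii, idx=None, leaf_size=8):
    """Median-split BVH over per-Gaussian bounding spheres."""
    idx = np.arange(len(centers)) if idx is None else idx
    lo = (centers[idx] - radii[idx, None]).min(axis=0)
    hi = (centers[idx] + radii[idx, None]).max(axis=0)
    if len(idx) <= leaf_size:
        return BVHNode(lo, hi, idx=idx)
    axis = int(np.argmax(hi - lo))         # split along the widest axis
    order = idx[np.argsort(centers[idx, axis])]
    mid = len(order) // 2
    return BVHNode(lo, hi,
                   build_bvh(centers, radii, order[:mid], leaf_size),
                   build_bvh(centers, radii, order[mid:], leaf_size))

def ray_aabb(o, d, lo, hi):
    """Slab test: does the ray o + t*d (t >= 0) hit the box [lo, hi]?"""
    inv = 1.0 / np.where(np.abs(d) < 1e-12, 1e-12, d)
    t0, t1 = (lo - o) * inv, (hi - o) * inv
    return np.maximum(t0, t1).min() >= max(np.minimum(t0, t1).max(), 0.0)

def transmittance(node, o, d, centers, radii, alphas, T=1.0):
    """Product of (1 - alpha) over Gaussians hit by the ray (d unit-length).

    A real implementation would evaluate each Gaussian's alpha at the
    ray's closest point; here we use the peak opacity for brevity.
    """
    if T < 1e-3 or not ray_aabb(o, d, node.lo, node.hi):
        return T                           # early out: opaque or missed
    if node.idx is not None:               # leaf: test bounding spheres
        for k in node.idx:
            oc = centers[k] - o
            t = float(oc @ d)
            # Squared perpendicular distance from ray to sphere center.
            if t > 0.0 and float(oc @ oc) - t * t < radii[k] ** 2:
                T *= 1.0 - alphas[k]
        return T
    T = transmittance(node.left, o, d, centers, radii, alphas, T)
    return transmittance(node.right, o, d, centers, radii, alphas, T)
```

Ambient occlusion at a surface point is then the mean transmittance over cosine-weighted hemisphere directions around its normal, which can be baked per point into the SH visibility used at render time.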



Geometry Enhancement


Image slider: Ours vs. 3DGS.
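Part of this geometry improvement comes from the depth-derived pseudo normal supervision mentioned in the method overview. Below is a minimal sketch of that derivation under our own assumptions (a pinhole camera with 3x3 intrinsics K; the function name is hypothetical):

```python
import torch
import torch.nn.functional as F

def pseudo_normal_from_depth(depth: torch.Tensor, K: torch.Tensor):
    """Derive a pseudo normal map from a rendered depth map.

    Back-projects each pixel into camera space, then crosses horizontal
    and vertical finite differences. `depth` is (H, W); `K` is a 3x3
    pinhole intrinsics matrix (an assumption of this sketch).
    """
    H, W = depth.shape
    fx, fy, cx, cy = K[0, 0], K[1, 1], K[0, 2], K[1, 2]
    v, u = torch.meshgrid(torch.arange(H, dtype=depth.dtype),
                          torch.arange(W, dtype=depth.dtype), indexing="ij")
    # Camera-space points: x = (u - cx) z / fx, y = (v - cy) z / fy.
    pts = torch.stack(((u - cx) * depth / fx,
                       (v - cy) * depth / fy,
                       depth), dim=-1)                 # (H, W, 3)
    dx = pts[:, 1:] - pts[:, :-1]                      # horizontal diffs
    dy = pts[1:, :] - pts[:-1, :]                      # vertical diffs
    n = torch.cross(dx[:-1], dy[:, :-1], dim=-1)       # (H-1, W-1, 3)
    return F.normalize(n, dim=-1)
```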


Multi-object Composition and Relighting


Panels: base color, normal, ambient occlusion, render, relighting, and rotating light.


Scene Composition and Relighting






Acknowledgements: The website template was borrowed from Lior Yariv. Image sliders are based on dics.