RaDe-GS: Rasterizing Depth in Gaussian Splatting

Baowen Zhang1
Chuan Fang1
Rakesh Shrestha2



We present a rasterized method to compute the depth and surface normal maps of general Gaussian splats. Our method achieves high-quality 3D shape reconstruction while maintaining excellent training and rendering efficiency.



Abstract

Gaussian Splatting (GS) has proven to be highly effective for novel view synthesis, achieving high-quality, real-time rendering. However, its potential for reconstructing detailed 3D shapes has not been fully explored. Existing methods often suffer from limited shape accuracy due to the discrete and unstructured nature of Gaussian splats, which complicates shape extraction. While recent techniques such as 2D GS have attempted to improve shape reconstruction, they often reformulate the Gaussian primitives in ways that reduce both rendering quality and computational efficiency. To address these problems, our work introduces a rasterized approach to render the depth maps and surface normal maps of general 3D Gaussian splats. Our method not only significantly enhances shape reconstruction accuracy but also maintains the computational efficiency intrinsic to Gaussian Splatting. It achieves a Chamfer distance error comparable to Neuralangelo on the DTU dataset and training and rendering times similar to traditional Gaussian Splatting on the Tanks & Temples dataset. Our method is a significant advancement in Gaussian Splatting and can be directly integrated into existing Gaussian Splatting-based methods.
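
To illustrate the idea at a high level, the sketch below shows how per-pixel depth and normal values could be accumulated with the standard front-to-back alpha blending used in Gaussian Splatting rasterizers. It is a minimal, hypothetical example: the per-splat quantities (alpha, depth, normal) and the function and field names are assumptions for illustration, not the exact rasterization derived in the paper.

import numpy as np

# Minimal sketch: front-to-back alpha compositing of per-splat depth and
# normal values for a single pixel. All field names are hypothetical.
def blend_depth_and_normal(splats):
    transmittance = 1.0           # remaining transmittance along the ray
    depth = 0.0                   # accumulated (expected) depth
    normal = np.zeros(3)          # accumulated normal
    for s in splats:              # splats assumed sorted front to back
        weight = s["alpha"] * transmittance
        depth += weight * s["depth"]
        normal += weight * s["normal"]
        transmittance *= 1.0 - s["alpha"]
        if transmittance < 1e-4:  # early termination, as in standard GS rasterizers
            break
    n = np.linalg.norm(normal)
    return depth, normal / n if n > 0 else normal

# Hypothetical usage: two splats covering the pixel
splats = [
    {"alpha": 0.6, "depth": 2.0, "normal": np.array([0.0, 0.0, 1.0])},
    {"alpha": 0.5, "depth": 2.5, "normal": np.array([0.0, 0.1, 0.995])},
]
print(blend_depth_and_normal(splats))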




Paper and Code

B. Zhang, C. Fang, R. Shrestha, Y. Liang, X. Long, P. Tan

RaDe-GS: Rasterizing Depth in Gaussian Splatting.

arXiv, 2024.

[Paper] [Code] [Bibtex]



Results

Visual comparison of our method with previous Gaussian-based methods on the DTU dataset.

Comparison of our method with previous GS-based methods on novel view synthesis.

Erratum

In the first version (v1) of the paper, we mistakenly ran the '2D GS' code on the Mip-NeRF 360 dataset and the Synthetic NeRF dataset, causing errors in Figures 1, 4, 8, and 10 and in Table 4. We have updated these results in the latest version.




Acknowledgements

The website is modified from this template.