Gaussian Opacity Fields: Efficient and Compact Surface Reconstruction in Unbounded Scenes


Zehao Yu1,2     Torsten Sattler3      Andreas Geiger1,2
1University of Tübingen     2Tübingen AI Center     3Czech Technical University in Prague

TL;DR: Gaussian Opacity Fields (GOF) enables geometry extraction directly from 3D Gaussians by identifying the level set of the opacity field. Our regularization improves surface reconstruction, and we utilize Marching Tetrahedra for compact and scene-adaptive mesh extraction.
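For intuition, here is a minimal NumPy sketch (our own illustration, not the released implementation) of the core ray-Gaussian computation: a 3D Gaussian is evaluated at its point of maximal contribution along a ray, and the normalized Gaussian gradient at that point serves as the intersection-plane normal (up to sign). Function and variable names are placeholders.

import numpy as np

def ray_gaussian_contribution(o, d, mu, Sigma_inv, opacity):
    """Maximal contribution of one 3D Gaussian along the ray x(t) = o + t*d.

    o, d      : ray origin and (unit) direction, shape (3,)
    mu        : Gaussian mean, shape (3,)
    Sigma_inv : inverse covariance, shape (3, 3)
    opacity   : per-Gaussian opacity in [0, 1]
    Returns (alpha, x_star, normal): the peak opacity along the ray, the point
    where it is attained, and the normalized Gaussian gradient there, which
    serves as the ray-Gaussian intersection-plane normal (up to sign).
    """
    delta = o - mu
    a = d @ Sigma_inv @ d           # quadratic coefficient (> 0 for a valid covariance)
    b = d @ Sigma_inv @ delta       # linear coefficient
    t_star = -b / a                 # ray depth of maximal contribution
    x_star = o + t_star * d
    diff = x_star - mu
    g = np.exp(-0.5 * diff @ Sigma_inv @ diff)   # peak Gaussian value along the ray
    alpha = opacity * g
    grad = Sigma_inv @ diff         # gradient direction of the Gaussian at x_star
    normal = grad / (np.linalg.norm(grad) + 1e-12)
    return alpha, x_star, normal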

Abstract

Recently, 3D Gaussian Splatting (3DGS) has demonstrated impressive novel view synthesis results, while allowing the rendering of high-resolution images in real-time. However, leveraging 3D Gaussians for surface reconstruction poses significant challenges due to the explicit and disconnected nature of 3D Gaussians. In this work, we present Gaussian Opacity Fields (GOF), a novel approach for efficient, high-quality, and compact surface reconstruction in unbounded scenes. Our GOF is derived from ray-tracing-based volume rendering of 3D Gaussians, enabling direct geometry extraction from 3D Gaussians by identifying its level set, without resorting to Poisson reconstruction or TSDF fusion as in previous work. We approximate the surface normal of Gaussians as the normal of the ray-Gaussian intersection plane, enabling the application of regularization that significantly enhances geometry. Furthermore, we develop an efficient geometry extraction method utilizing marching tetrahedra, where the tetrahedral grids are induced from 3D Gaussians and thus adapt to the scene's complexity. Our evaluations reveal that GOF surpasses existing 3DGS-based methods in surface reconstruction and novel view synthesis. Further, it compares favorably to, or even outperforms, neural implicit methods in both quality and speed.
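To illustrate the mesh-extraction step, the sketch below assumes a scalar opacity field that has already been sampled at the vertices of a tetrahedral grid, built here from the Gaussian centers via SciPy's Delaunay triangulation (the paper induces the grid from the 3D Gaussians; the exact construction differs), and extracts the 0.5 level set with a simple marching-tetrahedra pass. It uses linear interpolation along edges rather than the exact field evaluation and refinement described above; the function names, the threshold, and the triangle orientation handling are our own placeholders.

import numpy as np
from scipy.spatial import Delaunay

def interp(p0, p1, v0, v1, level):
    """Point where the field crosses `level` on the edge p0-p1 (linear model)."""
    t = (level - v0) / (v1 - v0)
    return p0 + t * (p1 - p0)

def extract_level_set(points, values, level=0.5):
    """Marching-tetrahedra-style extraction of the `level` iso-surface.

    points : (N, 3) vertex positions of the tetrahedral grid
             (here: Delaunay tetrahedralization of Gaussian centers).
    values : (N,) scalar opacity values at those vertices.
    Returns a list of triangles, each a (3, 3) array of vertex positions.
    Triangle winding is not made consistent in this sketch.
    """
    tets = Delaunay(points).simplices           # (M, 4) vertex indices per tetrahedron
    triangles = []
    for tet in tets:
        p, v = points[tet], values[tet]
        inside = v > level
        n_in = int(inside.sum())
        if n_in == 0 or n_in == 4:
            continue                            # tetrahedron fully outside / inside
        if n_in in (1, 3):
            # one vertex on its own side: one triangle from its three crossing edges
            odd = int(np.flatnonzero(inside == (n_in == 1))[0])
            others = [i for i in range(4) if i != odd]
            tri = [interp(p[odd], p[j], v[odd], v[j], level) for j in others]
            triangles.append(np.stack(tri))
        else:
            # two vs. two: the four crossing edges form a quad -> two triangles
            a = np.flatnonzero(inside)          # the two "inside" vertices
            b = np.flatnonzero(~inside)         # the two "outside" vertices
            q = [interp(p[a[0]], p[b[0]], v[a[0]], v[b[0]], level),
                 interp(p[a[0]], p[b[1]], v[a[0]], v[b[1]], level),
                 interp(p[a[1]], p[b[1]], v[a[1]], v[b[1]], level),
                 interp(p[a[1]], p[b[0]], v[a[1]], v[b[0]], level)]
            triangles.append(np.stack([q[0], q[1], q[2]]))
            triangles.append(np.stack([q[0], q[2], q[3]]))
    return triangles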


Reconstructions on the Mip-NeRF 360 Dataset

Reconstructions on the Tanks and Temples Dataset

Comparisons

Comparison with 2DGS

2DGS fails to reconstruct background geometry, while our method reconstructs more detailed geometry for both foreground objects and the background.

Comparison with SuGaR

Compared with SuGaR, our method reconstructs more detailed and smoother geometry for both foreground objects and the background.

BibTeX

@article{Yu2024GOF,
  author    = {Yu, Zehao and Sattler, Torsten and Geiger, Andreas},
  title     = {Gaussian Opacity Fields: Efficient High-quality Compact Surface Reconstruction in Unbounded Scenes},
  journal   = {arXiv preprint arXiv:2404.10772},
  year      = {2024},
}

Acknowledgements

We thank Christian Reiser for insightful discussions and valuable feedback throughout the project. We also thank Binbin Huang for proofreading. ZY and AG are supported by the ERC Starting Grant LEGO-3D (850533) and DFG EXC number 2064/1 - project number 390727645. TS is supported by a Czech Science Foundation (GACR) EXPRO grant (UNI-3D, grant no. 23-07973X).
