This is the code release for VQ-NeRF (see the project Home Page).
Our code is mainly built upon the following projects. We sincerely thank the authors:
Also, we would like to thank all the collaborators who have helped with this project.
Clone this repository and rename it:

```shell
git clone https://github.com/JiuTongBro/vqnerf_release.git
mv vqnerf_release vqnfr_pro_release
cd vqnfr_pro_release
```
Our method runs in two stages: the first stage reconstructs the geometry, and the second stage performs the decomposition and segmentation.
- The code for geometry reconstruction is under the `geo/` folder. It is an edited version of NeuS.
- The code for decomposition and segmentation is under the `decomp/` folder. It is modified from NeRFactor, and is the main part of our code.
Download the data from this link. Then place all the items under the `data/` folder.
There are five types of datasets:

- `nerf` dataset: a CG dataset containing five NeRF-Blender scenes. The GT images are stored in `data/nfr_blender`, the GT albedos and relit images in `data/vis_comps`, and the GT segmentation labels in `data/nerf_seg1`.
- `mat` dataset: a CG dataset collected by us, containing three scenes with GTs for all BRDF attributes. It is stored in `data/mat_blender`.
- `dtu` dataset: a real dataset containing three scenes collected from the NeuS-DTU dataset. It is stored in `data/dtu_split2`.
- `ours` dataset: a real dataset collected by us, containing three scenes. It is stored in `data/colmap_split`.
- `hw` dataset: a real dataset collected by us, containing four scenes. It is stored in `data/1115_hw_data/1115data_1`.
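After downloading, you can sanity-check the data layout with a short script. This is an optional sketch, not part of the release; the folder names are taken from the dataset list above:

```python
from pathlib import Path

# Expected dataset folders under data/, per the dataset list above
EXPECTED = [
    "nfr_blender", "vis_comps", "nerf_seg1",  # nerf dataset
    "mat_blender",                            # mat dataset
    "dtu_split2",                             # dtu dataset
    "colmap_split",                           # ours dataset
    "1115_hw_data/1115data_1",                # hw dataset
]

def missing_datasets(root="data"):
    """Return the expected dataset folders not found under `root`."""
    return [d for d in EXPECTED if not (Path(root) / d).is_dir()]

print(missing_datasets())
```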
The coordinate systems of the `nerf`, `mat` and `hw` datasets follow NeRF-Blender, while those of the `dtu` and `ours` datasets follow NeuS-DTU.
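The two conventions differ in the camera-frame axes: NeRF-Blender uses an OpenGL-style camera (y up, z backward), while NeuS-DTU/COLMAP-style data uses an OpenCV-style camera (y down, z forward). As a rough illustration (a hypothetical helper, not code from this repo), a camera-to-world pose can be converted between the two by flipping the y and z camera axes:

```python
import numpy as np

# Hypothetical helper (not from this repo): convert a 4x4 camera-to-world
# pose between the OpenGL-style (NeRF-Blender) and OpenCV-style
# (NeuS-DTU / COLMAP) camera conventions by flipping the y and z axes.
AXIS_FLIP = np.diag([1.0, -1.0, -1.0, 1.0])

def convert_convention(c2w: np.ndarray) -> np.ndarray:
    # The flip is an involution: applying it twice returns the original pose.
    return c2w @ AXIS_FLIP
```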
The `data/test_envs` folder stores a total of 16 environment maps for relighting. Eight of them are released by nvdiffrec, and the other eight are collected by us. For some dataset types we flipped these illumination maps, as the 'up' direction is reversed in their coordinates.
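Flipping the vertical direction of an environment map can be sketched as follows. This is an illustration, not the repo's actual preprocessing; it assumes the map is loaded as an H x W x 3 NumPy array:

```python
import numpy as np

def flip_envmap(envmap: np.ndarray) -> np.ndarray:
    """Reverse the vertical axis of an H x W x 3 environment map."""
    return envmap[::-1].copy()
```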
The pretrained weights can be found here. The `exp/` folder contains the weights for geometry reconstruction, and the `output/` folder contains the weights for decomposition and segmentation.
Go to the `geo/` folder, then prepare and activate the environment:
```shell
cd geo
conda create --prefix="./geo_env" python=3.6
conda activate ./geo_env
pip install -r NeuS-ours2/requirements.txt
# You may need to manually reinstall torch-1.8.0 to match your CUDA version
pip install tensorboard
# Pinned to make sure the correct version is installed
pip install pyhocon==0.3.57
```
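After installation, you can quickly verify that torch sees your GPU. This is an optional check, not part of the repo; run it inside `geo_env`:

```python
# Optional sanity check for geo_env: prints the torch version and
# whether CUDA is available. Safe to run even if torch is absent.
try:
    import torch
    info = (torch.__version__, torch.cuda.is_available())
except ImportError:  # torch not installed in this interpreter
    info = None
print(info)
```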
Go to the code folder and link the data from the project root:

```shell
cd NeuS-ours2
ln -s <project_root>/data ./
```
Then follow the instructions in geo/NeuS-ours2.
Go to the `decomp/` folder, then prepare and activate the environment:
```shell
cd decomp
conda create --prefix="./decomp_env" python=3.6
conda activate ./decomp_env
# May need some manual adjustments, e.g. the torch-CUDA correspondence
pip install -r nerfvq_nfr3/requirements.txt
pip install opencv-python==4.5.4.60
```
- In some cases, installing the `decomp/` environment may lead to a CUDA 12-related issue:

```
Could not load library libcublasLt.so.12. Error: libcublasLt.so.12: cannot open shared object file: No such file or directory
Aborted (core dumped)
```

This means some libraries are missing from your CUDA installation. To avoid messing up your CUDA environment, you can manually download and place them, following this link for a solution.
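You can check whether the library is visible to the dynamic loader with a small snippet. This is an assumption-laden sketch (the library name and your loader configuration may differ), not part of the linked solution:

```python
import ctypes

def can_load(name="libcublasLt.so.12"):
    """Return True if the given shared library can be loaded."""
    try:
        ctypes.CDLL(name)
        return True
    except OSError:
        return False

print(can_load())
```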
- We found that several aspects of the running environment (e.g., the TF-CUDA-cuDNN versions) can affect the floating-point error in tf-gpu, and the accumulated error may slightly influence the reproducibility of the experiments. To reproduce our results stably, you may download our pretrained weights.
(Optional) Check the environment:

```shell
cd nerfvq_nfr3
python check_env.py
cd ../
```
Then, link the data and the extracted geometry:

```shell
ln -s <project_root>/data ./
ln -s <project_root>/geo/NeuS-ours2/surf/* ./nerfvq_nfr3/output/
cd nerfvq_nfr3
```
Then follow the instructions in decomp/nerfvq_nfr3.
- If there is any problem, please open an issue. We will try to assist when we find time.
This website is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.