Dataset · Checkpoints · Website · Paper
Follow these instructions to install dependencies and create an environment.
```bash
git clone https://github.com/Dou-Yiming/TaRF.git
cd TaRF/nerfstudio_modules
python -m pip install -e .
```
Run `ns-train -h`: you should see a list of subcommands, with `tarf` included among them.
```bash
conda create --name ldm -y python=3.8
conda activate ldm
cd TaRF/img2touch
python -m pip install -e .
```
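To sanity-check the environment, try importing the core dependencies (a minimal check; it assumes the img2touch install pulls in torch and pytorch-lightning, which is typical for latent-diffusion codebases):

```bash
conda activate ldm
# Both imports should succeed; torch/pytorch-lightning as deps is an assumption.
python -c "import torch, pytorch_lightning; print(torch.__version__)"
```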
Download the COLMAP databases (including images and camera poses of 13 scenes) from this link, then extract them:
```bash
cd TaRF/nerfstudio_modules
mkdir data
cd data
# tar extracts one archive at a time, so loop over the downloaded files
for f in {dir of the downloaded colmap folder}/*.tar.gz; do
  tar -xzvf "$f"
done
```
Download the pretrained NeRF models from this link, then extract them:
```bash
cd TaRF/nerfstudio_modules
mkdir outputs
cd outputs
# As above, extract each downloaded archive in turn
for f in {dir of the downloaded NeRF folder}/*.tar.gz; do
  tar -xzvf "$f"
done
```
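To confirm that every scene extracted correctly, you can check that each config file referenced by the viewer exists (run from `TaRF/nerfstudio_modules`; the path format comes from the `ns-viewer` command used later in this README):

```bash
for scene in conference_room clothes office_1 stairs table_outdoor workroom \
             bench snow bus_stop office_2 chair_outdoor table_indoor lounge; do
  cfg="outputs/${scene}_colmap/tarf/test/config.yml"
  [ -f "$cfg" ] || echo "missing: $cfg"
done
```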
Download the pretrained Diffusion models from this link, then extract them:
```bash
tar -xzvf {dir of the downloaded Diffusion folder}/pretrained_models.tar.gz
mv pretrained_models TaRF/img2touch
mv TaRF/img2touch/pretrained_models/first_stage_model.ckpt \
   TaRF/img2touch/models/first_stage_models/kl-f8/model.ckpt
```
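A quick sanity check that both checkpoints ended up where the scripts expect them:

```bash
# Paths taken from the mv commands above.
ls -lh TaRF/img2touch/pretrained_models
ls -lh TaRF/img2touch/models/first_stage_models/kl-f8/model.ckpt
```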
First, pick a scene from {conference_room, clothes, office_1, stairs, table_outdoor, workroom, bench, snow, bus_stop, office_2, chair_outdoor, table_indoor, lounge}.
Next, launch the Nerfstudio viewer:
```bash
cd TaRF/nerfstudio_modules
ns-viewer --load-config ./outputs/{scene}_colmap/tarf/test/config.yml --vis viewer --viewer.max-num-display-images 64
```
This will provide you with a link to the Nerfstudio viewer.
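If the viewer is running on a remote machine, forward its port to your local browser first (this assumes Nerfstudio's default viewer port of 7007; use whatever port `ns-viewer` actually prints):

```bash
# Run on your local machine; replace user@remote-host with your server.
ssh -L 7007:localhost:7007 user@remote-host
```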
Now you may interact with the TaRF in the browser with the following steps:
- Click the "Touch on Scene" button at the bottom-right corner.
- Click the point you want to touch in the TaRF.
This saves the egocentric RGBD signal at the clicked point to `TaRF/nerfstudio_modules/outputs/touch_estimation_input_cache`.
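To inspect the most recent query from a terminal, list the newest file in the cache (a convenience one-liner, not part of the pipeline):

```bash
ls -t TaRF/nerfstudio_modules/outputs/touch_estimation_input_cache | head -n 1
```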
First, change the `bg_path` argument in `TaRF/img2touch/scripts/bash_scripts/run_touch_estimator_real_time.sh` to `touch_bg/{scene}_colmap_40_50/bg.jpg`.
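If you prefer to script the change, a sed one-liner can work; this sketch assumes the script assigns `bg_path` as a plain shell variable (`bg_path=...`), which you should confirm by opening the script first. `office_1` here is just an example scene:

```bash
cd TaRF/img2touch
# Hypothetical edit: rewrites a `bg_path=...` assignment if one exists.
sed -i 's|^bg_path=.*|bg_path=touch_bg/office_1_colmap_40_50/bg.jpg|' \
  scripts/bash_scripts/run_touch_estimator_real_time.sh
```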
Next, launch the real-time touch estimator on another GPU:
```bash
cd TaRF/img2touch
bash scripts/bash_scripts/run_touch_estimator_real_time.sh
```
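To pin the estimator to a particular GPU so it does not contend with the viewer, the standard CUDA environment variable should work, assuming the script does not hard-code a device itself:

```bash
# Example: run the estimator on GPU 1 while the viewer occupies GPU 0.
CUDA_VISIBLE_DEVICES=1 bash scripts/bash_scripts/run_touch_estimator_real_time.sh
```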
Tactile signals are now estimated in real time whenever a new point is clicked; the results are saved to `TaRF/img2touch/outputs/touch_estimator_real_time/best.png`.
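If you want a terminal notification whenever a new estimate lands, a simple polling loop over the output file works (a sketch using GNU coreutils):

```bash
out=TaRF/img2touch/outputs/touch_estimator_real_time/best.png
last=""
while true; do
  # stat -c %Y prints the modification time in seconds (GNU coreutils).
  cur=$(stat -c %Y "$out" 2>/dev/null)
  if [ -n "$cur" ] && [ "$cur" != "$last" ]; then
    echo "new tactile estimate written at $(date -d @"$cur")"
    last=$cur
  fi
  sleep 1
done
```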
```bash
cd TaRF/img2touch
bash scripts/bash_scripts/train_touch_estimator.sh
```
Coming soon!
If you find TaRF useful, please consider citing:
```bibtex
@inproceedings{dou2024tactile,
  title={Tactile-augmented radiance fields},
  author={Dou, Yiming and Yang, Fengyu and Liu, Yi and Loquercio, Antonio and Owens, Andrew},
  booktitle={Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition},
  pages={26529--26539},
  year={2024}
}
```