As part of the master's thesis, we build on the research of Seidel et al. [1] and extend the comparison to more modern neural-network architectures that can process 3D data or point clouds either directly or indirectly. As the network that directly processes 3D data, we chose a point-based neural network, PointNeXt. As the network that indirectly processes 3D data, we chose a conventional convolutional neural network, EfficientNetV2.
Because preparing the two selected neural networks was complicated and the field is still new, we decided to publish a simplified implementation together with the results of the analysis and the data preprocessing. In addition to the implementation, there is an open-access master's thesis in Slovenian, where we write in more detail about the neural networks, the tree-species data, the training of the networks, and the final findings. If our work was of help, we would be happy to receive a citation. 😊
In our implementation we use pytorch3d version 0.7.5, which requires Python 3.8 to 3.10. You can use a newer version of Python, but you will then need to build pytorch3d from source, which requires additional tools such as the CUDA Toolkit.
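A quick way to confirm the interpreter falls inside the supported range before installing anything is a small version check. This is just an illustrative helper (the function name is ours, not part of the repo):

```python
import sys

def pytorch3d_python_ok(version_info=sys.version_info):
    """Return True when the interpreter version is in the 3.8-3.10
    range that pytorch3d 0.7.5 supports without building from source."""
    return (3, 8) <= tuple(version_info[:2]) <= (3, 10)

print(pytorch3d_python_ok())  # True on Python 3.8-3.10, False otherwise
```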
First, make sure you have one of the latest GPU drivers installed; otherwise you may run into GPU-availability problems with the PyTorch library. Pay attention to the CUDA version supported by your installed driver: it must be equal to or greater than the CUDA version required by PyTorch. We strongly recommend running the given code on a GPU. You can check your driver and CUDA version with this command:
nvidia-smi
For PyTorch, follow the instructions at https://pytorch.org/get-started/locally/. If you plan to build the pytorch3d library for an unsupported version of Python, make sure you download the same CUDA Toolkit version that your installed PyTorch library supports. The CUDA Toolkit can be downloaded from https://developer.nvidia.com/cuda-toolkit-archive/.
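After installing PyTorch, you can verify that it sees the GPU with a short check. This is a hedged convenience sketch (the function name is ours); it also degrades gracefully when PyTorch is not installed yet:

```python
def torch_gpu_report():
    """Summarize the installed PyTorch build and GPU visibility."""
    try:
        import torch
    except ImportError:
        # PyTorch is not installed yet - follow the instructions above.
        return "PyTorch not installed"
    return (f"torch {torch.__version__}, CUDA build {torch.version.cuda}, "
            f"GPU available: {torch.cuda.is_available()}")

print(torch_gpu_report())
```

If `GPU available` prints `False` on a machine with an NVIDIA card, the driver's CUDA version is usually older than the one the PyTorch build expects.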
Before we can install pytorch3d, we need some additional libraries. The following installs everything our code requires:
pip install torchinfo scipy seaborn scikit-learn pandas matplotlib k3d ipywidgets watermark ipykernel pyproject.toml wheel
There are many ways to install pytorch3d; we went with the one that is supported on Windows. For other installation methods, see https://github.com/facebookresearch/pytorch3d?tab=readme-ov-file#installation.
If you are installing it under Windows, you will also need Git, which can be downloaded from https://git-scm.com/download/win, and the C++ Build Tools, available at https://visualstudio.microsoft.com/visual-cpp-build-tools/. In the C++ Build Tools installer, make sure to select and install "Desktop development with C++". You might need to reboot or close the current terminal for Git's path to be picked up.
After that, we can install pytorch3d from Git:
pip install "git+https://github.com/facebookresearch/pytorch3d.git@stable"
Building pytorch3d with the above command took around 8 minutes on a Ryzen 7.
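Once the build finishes, a small smoke test confirms that pytorch3d actually works, for example by running farthest-point sampling on a random cloud. This is an illustrative sketch (the helper name and parameters are ours), guarded so it simply reports failure when the install did not succeed:

```python
def fps_smoke_test(n_points=1024, k=256):
    """Run pytorch3d farthest-point sampling on a random point cloud.
    Returns the sampled tensor shape, or None when torch/pytorch3d
    cannot be imported (i.e. the installation failed)."""
    try:
        import torch
        from pytorch3d.ops import sample_farthest_points
    except ImportError:
        return None
    pts = torch.rand(1, n_points, 3)           # one cloud of random xyz points
    sub, _ = sample_farthest_points(pts, K=k)  # keep k well-spread points
    return tuple(sub.shape)

print(fps_smoke_test())  # (1, 256, 3) if the install succeeded, else None
```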
We would like to thank the authors of the neural networks used in our master's thesis, as well as the people who implemented them before us. Their work made our implementation easier where little information was available from official sources.
- PointNeXt: Revisiting PointNet++ with Improved Training and Scaling Strategies
- Unofficial implementations of PointNeXt:
- Unofficial implementations of EfficientNetV2:
We would also like to thank the authors of the publicly available datasets, since finding a suitable public dataset can be a challenge.
- Boni Vicari et al.: Leaf and wood classification framework for terrestrial LiDAR point clouds: Simulated data validation dataset.
- Owen et al.: Individual TLS tree clouds collected from both Alto Tajo and Cuellar in Spain.
- Seidel: Single tree point clouds from terrestrial laser scanning.
The dataset used can be accessed here or here. The download size is about 3.4 GiB, with a final size of up to 12 GiB.
The master's thesis and the full references can be accessed here.