An end-to-end pipeline for generating and visualizing high-resolution mesh animations using neural fitting. Includes Blender scripts for data preparation, a custom neural network for animation fitting, and tools for post-processing outputs into point cloud visualizations in Meshlab.

bkhalil3/DeepVertexAnimator

This project demonstrates a novel (but experimental) method for generating and interpolating 3D animation frames using a Feed-Forward Neural Network (FNN). The approach starts with a vertex-colored, rigged 3D animation created in Blender. Animation frames are exported as PLY files, converted to CSV training data, and used to train an MLP/FNN whose input is a frame index (0 to total_frames) and whose output is a vertex buffer, so that decimal in-between frames can be requested from the network. The trained network, with 97,034 parameters (379.04 KB), generates interpolated vertex data within a deviation of ~0.003 of the original training data.

In short, a Feed-Forward Neural Network that generates and interpolates your 3D animation frames for you!


Features

  • Blender Integration: Scripts for exporting rigged animations as PLY files.
  • Data Preprocessing: Converts PLY files into CSV format suitable for neural network training.
  • Neural Network Training: Trains an FNN to interpolate vertex data for high-resolution animations.
  • Frame Interpolation: Generates decimal frames to provide smooth transitions between animation keyframes.
  • Visualization Tools: Outputs point cloud data for visual inspection in Meshlab.

Folder Structure

  • girl_ply/: Contains exported animation frames in PLY format.
  • girl_data/: Stores processed CSV training data for the neural network.
  • models/: Houses trained models and prediction data.

Architecture

  • Neural Network Architecture: Feed-Forward Neural Network (FNN).
  • Input: Normalized time values (e.g., 0 to total_frames).
  • Output: Vertex buffer representing the interpolated animation frame.
  • Model Parameters: 97,034 (379.04 KB).
  • Deviation: ±0.003 from the original training data.
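The mapping described above (scalar frame time in, flat vertex buffer out) can be sketched as a small forward pass. The layer widths and vertex count below are illustrative assumptions, not taken from the repository; the random weights stand in for a trained model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes -- the real network's hidden width and vertex
# count are assumptions, not read from this repository.
N_VERTS = 100            # number of mesh vertices (hypothetical)
HIDDEN = 64              # hidden-layer width (hypothetical)
OUT = N_VERTS * 3        # flat vertex buffer: x, y, z per vertex

# Randomly initialized weights stand in for a trained model.
W1, b1 = rng.normal(size=(1, HIDDEN)), np.zeros(HIDDEN)
W2, b2 = rng.normal(size=(HIDDEN, OUT)), np.zeros(OUT)

def predict(t, total_frames):
    """Map a (possibly fractional) frame index to a vertex buffer."""
    x = np.array([[t / total_frames]])        # normalize time
    h = np.tanh(x @ W1 + b1)                  # hidden activation
    return (h @ W2 + b2).reshape(N_VERTS, 3)  # per-vertex (x, y, z)

frame = predict(12.5, total_frames=60.0)      # a decimal in-between frame
print(frame.shape)
```

Because the input is continuous, nothing restricts `t` to integer keyframes; that is the whole interpolation trick.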

Tips and Utilities

  • Point Cloud Visualization: Drag multiple .asc files into Meshlab to observe motion between frames.
  • Reset Workflow: Run reset.sh to delete all generated data and restart the pipeline.
  • Optimization: For performance gains, consider compiling the neural network inference code for CPU with FMA auto-vectorization enabled (-mfma).

Steps

1. Export PLY Animation Frames

  1. Open girl_rig_exporter.blend.
  2. Run the script export_frames.
  3. The folder girl_ply will be created, containing each 3D animation frame in PLY format.

2. Convert PLY to CSV

  1. Open scripts.blend.
  2. Run the script ply_to_csv.
  3. The folder girl_data will be created, containing CSV files for training.
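Outside Blender, the conversion that ply_to_csv performs can be approximated in plain Python. The sketch below assumes ASCII PLY input and a one-row-per-vertex x,y,z CSV layout, which is a guess at the training format rather than the actual script:

```python
import csv

def ply_to_csv(ply_text, csv_path):
    """Extract vertex positions from an ASCII PLY and write them as CSV.

    Assumes 'element vertex N' appears in the header and that x, y, z
    are the first three vertex properties -- a simplification of
    whatever ply_to_csv actually does inside Blender.
    """
    lines = ply_text.splitlines()
    n_verts, body_start = 0, 0
    for i, line in enumerate(lines):
        if line.startswith("element vertex"):
            n_verts = int(line.split()[-1])
        if line.strip() == "end_header":
            body_start = i + 1
            break
    with open(csv_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["x", "y", "z"])
        for line in lines[body_start:body_start + n_verts]:
            x, y, z = line.split()[:3]
            writer.writerow([x, y, z])

# Tiny inline example with two vertices:
sample = """ply
format ascii 1.0
element vertex 2
property float x
property float y
property float z
end_header
0.0 1.0 2.0
3.0 4.0 5.0
"""
ply_to_csv(sample, "frame_0001.csv")
```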

3. Train the Neural Network

  1. Run the following; the data in girl_data will be used to train a network, which will be saved to the models directory.
    python3 fit.py
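The essence of fit.py is a regression from normalized frame time to vertex positions. The minimal sketch below uses synthetic data, one hidden layer, and plain gradient descent; every specific choice here (layer width, learning rate, epoch count) is a guess at the general shape of the training loop, not the repository's actual code:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for girl_data: 20 frames of a 10-vertex mesh
# oscillating over time (the real data comes from the exported CSVs).
TOTAL_FRAMES, N_VERTS = 20, 10
t = np.linspace(0.0, 1.0, TOTAL_FRAMES).reshape(-1, 1)   # normalized time
Y = np.sin(2 * np.pi * t + np.arange(N_VERTS * 3))       # target vertex buffers

# One hidden tanh layer; widths are illustrative, not from fit.py.
HIDDEN = 32
W1 = rng.normal(scale=0.5, size=(1, HIDDEN)); b1 = np.zeros(HIDDEN)
W2 = rng.normal(scale=0.5, size=(HIDDEN, N_VERTS * 3)); b2 = np.zeros(N_VERTS * 3)

def loss():
    """Mean squared error of the current weights over all frames."""
    return float((((np.tanh(t @ W1 + b1) @ W2 + b2) - Y) ** 2).mean())

initial = loss()
lr = 0.05
for epoch in range(2000):
    h = np.tanh(t @ W1 + b1)                  # forward pass
    err = (h @ W2 + b2) - Y                   # residual for MSE
    dW2 = h.T @ err / len(t); db2 = err.mean(0)
    dh = (err @ W2.T) * (1 - h ** 2)          # backprop through tanh
    dW1 = t.T @ dh / len(t); db1 = dh.mean(0)
    W2 -= lr * dW2; b2 -= lr * db2            # gradient-descent update
    W1 -= lr * dW1; b1 -= lr * db1

print(f"MSE: {initial:.4f} -> {loss():.4f}")
```

After training, querying the network at fractional times yields the interpolated frames described above.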

4. Generate and Visualize Interpolated Frames

  1. Navigate to the trained network output directory *_pd.
  2. Run the script CSVtoASC.sh inside the *_pd directory.
    ./CSVtoASC.sh
  3. The ASC directory will now contain interpolated point cloud files (in .asc format) for every predicted frame.
  4. Load these into Meshlab to visualize the point clouds.
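CSVtoASC.sh is not reproduced here, but the conversion it performs is simple to sketch: Meshlab's .asc point-cloud format is one whitespace-separated "x y z" triple per line. The version below assumes the predicted CSVs carry x,y,z columns under a header row, which is an assumption about the prediction output format:

```python
import csv

def csv_to_asc(csv_path, asc_path):
    """Rewrite a predicted-frame CSV as a Meshlab-readable .asc point cloud."""
    with open(csv_path, newline="") as src, open(asc_path, "w") as dst:
        reader = csv.reader(src)
        next(reader)                      # skip the header row (assumed)
        for row in reader:
            x, y, z = row[:3]
            dst.write(f"{x} {y} {z}\n")   # one point per line

# Build a tiny example CSV, then convert it:
with open("pred_frame.csv", "w", newline="") as f:
    csv.writer(f).writerows([["x", "y", "z"], [0.1, 0.2, 0.3], [0.4, 0.5, 0.6]])
csv_to_asc("pred_frame.csv", "pred_frame.asc")
```

Dragging several such .asc files into Meshlab at once shows the motion between frames, as noted in Tips and Utilities.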
