A domain-specific ML library, generally used for training NNUE-style networks for some of the strongest chess engines in the world.
- `bullet_core`
    - An ML framework that is generic over backends:
        - A network graph is constructed using `GraphBuilder`
        - This is then lowered to `GraphIR`, and optimisation passes are performed on it
        - The `GraphIR` is then compiled into a `Graph<D: Device>` for a specific backend device
        - Forward and backward passes, editing weights/inputs, etc. may then be performed on this graph
    - A small set of (composable) optimisers is included that ingest a graph and provide update methods for it
    - A token single-threaded CPU backend is included for verifying the correctness of the crate and of other backend implementations
- `bullet_hip_backend`
    - Currently contains both the HIP (for AMD GPUs) and CUDA backends. Enable the `hip` feature to use the HIP backend.
- `bullet_lib`
    - Provides a high-level wrapper around the above crates, specifically for easily training networks for chess (and other games, e.g. Ataxx).
- `bullet-utils`
    - Various utilities, mostly for handling training data
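The backend-generic `Graph<D: Device>` shape described under `bullet_core` can be illustrated with a minimal standalone sketch. All trait and method names here besides `Device` and `Graph` are invented for illustration; this is not bullet's actual API:

```rust
// Standalone sketch of a graph that is generic over its backend device,
// mirroring the Graph<D: Device> design described above.
// All names and method signatures are illustrative only, not bullet's real API.
trait Device {
    fn name(&self) -> &'static str;
}

// A trivial CPU backend, standing in for bullet's token CPU backend.
struct Cpu;
impl Device for Cpu {
    fn name(&self) -> &'static str { "cpu" }
}

// A compiled graph is parameterised over the device it runs on.
struct Graph<D: Device> {
    device: D,
    weights: Vec<f32>,
}

impl<D: Device> Graph<D> {
    fn new(device: D, weights: Vec<f32>) -> Self {
        Self { device, weights }
    }

    // Stand-in for a forward pass: a dot product of inputs with the weights.
    fn forward(&self, inputs: &[f32]) -> f32 {
        self.weights.iter().zip(inputs).map(|(w, x)| w * x).sum()
    }
}

fn main() {
    let graph = Graph::new(Cpu, vec![0.5, -1.0, 2.0]);
    let out = graph.forward(&[1.0, 2.0, 3.0]);
    println!("{} forward -> {}", graph.device.name(), out);
}
```

Because the graph is generic over `Device`, the same training code can run against any backend (CPU, HIP, CUDA) that implements the device trait.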
Before attempting to use bullet, check out the docs. They contain all the main information about building bullet, managing training data, and the network output format.
Most people simply clone the repo and edit one of the examples to their taste.
If you want to create your own example file to ease pulling from upstream, you need to add the example to `bullet_lib`'s `Cargo.toml`.
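Such an entry is a standard Cargo example target; a sketch might look like the following, where the name and path are placeholders for your own file:

```toml
# Hypothetical addition to bullet_lib's Cargo.toml.
# "my_network" and the path are placeholders for your own example file.
[[example]]
name = "my_network"
path = "examples/my_network.rs"
```

With the entry in place, the example can be run with `cargo r -r --example my_network`.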
Alternatively, import the `bullet_lib` crate with:

```toml
bullet = { git = "https://github.com/jw1912/bullet", package = "bullet_lib" }
```
Specific API documentation is covered by Rust's docstrings.
- Please open an issue to file any bug reports/feature requests.
- Feel free to use the dedicated `#bullet` channel in the Engine Programming discord server if you run into any issues.
- For general training discussion, the Engine Programming non-`#bullet` channels are appropriate, or `#engines-dev` in the Stockfish discord.