llama
Here are 18 public repositories matching this topic...
Inference Llama 3.2 1B/3B base/instruct models in 1 file of pure C
Updated Dec 6, 2024 - C
Fast LLaMA inference on CPU using llama.cpp for Python
Updated Mar 23, 2023 - C
V-lang API wrapper for llm-inference chatllm.cpp
Updated Nov 20, 2024 - C
llama.cpp Desktop Client Demo
Updated Apr 22, 2023 - C
iBuild is a desktop app that uses local AI models to generate Minecraft block data from text prompts and runs entirely locally, with no external APIs needed. It combines Python, which handles the UI and model integration, with a custom C library that manages Minecraft region file updates.
Updated Feb 14, 2025 - C
Nim API wrapper for llm-inference chatllm.cpp
Updated Nov 20, 2024 - C
C++ implementation of Meta's LLaMA v2 engine, credited to ggerganov/llama.cpp
Updated Oct 11, 2023 - C
Kotlin API wrapper for llm-inference chatllm.cpp
Updated Nov 26, 2024 - C