LLM Context is a tool that helps developers quickly inject relevant content from code/text projects into Large Language Model chat interfaces. It leverages `.gitignore` patterns for smart file selection and provides both a streamlined command-line clipboard workflow and direct LLM integration through the Model Context Protocol (MCP).
Note: This project was developed in collaboration with Claude-3.5-Sonnet, using LLM Context itself to share code during development. All code in the repository is human-curated (by me 😇, @restlessronin).
For an in-depth exploration of the reasoning behind LLM Context and its approach to AI-assisted development, check out our article: LLM Context: Harnessing Vanilla AI Chats for Development
- Direct LLM Integration: Native integration with Claude Desktop via the MCP protocol
- Chat Interface Support: Works with any LLM chat interface via CLI/clipboard
  - Optimized for interfaces with persistent context like Claude Projects and Custom GPTs
  - Works equally well with standard chat interfaces
- Project Types: Suitable for code repositories and collections of text/markdown/html documents
- Project Size: Optimized for projects that fit within an LLM's context window. Large project support is in development
Install LLM Context using uv:

```sh
uv tool install llm-context
```

To upgrade to the latest version:

```sh
uv tool upgrade llm-context
```
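To confirm the installation (optional), uv can list the tools it manages; `llm-context` should appear in the output once the install succeeds:

```sh
# Show tools installed via `uv tool install`
uv tool list
```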
Warning: LLM Context is under active development. Updates may overwrite configuration files prefixed with `lc-`. We recommend backing up any customized files before updating.
Add to `claude_desktop_config.json`:

```json
{
  "mcpServers": {
    "CyberChitta": {
      "command": "uvx",
      "args": ["--from", "llm-context", "lc-mcp"]
    }
  }
}
```
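If you want to sanity-check the entry before restarting Claude Desktop, you can run the same command the config specifies. This is optional; since the server speaks MCP over stdio, it will simply wait for requests until you interrupt it:

```sh
# Launch the MCP server exactly as Claude Desktop would (Ctrl+C to stop)
uvx --from llm-context lc-mcp
```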
Once configured, you can start working with your project in two simple ways:
- Say: "I would like to work with my project" and Claude will ask you for the project root path.
- Or directly specify: "I would like to work with my project /path/to/your/project" and Claude will automatically load the project context.
- Navigate to your project's root directory
- Initialize repository: `lc-init` (only needed once)
- (Optional) Edit `.llm-context/config.toml` to customize ignore patterns
- Select files: `lc-sel-files`
- (Optional) Review selected files in `.llm-context/curr_ctx.toml`
- Generate context: `lc-context`
- Use with your preferred interface:
  - Project Knowledge (Claude Pro): Paste into knowledge section
  - GPT Knowledge (Custom GPTs): Paste into knowledge section
  - Regular chats: Use `lc-set-profile code-prompt` first to include instructions
- When the LLM requests additional files:
  - Copy the file list from the LLM
  - Run `lc-read-cliplist`
  - Paste the contents back to the LLM
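Put together, a typical clipboard session looks roughly like the sketch below. The project path is a placeholder and `code-prompt` is the profile named above; adjust both to your setup:

```sh
cd /path/to/your/project   # project root
lc-init                    # one-time setup: creates the .llm-context/ configuration
lc-sel-files               # select files using .gitignore-aware rules
lc-context                 # generate the context and copy it to the clipboard
# ...paste into Project Knowledge, a Custom GPT, or a regular chat...

# For regular chats, switch profiles first so instructions are included:
lc-set-profile code-prompt
lc-context

# When the LLM asks for more files: copy its file list to the clipboard, then
lc-read-cliplist           # prepares the requested file contents for pasting back
```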
- `lc-init`: Initialize project configuration
- `lc-set-profile <name>`: Switch profiles
- `lc-sel-files`: Select files for inclusion
- `lc-context`: Generate and copy context
- `lc-read-cliplist`: Process LLM file requests
LLM Context provides advanced features for customizing how project content is captured and presented:
- Smart file selection using `.gitignore` patterns
- Multiple profiles for different use cases
- Code outline generation for supported languages
- Customizable templates and prompts
See our User Guide for detailed documentation of these features.
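Because file selection honors `.gitignore` patterns, excluding material from the generated context can be as simple as adding an ordinary pattern and re-running selection. A minimal sketch, assuming the patterns are re-read each time selection runs:

```sh
# Exclude build output with a standard .gitignore pattern, then refresh the
# selection recorded in .llm-context/curr_ctx.toml
echo "dist/" >> .gitignore
lc-sel-files
```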
Check out our comprehensive list of alternatives - the sheer number of tools tackling this problem demonstrates its importance to the developer community.
LLM Context evolves from a lineage of AI-assisted development tools:
- This project succeeds LLM Code Highlighter, a TypeScript library I developed for IDE integration.
- The concept originated from my work on RubberDuck and continued with later contributions to Continue.
- LLM Code Highlighter was heavily inspired by Aider Chat. I worked with GPT-4 to translate several Aider Chat Python modules into TypeScript, maintaining functionality while restructuring the code.
- This project uses tree-sitter tag query files from Aider Chat.
- LLM Context exemplifies the power of AI-assisted development, transitioning from Python to TypeScript and back to Python with the help of GPT-4 and Claude-3.5-Sonnet.
I am grateful for the open-source community's innovations and for the help of Claude-3.5-Sonnet, both of which have shaped this project's evolution.
This project is licensed under the Apache License, Version 2.0. See the LICENSE file for details.