Traves-Theberge/LLM-CLI-TOOLKIT

A powerfully elegant CLI for unified access to multiple AI providers

Features • Installation • Usage • Configuration • Contributing

🎯 Overview

The LLM CLI Toolkit (OFCA) is a command-line interface that provides unified access to multiple AI providers through a single, elegant interface. Built with modern JavaScript and featuring a powerline-style terminal UI, it makes interacting with various AI models as simple as a single command.

✨ Features

🤖 Supported Providers

  • OpenAI - GPT-4, GPT-3.5-Turbo
  • Anthropic - Claude 3 Opus, Sonnet, Haiku
  • OpenRouter - Multiple model aggregation
  • Groq - Ultra-fast inference
  • Grok - X's AI powerhouse
  • Mistral - Advanced open models

🎨 UI Features

  • Powerline Style - Modern terminal aesthetics
  • Provider Themes - Unique gradients per provider
  • Smart Icons - Nerd Font integration
  • Responsive Design - Adapts to terminal size

🛠 Technical Features

  • Async Architecture - Non-blocking operations
  • Error Resilience - Comprehensive error handling
  • Demo Mode - Try without API keys
  • Smart Caching - Optimized responses
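As a minimal sketch of how the async, error-resilient architecture above might look, here is a retry wrapper for a non-blocking provider call. The `withRetries` helper and the flaky call are illustrative assumptions, not the toolkit's actual API:

```javascript
// Hypothetical sketch: a non-blocking call with retry-based error
// handling, in the spirit of the toolkit's async architecture.
async function withRetries(fn, attempts = 3, delayMs = 100) {
  let lastError;
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      // Back off a little longer before each subsequent attempt.
      await new Promise((resolve) => setTimeout(resolve, delayMs * (i + 1)));
    }
  }
  throw lastError;
}

// Example: a flaky call that succeeds on the second attempt.
let calls = 0;
async function flakyCall() {
  calls += 1;
  if (calls < 2) throw new Error("transient network error");
  return "ok";
}
```

A real provider call would replace `flakyCall`; the point is that transient failures never block the event loop and are retried with backoff.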

📦 Installation

# Clone the repository
git clone https://github.com/traves-theberge/OFCA.git

# Install dependencies
cd OFCA
npm install

# Set up environment variables
cp .env.example .env

🚀 Usage

# Start in normal mode
npm start

# Start in demo mode
npm run demo

⚙️ Configuration

Create a .env file with your API keys:

OPENAI_API_KEY=sk-...
ANTHROPIC_API_KEY=sk-ant-...
OPENROUTER_API_KEY=sk-or-...
GROQ_API_KEY=gsk-...
GROK_API_KEY=grok-...
MISTRAL_API_KEY=...
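A quick way to see which providers are usable is to check which of the keys above are actually set. This validation helper is an illustrative sketch (the variable names match the `.env` example; the function itself is an assumption, not the toolkit's `check-env` script):

```javascript
// Illustrative sketch: report which provider keys from the .env
// example are missing or empty. `requiredKeys` mirrors the list above.
const requiredKeys = [
  "OPENAI_API_KEY",
  "ANTHROPIC_API_KEY",
  "OPENROUTER_API_KEY",
  "GROQ_API_KEY",
  "GROK_API_KEY",
  "MISTRAL_API_KEY",
];

function missingKeys(env = process.env) {
  // A provider is usable only if its key is set and non-empty.
  return requiredKeys.filter((k) => !env[k] || env[k].trim() === "");
}
```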

🎮 Interactive Commands

During chat sessions, you can use these special commands:

Command   Description
/switch   Change provider or model
/model    List available models
/help     Show command list
/clear    Clear chat history
/exit     End session
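The commands above could be routed with a simple dispatcher like the sketch below. The handler names and return values are purely illustrative assumptions:

```javascript
// Hypothetical sketch of routing the slash commands listed above.
const commands = {
  "/switch": () => "switching provider",
  "/model": () => "listing models",
  "/help": () => "showing help",
  "/clear": () => "clearing history",
  "/exit": () => "ending session",
};

function dispatch(input) {
  const [cmd] = input.trim().split(/\s+/);
  const handler = commands[cmd];
  // Anything that is not a known command is treated as a chat message.
  return handler ? handler() : null;
}
```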

🎨 Themes and Customization

OFCA features unique themes for each provider:

Provider     Theme Colors      Icon
OpenAI       Green gradient    🤖
Anthropic    Purple gradient   🌟
OpenRouter   Orange gradient   🌐
Groq         Blue gradient
Grok         Red gradient      🤘
Mistral      Indigo gradient   🌪️

🔧 Advanced Configuration

Custom Model Settings

{
  "temperature": 0.7,
  "maxTokens": 2000,
  "topP": 0.9,
  "frequencyPenalty": 0.0,
  "presencePenalty": 0.0
}
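A sketch of how per-request overrides might be merged into the default settings above. The field names follow the JSON example; the merge helper itself is an assumption, not the toolkit's actual API:

```javascript
// Defaults mirror the custom model settings shown above.
const defaultSettings = {
  temperature: 0.7,
  maxTokens: 2000,
  topP: 0.9,
  frequencyPenalty: 0.0,
  presencePenalty: 0.0,
};

function resolveSettings(overrides = {}) {
  // Spread order means overrides take precedence over defaults.
  return { ...defaultSettings, ...overrides };
}
```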

Rate Limiting

RATE_LIMIT_REQUESTS=60
RATE_LIMIT_WINDOW=60000
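The two variables above map naturally onto a sliding-window limiter: at most `RATE_LIMIT_REQUESTS` calls per `RATE_LIMIT_WINDOW` milliseconds. The class below is a minimal sketch of that idea, not the toolkit's implementation:

```javascript
// Sketch of a sliding-window rate limiter driven by the two
// environment variables above (requests per window, window in ms).
class RateLimiter {
  constructor(maxRequests = 60, windowMs = 60000) {
    this.maxRequests = maxRequests;
    this.windowMs = windowMs;
    this.timestamps = [];
  }

  // Returns true if a request is allowed right now, recording it if so.
  allow(now = Date.now()) {
    // Drop timestamps that have aged out of the window.
    this.timestamps = this.timestamps.filter((t) => now - t < this.windowMs);
    if (this.timestamps.length >= this.maxRequests) return false;
    this.timestamps.push(now);
    return true;
  }
}
```

In practice the constructor arguments would come from `process.env.RATE_LIMIT_REQUESTS` and `process.env.RATE_LIMIT_WINDOW`.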

🔍 Troubleshooting

Common Issues

API Key Problems

# Verify API keys
npm run check-env

# Test specific provider
npm run test-provider openai

Connection Issues

  • Check internet connection
  • Verify API endpoint status
  • Ensure proper proxy configuration

Model Availability

  • Confirm model access level
  • Check provider status page
  • Try fallback models
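The "try fallback models" advice above can be sketched as a loop that attempts each model in order until one responds. `tryModel` is a stand-in for a real provider call; the function shape is an assumption:

```javascript
// Illustrative fallback loop: try each model in order, keeping the
// error messages so a total failure is easy to diagnose.
async function askWithFallback(tryModel, models, prompt) {
  const errors = [];
  for (const model of models) {
    try {
      return { model, reply: await tryModel(model, prompt) };
    } catch (err) {
      errors.push(`${model}: ${err.message}`);
    }
  }
  throw new Error(`All models failed: ${errors.join("; ")}`);
}
```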

📊 Performance Optimization

Caching Strategy

  • Response caching
  • Model preloading
  • Connection pooling
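Response caching, the first item above, can be sketched as a lookup keyed by provider, model, and prompt. A plain `Map` with no eviction keeps the example short; a real cache would bound its size. All names here are illustrative:

```javascript
// Sketch of a response cache keyed by provider, model, and prompt.
const responseCache = new Map();

async function cachedAsk(ask, provider, model, prompt) {
  const key = `${provider}:${model}:${prompt}`;
  if (responseCache.has(key)) return responseCache.get(key);
  const reply = await ask(provider, model, prompt);
  responseCache.set(key, reply);
  return reply;
}
```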

Memory Management

  • Automatic garbage collection
  • Stream processing
  • Buffer optimization

🔒 Security Best Practices

  1. API Key Management

    • Use environment variables
    • Rotate keys regularly
    • Never commit keys to repo
  2. Data Protection

    • Local message encryption
    • Secure storage
    • Privacy controls
  3. Access Control

    • Rate limiting
    • IP restrictions
    • Usage monitoring

📚 Documentation

Detailed documentation is available in the Wiki.

🤝 Contributing

Development Setup

  1. Fork and Clone

    git clone https://github.com/your-username/OFCA.git
  2. Install Dependencies

    npm install
  3. Create Branch

    git checkout -b feature/your-feature

Coding Standards

  • ESLint configuration
  • Prettier formatting
  • JSDoc comments
  • Unit test coverage

Pull Request Process

  1. Update documentation
  2. Add unit tests
  3. Follow commit conventions
  4. Request review

👥 Contributors

Traves Theberge

Project Lead

🙏 Acknowledgments

Special thanks to Echo Hive for their invaluable support and contributions to this project.

📄 License

This project is licensed under the MIT License - see the LICENSE file for details.
