Welcome to the LLM Playground repository! 🎉
I've documented my step-by-step experiments with LLMs, exploring various frameworks and platforms through practical implementations.
- This repository provides a hands-on collection of practical LLM applications, including chatbots, AI agents, SQL and RAG implementations.
- You'll find examples of model fine-tuning, vector databases, and various LLM integrations using tools like LangChain, CrewAI, OpenAI, and more.
- The projects range from basic chatbots to advanced applications using AWS, NVIDIA, and custom agents for specific tasks.
- Practical implementations of LLMs and chatbots
- Integration examples with popular services:
- OpenAI & ChatGPT
- Groq (Gemma Models)
- HuggingFace Models
- Codellama & Gradio
- CrewAI Agents
- NVIDIA NIM
- AWS Bedrock & SageMaker Integration
- Advanced features like:
- Vector embeddings & FAISS
- RAG (Retrieval Augmented Generation); see the minimal sketch after this list
- Hybrid Search
- Database integrations
- LCEL (LangChain Expression Language)
- LangGraph implementations
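To give a concrete flavour of the RAG, FAISS, and LCEL pieces listed above, here is a minimal retrieval chain sketch. It assumes the `langchain`, `langchain-community`, `langchain-openai`, and `faiss-cpu` packages plus an `OPENAI_API_KEY` in your environment; the toy documents, prompt, and model name are illustrative placeholders rather than code taken from a specific project.

```python
# Minimal RAG sketch with FAISS + LCEL (assumes langchain, langchain-community,
# langchain-openai, and faiss-cpu are installed, and OPENAI_API_KEY is set).
from langchain_community.vectorstores import FAISS
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.runnables import RunnablePassthrough
from langchain_openai import ChatOpenAI, OpenAIEmbeddings

# Toy corpus standing in for your own documents.
texts = [
    "FAISS is a library for efficient similarity search over dense vectors.",
    "LCEL composes prompts, models, and retrievers with the | operator.",
]
retriever = FAISS.from_texts(texts, OpenAIEmbeddings()).as_retriever(search_kwargs={"k": 2})

prompt = ChatPromptTemplate.from_template(
    "Answer using only this context:\n{context}\n\nQuestion: {question}"
)

def format_docs(docs):
    # Join the retrieved Documents into a single context string.
    return "\n\n".join(doc.page_content for doc in docs)

# LCEL pipeline: retrieve -> format -> prompt -> model -> plain string.
rag_chain = (
    {"context": retriever | format_docs, "question": RunnablePassthrough()}
    | prompt
    | ChatOpenAI(model="gpt-4o-mini")  # any chat model can be swapped in here
    | StrOutputParser()
)

print(rag_chain.invoke("What does FAISS do?"))
```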
- Basic to advanced chatbot implementations
- Text summarization and Q&A systems
- Integration with various databases:
- SQL & SQLite
- Vector DBs (ChromaDB, FAISS, GraphDB, Pinecone)
- AstraDB
- Mathematical computing with LLMs
- Hybrid search implementations (see the hybrid search sketch after this list)
- AWS and Cloud integrations
- Graph-based LLM applications
- Agent-based systems:
- Search Engine Agents
- RAG Paper QA
- Multi-agent conversations
- Advanced optimization:
- Fine-tuning methods
- Performance optimization
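Since hybrid search appears in several of these projects, the sketch below illustrates the general pattern with LangChain's `EnsembleRetriever` combining a BM25 (keyword) retriever and a FAISS (vector) retriever. The extra `rank_bm25` dependency, the toy corpus, and the 50/50 weighting are assumptions made for illustration, not the exact setup used in the notebooks.

```python
# Hybrid search sketch: keyword (BM25) + semantic (FAISS) retrieval merged by an
# ensemble. Assumes langchain, langchain-community, langchain-openai, faiss-cpu,
# and rank_bm25 are installed, with OPENAI_API_KEY set for the embeddings.
from langchain.retrievers import EnsembleRetriever
from langchain_community.retrievers import BM25Retriever
from langchain_community.vectorstores import FAISS
from langchain_openai import OpenAIEmbeddings

texts = [
    "Hybrid search mixes keyword matching with vector similarity.",
    "BM25 scores documents by term frequency and inverse document frequency.",
    "Dense embeddings capture semantic similarity beyond exact keywords.",
]

sparse = BM25Retriever.from_texts(texts)  # lexical retriever
sparse.k = 2
dense = FAISS.from_texts(texts, OpenAIEmbeddings()).as_retriever(search_kwargs={"k": 2})

# Weighted rank fusion of the two result lists, weighted equally here.
hybrid = EnsembleRetriever(retrievers=[sparse, dense], weights=[0.5, 0.5])

for doc in hybrid.invoke("semantic keyword search"):
    print(doc.page_content)
```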
Each project folder typically contains:
- app.py: Streamlit/Gradio application file (a minimal Streamlit sketch follows this list)
- notebook.ipynb: Detailed Jupyter notebook with explanations
- UI.png: Application screenshot/demo image
- requirements.txt: Project-specific dependencies
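For orientation, here is a minimal sketch of what an app.py looks like in spirit: a Streamlit chat UI wrapping a Groq-hosted model. The model name and the `langchain-groq` dependency are illustrative assumptions and will differ from project to project.

```python
# Minimal Streamlit chat sketch (assumes streamlit and langchain-groq are
# installed, and GROQ_API_KEY is available in the environment / .env file).
import streamlit as st
from langchain_groq import ChatGroq

st.title("LLM Playground Demo")

llm = ChatGroq(model="gemma2-9b-it")  # illustrative Groq-hosted Gemma model

question = st.chat_input("Ask me anything")
if question:
    with st.chat_message("user"):
        st.write(question)
    with st.chat_message("assistant"):
        st.write(llm.invoke(question).content)
```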
To explore these projects locally:

```bash
git clone https://github.com/Duygu-Jones/LLM_Playground.git
cd LLM_Playground
```
- **Environment Setup**
  - Create a `.env` file in the project root
  - Add your API keys (a minimal loading sketch follows these steps):

    ```
    GROQ_API_KEY="your_groq_api_key"
    OPENAI_API_KEY="your_openai_api_key"
    ```

- **Dependencies Installation**
  - Install the global dependencies:

    ```bash
    pip install -r requirements.txt
    ```

- **Running Applications**
  - Each project can be run independently:

    ```bash
    cd [project_folder]
    streamlit run app.py
    ```

  - Open your browser at http://localhost:... (Streamlit prints the exact local URL when it starts)
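A quick note on how the keys in `.env` typically reach the code: each project reads them from the environment, for example with `python-dotenv` as sketched below (treat the package as an assumption if a given project's requirements.txt does not list it).

```python
# Load API keys from the .env file into the process environment.
# Assumes python-dotenv is installed (pip install python-dotenv).
import os
from dotenv import load_dotenv

load_dotenv()  # reads .env from the current working directory

groq_key = os.getenv("GROQ_API_KEY")
openai_key = os.getenv("OPENAI_API_KEY")
if not groq_key or not openai_key:
    raise RuntimeError("Missing API keys: check your .env file")
```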
Contributions are welcome! If you have improvements or additional examples to share:
- Fork the repository
- Create your feature branch
- Submit a pull request
I'm Duygu Jones, a Machine Learning/AI Engineer passionate about LLMs and AI applications.
♻️ Connect with me:
- LinkedIn: Linkedin/duygujones
- Website: duygujones.com
- Kaggle: kaggle.com/duygujones
- GitHub: github.com/Duygu-Jones
- Medium: medium.com/@duygujones
💫 If you find this repository helpful, please give it a ⭐ star!
This repository is licensed under the MIT License. See the LICENSE file for details.