Support multiple LLMs through Litellm #222
Comments
Need to think about how to select the correct provider, model and key
/bounty 10
💎 $50 bounty • potpie.ai
Thank you for contributing to potpie-ai/potpie!
/bounty 50
/attempt #222
/attempt #222
@dhirenmathur do we have to support all LLMs with litellm, or, for the LLMs already present in the current application, do we just have to switch from langchain to litellm?
And do I have to completely remove langchain in all aspects, like prompt templates and tool calls, or just the chat?
hey @PRANJALRANA11, we want to incorporate litellm across the app so that we can support all LLMs. We currently use instances of the langchain client and the crewai client, which makes the provider_service and agent code really messy; ideally we want to replace both with a single litellm instance. Yes, if we are not going to use the langchain client then it does not make sense to keep the prompt templates etc. For tools, I think we can continue with langchain tools; what do you think?
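To illustrate the "single litellm instance" idea above, here is a minimal sketch of one chat entry point replacing per-provider langchain clients. This is a hypothetical helper, not potpie's actual code: `PROVIDER_PREFIXES`, `litellm_model_name`, and `chat` are illustrative names, and the sketch assumes litellm's convention of routing on a `"<provider>/<model>"` model string.

```python
from typing import Dict, List

# Hypothetical provider prefix map; litellm routes requests based on the
# "<provider>/<model>" string passed as `model`.
PROVIDER_PREFIXES = {
    "openai": "openai",
    "anthropic": "anthropic",
    "ollama": "ollama",
}

def litellm_model_name(provider: str, model: str) -> str:
    """Build the "<provider>/<model>" string that litellm expects."""
    prefix = PROVIDER_PREFIXES.get(provider)
    if prefix is None:
        raise ValueError(f"unsupported provider: {provider}")
    return f"{prefix}/{model}"

def chat(provider: str, model: str, api_key: str,
         messages: List[Dict[str, str]]) -> str:
    """One chat call path for every provider, instead of a langchain
    client per provider."""
    from litellm import completion  # lazy import; requires `pip install litellm`
    response = completion(
        model=litellm_model_name(provider, model),
        messages=messages,
        api_key=api_key,
    )
    return response.choices[0].message.content
```

With a shape like this, adding a new provider becomes a one-line map entry rather than a new client class, which is the main cleanup the comment above is asking for.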
Hey @dfordp, thanks for attempting this! In the interest of time, I'm assigning this to @PRANJALRANA11 since there have been no follow-up messages in ~10 days. We will have other open issues for you to contribute to when you're ready!
@dhirenmathur sorry I wasn't able to do it; I was about 50% of the way through but got busy with work.
Replace LangChain Chat Clients with LiteLLM
Objective
Replace all LangChain chat client implementations with LiteLLM while maintaining existing functionality.
Provide APIs to manage the user's LLM choice and keys, and ensure that the right LLM is selected during processing and agent execution.
Requirements
Implementation
Testing
Success Criteria