Improve openai server, add more examples of model with LiteLLM #182
Conversation
@aymeric-roucher Hi. Please let me know if there is any problem with this PR. Should I close it if it is not needed?
Reviewed docstring change for **kwargs:
-    Additional keyword arguments to pass to the OpenAI API.
+    Additional optional arguments that will be passed directly to the
+    OpenAI API client, allowing for further customization.
Now we don't need any other parameters in OpenAIServerModel than api_key and api_base! organization and project can be passed as kwargs by the user, and temperature is already handled at the Model level.
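For illustration, a minimal sketch of what that could look like, assuming OpenAIServerModel forwards extra keyword arguments to the OpenAI client / completion call as described above; the model id, key, organization, and project values below are placeholders:

```python
from smolagents import OpenAIServerModel

# Hedged sketch: only api_key and api_base are dedicated parameters here;
# everything else is assumed to travel through **kwargs.
model = OpenAIServerModel(
    model_id="gpt-4o",                      # placeholder model id
    api_base="https://api.openai.com/v1",   # optional, could also come from env
    api_key="sk-placeholder",               # optional, could also come from env
    organization="org-placeholder",         # placeholder, passed via **kwargs
    project="proj-placeholder",             # placeholder, passed via **kwargs
    temperature=0.2,                        # handled at the Model level
)
```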
Thank you @duydl, and sorry for the delay in getting back; this needed to wait for a refactor of the parameters. Now we don't need the additional parameters, so once those are removed we can merge this PR! 😃
@duydl following up on the above: when you get a chance, please remove the additional parameters introduced, and then we can merge the documentation improvements! 🤗
I tried to add the Azure API but realized it is already supported via LiteLLM.
OpenAI models are also supported by both LiteLLM and the OpenAIServer class. For the OpenAIServer class, api_key and api_base should be optional, since they can be supplied through environment variables instead.
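As a rough illustration of those two points (Azure routed through LiteLLM, and env-var fallback for the OpenAI-compatible server), here is a hedged sketch; the deployment name and the environment variable names are assumptions based on LiteLLM's usual azure/ routing, not something verified in this PR:

```python
import os
from smolagents import LiteLLMModel, OpenAIServerModel

# Azure-hosted model routed through LiteLLM; LiteLLM typically reads
# AZURE_API_KEY / AZURE_API_BASE / AZURE_API_VERSION from the environment.
azure_model = LiteLLMModel(model_id="azure/my-gpt-4o-deployment")  # placeholder deployment name

# OpenAI-compatible server: omit api_key/api_base and rely on env vars instead.
os.environ.setdefault("OPENAI_API_KEY", "sk-placeholder")  # placeholder key
openai_model = OpenAIServerModel(model_id="gpt-4o")
```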