
add back streaming for base o3 #8361

Merged: 1 commit merged into litellm_dev_02_07_2025_p3 on Feb 8, 2025

Conversation

@mbosc (Contributor) commented Feb 7, 2025

No description provided.

vercel bot commented Feb 7, 2025

litellm preview deployment: ✅ Ready (updated Feb 7, 2025 6:10pm UTC)

@mbosc (Contributor, Author) commented Feb 7, 2025

Addresses issue #8360

Review thread on the diff:

if (
    model and "o3" in model
):  # o3 models support streaming - https://github.com/BerriAI/litellm/issues/8274
    return False
supported_stream_models = ["o1-mini", "o1-preview"]
Contributor:

you just need to add it here, instead of another if block @mbosc

@mbosc (Contributor, Author) Feb 7, 2025:

You want me to add the model to the list? I did it like this because I saw that the AzureOpenAI version of o_series_transformation uses this block, so I wanted to keep the code aligned:
https://github.com/BerriAI/litellm/blob/16be203283b25f1116c6acc9144b19711b10e909/litellm/llms/azure/chat/o_series_transformation.py#L37C1-L41C25

If you prefer, I'll just add o3-mini and o3-mini-2025-01-31 to the supported_stream_models list at line 61.
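For context, here is a rough sketch of the two alternatives being weighed in this thread. It assumes the diff sits inside a helper that tells litellm whether it must simulate streaming for a given model (a should_fake_stream-style check); the function names, signatures, and the way the allow-list is consulted below are assumptions for illustration, not code copied from the repository.

from typing import Optional


def should_fake_stream_option_a(model: Optional[str]) -> bool:
    """Option A (the PR as first submitted): an explicit o3 branch,
    mirroring the Azure o-series config linked above."""
    if model and "o3" in model:
        # o3 models support native streaming, so nothing needs to be faked.
        return False
    supported_stream_models = ["o1-mini", "o1-preview"]
    # Assumed tail of the check: fake streaming unless the model is allow-listed.
    return not any(m in (model or "") for m in supported_stream_models)


def should_fake_stream_option_b(model: Optional[str]) -> bool:
    """Option B (the reviewer's preference): extend the existing allow-list
    instead of adding another if block."""
    supported_stream_models = [
        "o1-mini",
        "o1-preview",
        "o3-mini",
        "o3-mini-2025-01-31",
    ]
    return not any(m in (model or "") for m in supported_stream_models)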

Contributor:

Sure, let's start there.

Contributor:

Perhaps we can update the check to see whether a substring such as o1-preview or o3-mini is in the model name.

That should ensure it keeps working for future versions.
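A minimal sketch of what that substring-based check might look like, assuming the same should_fake_stream-style helper as above; the names here are illustrative, not the merged implementation.

from typing import Optional

# Model-name prefixes whose releases support native streaming. Dated variants
# such as "o3-mini-2025-01-31" match by substring, so future versions are
# covered without editing the list.
SUPPORTED_STREAM_PREFIXES = ["o1-mini", "o1-preview", "o3-mini"]


def should_fake_stream(model: Optional[str]) -> bool:
    if not model:
        return True
    return not any(prefix in model for prefix in SUPPORTED_STREAM_PREFIXES)

With this shape, a dated name like o3-mini-2025-01-31 matches the o3-mini prefix without being listed explicitly.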

@mbosc (Contributor, Author):

OK, I'll rebase and re-commit.

@krrishdholakia changed the base branch from main to litellm_dev_02_07_2025_p3 on February 8, 2025, 01:11
@krrishdholakia merged commit 31ee977 into BerriAI:litellm_dev_02_07_2025_p3 on Feb 8, 2025
1 check passed
krrishdholakia added a commit that referenced this pull request Feb 8, 2025
* add back streaming for base o3 (#8361)
* test(base_llm_unit_tests.py): add base test for o-series models - ensure streaming always works
* fix(base_llm_unit_tests.py): fix test for o series models
* refactor: move test

Co-authored-by: Matteo Boschini <12133566+mbosc@users.noreply.github.com>
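The merged commit message above mentions a base test ensuring streaming always works for o-series models. As a rough illustration only (not the actual test in base_llm_unit_tests.py), such a check might look like the sketch below; the model name, skip condition, and assertions are assumptions.

import os

import pytest

import litellm


@pytest.mark.skipif(
    os.environ.get("OPENAI_API_KEY") is None,
    reason="requires an OpenAI API key",
)
def test_o_series_streaming_returns_chunks():
    # Illustrative sketch: request a streamed completion from an o-series model
    # and confirm that incremental chunks actually arrive.
    response = litellm.completion(
        model="o3-mini",
        messages=[{"role": "user", "content": "Say hello in one word."}],
        stream=True,
    )
    chunks = list(response)
    assert len(chunks) > 0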
abhijitherekar pushed a commit to acuvity/litellm that referenced this pull request Feb 20, 2025

* add back streaming for base o3 (BerriAI#8361)