Fix passing top_k parameter for Bedrock Anthropic models #8131

Conversation

@vibhavbhat (Contributor) commented on Jan 31, 2025

Title

Fix passing top_k parameter for Bedrock Anthropic models

Relevant issues

Fixes #7782

Type

πŸ› Bug Fix
βœ… Test

Changes

The bug arose because different Bedrock models pass the top_k parameter in different ways.

Specifically, Nova models pass the parameter through

additionalModelRequestFields = {
    "inferenceConfig": {
        "topK": 20
    }
}

while Anthropic models pass it through

additional_model_fields = {"top_k": top_k}

This PR checks the model type and sets the parameter accordingly (a rough sketch follows below). Right now this is a simple if statement, but a longer-term fix might be to create a class for each model family and put the handling logic for supported and model-specific params inside each class, so that the overall Converse handler does not need to know about these specifics.
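
For illustration only, a minimal sketch of that branch; the helper name and model check below are assumptions for this example, not litellm's actual internals:

def build_additional_model_fields(model: str, top_k: int) -> dict:
    # Hypothetical helper for this example, not the real litellm code.
    # Nova expects a camelCase topK nested under inferenceConfig, while
    # Anthropic (and similar Converse models) take a flat snake_case top_k.
    if "nova" in model.lower():
        return {"inferenceConfig": {"topK": top_k}}
    return {"top_k": top_k}

# Example usage:
# build_additional_model_fields("bedrock/us.amazon.nova-pro-v1:0", 20)
#   -> {"inferenceConfig": {"topK": 20}}
# build_additional_model_fields("bedrock/anthropic.claude-3-5-sonnet-20240620-v1:0", 20)
#   -> {"top_k": 20}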

This PR also simplifies the additional_model_fields creation logic.

[REQUIRED] Testing - Attach a screenshot of any new tests passing locally

If UI changes, send a screenshot/GIF of working UI fixes

Tested the top_k param for 4 different models w/ real API calls:
[screenshot: top_k tests passing against the live Bedrock APIs]

Tests still passed w/ mocks:
[screenshot: mocked tests passing]


@ishaan-jaff (Contributor) left a comment:

LGTM overall, just a couple minor points of feedback

On this diff hunk:

for k, v in inference_params.items():
    if (
        k not in supported_converse_params
        and k not in supported_tool_call_params

A reviewer (Contributor) commented:

Is this deletion required? I'm pretty sure we need to filter out specific guardrail params.

@vibhavbhat (Author) replied:

I just wanted to simplify the code a bit (using a list comprehension instead of adding to and then popping from the dict). From my understanding, the before and after logic should be functionally equivalent, but if I'm wrong, I can revert it.
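
For illustration, a minimal sketch of the comprehension-style filter being described, wrapped in a hypothetical helper; the parameter names mirror the diff hunk above, but the function itself is an assumption, not the exact litellm code:

def filter_additional_model_fields(
    inference_params: dict,
    supported_converse_params: set,
    supported_tool_call_params: set,
) -> dict:
    # Keep only the params the Converse API does not natively support; these
    # are then passed through to Bedrock as additional model request fields.
    return {
        k: v
        for k, v in inference_params.items()
        if k not in supported_converse_params and k not in supported_tool_call_params
    }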

"bedrock/mistral.mistral-7b-instruct-v0:2",
]
)
def test_bedrock_top_k(model):

A reviewer (Contributor) commented:

Can you send a screenshot of this test working for all the API calls here?

Another Contributor replied:

I think he did.

@vibhavbhat (Author) replied:

Yeah, I have added it in the description; let me know if you need more screenshots.

@krrishdholakia (Contributor) commented:

Is this okay to merge? @ishaan-jaff

@vibhavbhat (Author) commented:

@ishaan-jaff Just following up to see if this is fine to merge.

@ishaan-jaff (Contributor) left a comment:

LGTM, merging into the staging branch to run through testing.

A review thread on tests/llm_translation/test_bedrock_completion.py was marked resolved (outdated).
@ishaan-jaff changed the base branch from main to litellm_contributor_feb3 on February 4, 2025, 06:10
@ishaan-jaff merged commit 24260f0 into BerriAI:litellm_contributor_feb3 on Feb 4, 2025
2 checks passed
ishaan-jaff added a commit that referenced this pull request on Feb 5, 2025:
* Fix Bedrock Anthropic topK bug

* Remove extra import

* Add unit test + make tests mocked

* Fix camel case

* Fix tests to remove exception handling

Co-authored-by: vibhavbhat <vibhavb00@gmail.com>
Successfully merging this pull request may close this issue:

[Bug]: Cannot pass provider-specific parameters to Bedrock Anthropic models (#7782)