add anthropic native support #5695

Merged: 16 commits merged into main from anthropic_client_vd on Feb 26, 2025
Conversation

@victordibia (Collaborator) commented Feb 25, 2025

Claude 3.7 just came out. It's a pretty capable model, and it would be great to support it in AutoGen.
This could augment the already excellent support we have for Anthropic via the SKAdapters in the following ways:

  • Based on the ChatCompletion API, similar to the Ollama and OpenAI clients.
  • Configurable/serializable (the client config can be dumped and reloaded), which means it can be used easily in AGS; see the sketch below this list.
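A minimal sketch of the configuration round trip, not taken verbatim from the PR: it assumes the new client follows the same Component serialization interface (dump_component / load_component) that the existing OpenAI client uses, and that ANTHROPIC_API_KEY is set in the environment.

from autogen_core.models import ChatCompletionClient
from autogen_ext.models.anthropic import AnthropicChatCompletionClient

# Assumes ANTHROPIC_API_KEY is available in the environment.
client = AnthropicChatCompletionClient(model="claude-3-7-sonnet-20250219")

# Dump the client to a serializable component config (e.g. for storing in AGS).
config = client.dump_component()
print(config.model_dump_json())

# Recreate an equivalent client from the dumped config.
restored_client = ChatCompletionClient.load_component(config)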

What is Supported

(The video below shows the client being used in AutoGen Studio.)
https://github.com/user-attachments/assets/8fb7c17c-9f9c-4525-aa9c-f256aad0f40b

  • Streaming
  • Tool calling / function calling
  • Drop-in integration with AssistantAgent
  • Multimodal support (see the sketch after the streaming example below)
from dotenv import load_dotenv

from autogen_agentchat.agents import AssistantAgent
from autogen_agentchat.ui import Console
from autogen_ext.models.anthropic import AnthropicChatCompletionClient

# Load ANTHROPIC_API_KEY from a .env file.
load_dotenv()

model_client = AnthropicChatCompletionClient(
    model="claude-3-7-sonnet-20250219",
)


async def get_weather(city: str) -> str:
    """Get the weather for a given city."""
    return f"The weather in {city} is 73 degrees and Sunny."


agent = AssistantAgent(
    name="weather_agent",
    model_client=model_client,
    tools=[get_weather],
    system_message="You are a helpful assistant.",
    # model_client_stream=True,  # enable to stream model tokens through the agent
)


# Run the agent and stream the messages to the console.
async def main() -> None:
    await Console(agent.run_stream(task="What is the weather in New York?"))


# Top-level await assumes a notebook; in a script, use asyncio.run(main()).
await main()

You can also stream responses directly from the model client:

from autogen_core.models import UserMessage

messages = [
    UserMessage(content="Write a very short story about a dragon.", source="user"),
]

# Create a stream (run inside an async context, e.g. a notebook cell).
stream = model_client.create_stream(messages=messages)

# Iterate over the stream and print the responses.
print("Streamed responses:")
async for response in stream:  # type: ignore
    if isinstance(response, str):
        # A partial response is a string.
        print(response, flush=True, end="")
    else:
        # The last response is a CreateResult object with the complete message.
        print("\n\n------------\n")
        print("The complete response:", flush=True)
        print(response.content, flush=True)
        print("\n\n------------\n")
        print("The token usage was:", flush=True)
        print(response.usage, flush=True)
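
For the multimodal support listed above, here is a minimal sketch, not taken from the PR: it assumes the Anthropic client accepts autogen_core.Image objects inside UserMessage content (the same convention the OpenAI client uses) and that photo.jpg is a placeholder file.

from PIL import Image as PILImage

from autogen_core import Image
from autogen_core.models import UserMessage

# Wrap a local image (placeholder path) as an autogen_core Image.
pil_image = PILImage.open("photo.jpg")
image = Image.from_pil(pil_image)

# Send text and an image together in a single user message.
multimodal_message = UserMessage(
    content=["What is shown in this image?", image],
    source="user",
)

# Top-level await assumes a notebook; in a script, wrap this in an async main().
result = await model_client.create(messages=[multimodal_message])
print(result.content)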

Why are these changes needed?

Related issue number

Closes #5205
Closes #5708

Checks

cc @rohanthacker


codecov bot commented Feb 25, 2025

Codecov Report

Attention: Patch coverage is 24.03259% with 373 lines in your changes missing coverage. Please review.

Project coverage is 73.47%. Comparing base (1f30622) to head (70e2250).
Report is 1 commit behind head on main.

Files with missing lines                                   Patch %   Lines
.../autogen_ext/models/anthropic/_anthropic_client.py      14.45%    361 Missing ⚠️
...xt/src/autogen_ext/models/anthropic/_model_info.py      33.33%    12 Missing ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##             main    #5695      +/-   ##
==========================================
- Coverage   75.75%   73.47%   -2.29%     
==========================================
  Files         171      175       +4     
  Lines       10637    11128     +491     
==========================================
+ Hits         8058     8176     +118     
- Misses       2579     2952     +373     
Flag        Coverage Δ
unittests   73.47% <24.03%> (-2.29%) ⬇️

Flags with carried forward coverage won't be shown. Click here to find out more.


@victordibia victordibia marked this pull request as ready for review February 25, 2025 21:42
Mostly just to ensure the UI provides the right elements to configure an Anthropic model.


## Checks

- [ ] I've included any doc changes needed for <https://microsoft.github.io/autogen/>. See <https://github.com/microsoft/autogen/blob/main/CONTRIBUTING.md> to build and test documentation locally.
- [ ] I've added tests (if relevant) corresponding to the changes introduced in this PR.
- [ ] I've made sure all auto checks have passed.
@victordibia victordibia enabled auto-merge (squash) February 26, 2025 07:23
@victordibia victordibia merged commit 05fc763 into main Feb 26, 2025
53 checks passed
@victordibia victordibia deleted the anthropic_client_vd branch February 26, 2025 07:27
Development

Successfully merging this pull request may close these issues:

  • Support Anthropic Client in AutoGen v0.4
  • Support for anthropic models in v0.4