Add Azure OpenAI support #276

Closed
vladiliescu opened this issue Jan 20, 2025 · 6 comments

Comments

@vladiliescu
Contributor

vladiliescu commented Jan 20, 2025

I'd very much love to have native Azure OpenAI support for smolagents.

I understand that writing it is easy, but you may not want the burden of maintaining it indefinitely, and you're trying to keep the surface area smol until the feature set becomes more complete and things are fleshed out. I've seen PR #161.

I'd like to propose a simpler approach. Since smolagents already has an OpenAIServerModel implementation, we can subclass it and get all of its current and future functionality for free; all we need to do is replace the openai client with the Azure flavor. The burden of maintaining this class in the future is therefore dramatically decreased.

Would the implementation below be acceptable to you? If yes, I can send a PR. If not, I understand the reasons :).

from typing import Dict, Optional

from smolagents import OpenAIServerModel


class AzureOpenAIServerModel(OpenAIServerModel):
    """This model connects to an Azure OpenAI deployment.

    Parameters:
        model_id (`str`):
            The model identifier to use on the server (e.g. "gpt-3.5-turbo").
        azure_endpoint (`str`, *optional*):
            The Azure endpoint, including the resource, e.g. `https://example-resource.openai.azure.com/`
        api_key (`str`, *optional*):
            The API key to use for authentication.
        api_version (`str`, *optional*):
            The Azure OpenAI API version to use.
        custom_role_conversions (`Dict[str, str]`, *optional*):
            Custom role conversion mapping to convert message roles into others.
            Useful for specific models that do not support specific message roles like "system".
        **kwargs:
            Additional keyword arguments to pass to the Azure OpenAI API.
    """

    def __init__(
        self,
        model_id: str,
        azure_endpoint: Optional[str] = None,
        api_key: Optional[str] = None,
        api_version: Optional[str] = None,
        custom_role_conversions: Optional[Dict[str, str]] = None,
        **kwargs,
    ):
        super().__init__(model_id=model_id, api_key=api_key, custom_role_conversions=custom_role_conversions, **kwargs)
        # if we've reached this point, it means the openai package is available (baseclass check) so go ahead and import it
        import openai

        self.client = openai.AzureOpenAI(
            api_key=api_key,
            api_version=api_version,
            azure_endpoint=azure_endpoint
        )
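
For reference, usage would look roughly like this once the class is exported from smolagents; the deployment name, environment variable names, and API version below are just placeholders you'd swap for your own:

import os

# Sketch only: assumes the AzureOpenAIServerModel class above ships in smolagents
from smolagents import AzureOpenAIServerModel, CodeAgent

model = AzureOpenAIServerModel(
    model_id="my-gpt-4o-deployment",                      # the name of your Azure OpenAI deployment
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],   # e.g. https://example-resource.openai.azure.com/
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-06-01",                             # whichever API version your deployment supports
)

agent = CodeAgent(tools=[], model=model)
agent.run("Summarize the main differences between Azure OpenAI and the public OpenAI API.")
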
@aymeric-roucher
Collaborator

Hey, that's a much better version! Thanks a lot for proposing this implementation: it would work, so please go ahead and open a PR 🚀
Also, take care of adding this to the docs guided tour under a new option (check the raw code to understand how the hfoption tags work).

@vladiliescu
Contributor Author

Will do, thanks!

@matthewcarbone

@vladiliescu just wanted to say that this worked seamlessly for me, thank you!

@vladiliescu
Contributor Author

Great to hear, @matthewcarbone!

@aymeric-roucher
Collaborator

Closing this since the PR is merged. Thank you, @vladiliescu!

@vladiliescu
Contributor Author

Perfect, thank you as well, @aymeric-roucher!
