
llms/anthropic: adds full support for messages api #707

Merged 7 commits into tmc:main on Mar 21, 2024

Conversation

joeychilson
Contributor

PR Checklist

  • Read the Contributing documentation.
  • Read the Code of conduct documentation.
  • Name your Pull Request title clearly, concisely, and prefixed with the name of the primarily affected package you changed according to Good commit messages (such as memory: add interfaces for X, Y or util: add whizzbang helpers).
  • Check that there isn't already a PR that solves the problem the same way to avoid creating a duplicate.
  • Provide a description in this PR that addresses what the PR is solving, or reference the issue that it solves (e.g. Fixes #123).
  • Describes the source of new concepts.
  • References existing implementations as appropriate.
  • Contains test coverage for new functions.
  • Passes all golangci-lint checks.

This adds full support for the new messages API including streaming.
It is backwards compatible, with a client option flag for using the messages API.
It also updates the default model to 1.3, as 1.0 isn't supported by either the completions or messages API.

Closes #695
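
For illustration, a minimal sketch of how a backwards-compatibility flag like this could be wired up with Go's functional-options pattern; the names here (WithMessagesAPI, useCompletionsAPI, the "claude-1.3" default) are assumptions for the sketch, not necessarily the identifiers used in this PR:

package main

import "fmt"

type options struct {
	useCompletionsAPI bool   // true: legacy text completions; false: new messages API
	model             string // default model name
}

// Option mutates the client configuration.
type Option func(*options)

// WithMessagesAPI opts the client into the new messages API.
func WithMessagesAPI() Option {
	return func(o *options) { o.useCompletionsAPI = false }
}

func newClient(opts ...Option) *options {
	o := &options{
		useCompletionsAPI: true,         // completions stay the default so existing callers are unaffected
		model:             "claude-1.3", // per the description, 1.0 is no longer supported
	}
	for _, opt := range opts {
		opt(o)
	}
	return o
}

func main() {
	legacy := newClient()                  // unchanged callers keep the completions API
	modern := newClient(WithMessagesAPI()) // explicit opt-in to the messages API
	fmt.Println(legacy.useCompletionsAPI, modern.useCompletionsAPI) // true false
}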

@joeychilson joeychilson changed the title llms/anthropic: added full support for messages api llms/anthropic: adds full support for messages api Mar 21, 2024
@joeychilson
Contributor Author

I updated this to remove the fatals from the stream processing. I think it's better to pass the errors back instead of crashing, and to let the user handle them. This differs from the other LLM implementations, like OpenAI, which I used as a guideline.
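
Roughly, the difference described here looks like the sketch below: decode failures in the stream loop are returned to the caller rather than terminating the process with log.Fatal. The event shape and function names are hypothetical, not the PR's actual internal client:

package main

import (
	"bufio"
	"encoding/json"
	"fmt"
	"io"
	"strings"
)

// streamEvent is a hypothetical stand-in for one decoded server-sent event.
type streamEvent struct {
	Type string `json:"type"`
	Text string `json:"text"`
}

// parseStream reads newline-delimited JSON events and hands each one to the
// callback. Errors are returned to the caller instead of crashing the process.
func parseStream(r io.Reader, onEvent func(streamEvent) error) error {
	scanner := bufio.NewScanner(r)
	for scanner.Scan() {
		line := scanner.Bytes()
		if len(line) == 0 {
			continue
		}
		var ev streamEvent
		if err := json.Unmarshal(line, &ev); err != nil {
			return fmt.Errorf("decode stream event: %w", err)
		}
		if err := onEvent(ev); err != nil {
			return err
		}
	}
	return scanner.Err()
}

func main() {
	body := strings.NewReader(`{"type":"content_block_delta","text":"Hello"}` + "\nnot json\n")
	err := parseStream(body, func(ev streamEvent) error {
		fmt.Printf("%s: %q\n", ev.Type, ev.Text)
		return nil
	})
	if err != nil {
		fmt.Println("caller decides how to handle the stream error:", err)
	}
}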

Owner

@tmc tmc left a comment


Great contribution! Some nits, but they're for the internal client.


usage, ok := event["usage"].(map[string]interface{})
if !ok {
	return response, errors.New("invalid usage field type")
}
Owner

While this is an internal package, it's generally better to define these errors up top so calling code can use errors.Is checks.
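
A small sketch of this suggestion, with an illustrative sentinel name (ErrInvalidUsageField) rather than whatever the internal client ended up defining:

package main

import (
	"errors"
	"fmt"
)

// Sentinel error defined at the top of the package so callers can branch on
// it with errors.Is instead of matching error strings.
var ErrInvalidUsageField = errors.New("anthropic: invalid usage field type")

// parseUsage pulls the usage object out of a decoded streaming event.
func parseUsage(event map[string]interface{}) (map[string]interface{}, error) {
	usage, ok := event["usage"].(map[string]interface{})
	if !ok {
		return nil, ErrInvalidUsageField
	}
	return usage, nil
}

func main() {
	_, err := parseUsage(map[string]interface{}{"usage": "oops"})
	if errors.Is(err, ErrInvalidUsageField) {
		fmt.Println("caller can detect the specific failure:", err)
	}
}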

token:             os.Getenv(tokenEnvVarName),
baseURL:           anthropicclient.DefaultBaseURL,
httpClient:        http.DefaultClient,
useCompletionsAPI: true,
Owner

I'd actually like to default to the messages API
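
Flipping that default could look like the sketch below, mirroring the earlier option sketch but with the messages API on by default and an explicit legacy opt-in; the option name is again an assumption for illustration:

package main

import "fmt"

type options struct {
	useCompletionsAPI bool
}

type Option func(*options)

// WithLegacyTextCompletions opts back into the old completions endpoint.
func WithLegacyTextCompletions() Option {
	return func(o *options) { o.useCompletionsAPI = true }
}

func newClient(opts ...Option) *options {
	o := &options{
		useCompletionsAPI: false, // messages API becomes the default
	}
	for _, opt := range opts {
		opt(o)
	}
	return o
}

func main() {
	fmt.Println(newClient().useCompletionsAPI)                            // false: messages API by default
	fmt.Println(newClient(WithLegacyTextCompletions()).useCompletionsAPI) // true: explicit legacy opt-in
}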

@tmc tmc enabled auto-merge (squash) March 21, 2024 20:56
@tmc tmc merged commit 4ad2e7d into tmc:main Mar 21, 2024
3 checks passed
Development

Successfully merging this pull request may close these issues.

anthropic: Support anthropic messages api
2 participants