
standardize providers to offer .chat() method (#1383)
lgrammel authored Apr 18, 2024
1 parent e94fb32 commit 587240b
Showing 23 changed files with 59 additions and 33 deletions.
6 changes: 6 additions & 0 deletions .changeset/empty-houses-kick.md
@@ -0,0 +1,6 @@
---
'@ai-sdk/anthropic': patch
'@ai-sdk/google': patch
---

Standardize providers to offer .chat() method
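For orientation, here is a minimal sketch of the standardized surface this change introduces, using the default provider instances and model ids that appear in the diffs below (both packages now expose the same `.chat()` factory name):

```ts
import { anthropic } from '@ai-sdk/anthropic';
import { google } from '@ai-sdk/google';

// Both providers now offer the same .chat() factory method.
const claude = anthropic.chat('claude-3-haiku-20240307');
const gemini = google.chat('models/gemini-pro');
```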
6 changes: 3 additions & 3 deletions docs/pages/docs/ai-core/anthropic.mdx
@@ -60,19 +60,19 @@ import { anthropic } from '@ai-sdk/anthropic';

## Messages Models

You can create models that call the [Anthropic Messages API](https://docs.anthropic.com/claude/reference/messages_post) using the `.messages()` factory method.
You can create models that call the [Anthropic Messages API](https://docs.anthropic.com/claude/reference/messages_post) using the `.chat()` factory method.
The first argument is the model id, e.g. `claude-3-haiku-20240307`.
Some models have multi-modal capabilities.

```ts
const model = anthropic.messages('claude-3-haiku-20240307');
const model = anthropic.chat('claude-3-haiku-20240307');
```

Anthropic Messages models also support some model-specific settings that are not part of the [standard call settings](/docs/ai-core/settings).
You can pass them as an options argument:

```ts
const model = anthropic.messages('claude-3-haiku-20240307', {
const model = anthropic.chat('claude-3-haiku-20240307', {
topK: 0.2,
});
```
6 changes: 3 additions & 3 deletions docs/pages/docs/ai-core/google.mdx
@@ -58,19 +58,19 @@ import { google } from '@ai-sdk/google';

## Generative AI Models

You can create models that call the [Google Generative AI API](https://ai.google.dev/api/rest) using the `.generativeAI()` factory method.
You can create models that call the [Google Generative AI API](https://ai.google.dev/api/rest) using the `.chat()` factory method.
The first argument is the model id, e.g. `models/gemini-pro`.
The models support tool calls and some have multi-modal capabilities.

```ts
const model = google.generativeAI('models/gemini-pro');
const model = google.chat('models/gemini-pro');
```

Google Generative AI models also support some model-specific settings that are not part of the [standard call settings](/docs/ai-core/settings).
You can pass them as an options argument:

```ts
const model = google.generativeAI('models/gemini-pro', {
const model = google.chat('models/gemini-pro', {
topK: 0.2,
});
```
2 changes: 1 addition & 1 deletion examples/ai-core/src/generate-object/anthropic.ts
@@ -7,7 +7,7 @@ dotenv.config();

async function main() {
const result = await experimental_generateObject({
model: anthropic.messages('claude-3-opus-20240229'),
model: anthropic.chat('claude-3-opus-20240229'),
schema: z.object({
recipe: z.object({
name: z.string(),
@@ -23,7 +23,7 @@ async function main() {
}

const { text, toolCalls, toolResults } = await experimental_generateText({
model: anthropic.messages('claude-3-opus-20240229'),
model: anthropic.chat('claude-3-opus-20240229'),
tools: { weatherTool },
system: `You are a helpful, respectful and honest assistant.`,
messages,
2 changes: 1 addition & 1 deletion examples/ai-core/src/generate-text/anthropic-multimodal.ts
@@ -7,7 +7,7 @@ dotenv.config();

async function main() {
const result = await experimental_generateText({
model: anthropic.messages('claude-3-haiku-20240307'),
model: anthropic.chat('claude-3-haiku-20240307'),
maxTokens: 512,
messages: [
{
2 changes: 1 addition & 1 deletion examples/ai-core/src/generate-text/anthropic-tool-call.ts
@@ -8,7 +8,7 @@ dotenv.config();

async function main() {
const result = await experimental_generateText({
model: anthropic.messages('claude-3-opus-20240229'),
model: anthropic.chat('claude-3-opus-20240229'),
maxTokens: 512,
tools: {
weather: weatherTool,
2 changes: 1 addition & 1 deletion examples/ai-core/src/generate-text/anthropic.ts
@@ -6,7 +6,7 @@ dotenv.config();

async function main() {
const result = await experimental_generateText({
model: anthropic.messages('claude-3-haiku-20240307'),
model: anthropic.chat('claude-3-haiku-20240307'),
prompt: 'Invent a new holiday and describe its traditions.',
});

2 changes: 1 addition & 1 deletion examples/ai-core/src/generate-text/google-multimodal.ts
@@ -7,7 +7,7 @@ dotenv.config();

async function main() {
const result = await experimental_generateText({
model: google.generativeAI('models/gemini-pro-vision'),
model: google.chat('models/gemini-pro-vision'),
maxTokens: 512,
messages: [
{
2 changes: 1 addition & 1 deletion examples/ai-core/src/generate-text/google-tool-call.ts
@@ -8,7 +8,7 @@ dotenv.config();

async function main() {
const result = await experimental_generateText({
model: google.generativeAI('models/gemini-pro'),
model: google.chat('models/gemini-pro'),
maxTokens: 512,
tools: {
weather: weatherTool,
2 changes: 1 addition & 1 deletion examples/ai-core/src/generate-text/google.ts
@@ -6,7 +6,7 @@ dotenv.config();

async function main() {
const result = await experimental_generateText({
model: google.generativeAI('models/gemini-pro'),
model: google.chat('models/gemini-pro'),
prompt: 'Invent a new holiday and describe its traditions.',
});

2 changes: 1 addition & 1 deletion examples/ai-core/src/stream-text/anthropic-multimodal.ts
@@ -7,7 +7,7 @@ dotenv.config();

async function main() {
const result = await experimental_streamText({
model: anthropic.messages('claude-3-haiku-20240307'),
model: anthropic.chat('claude-3-haiku-20240307'),
maxTokens: 512,
messages: [
{
2 changes: 1 addition & 1 deletion examples/ai-core/src/stream-text/anthropic.ts
@@ -6,7 +6,7 @@ dotenv.config();

async function main() {
const result = await experimental_streamText({
model: anthropic.messages('claude-3-haiku-20240307'),
model: anthropic.chat('claude-3-haiku-20240307'),
prompt: 'Invent a new holiday and describe its traditions.',
});

@@ -28,7 +28,7 @@ async function main() {
}

const result = await experimental_streamText({
model: google.generativeAI('models/gemini-pro'),
model: google.chat('models/gemini-pro'),
tools: { weatherTool },
system: `You are a helpful, respectful and honest assistant.`,
messages,
Expand Down
2 changes: 1 addition & 1 deletion examples/ai-core/src/stream-text/google-chatbot.ts
@@ -19,7 +19,7 @@ async function main() {
messages.push({ role: 'user', content: userInput });

const result = await experimental_streamText({
model: google.generativeAI('models/gemini-pro'),
model: google.chat('models/gemini-pro'),
system: `You are a helpful, respectful and honest assistant.`,
messages,
});
2 changes: 1 addition & 1 deletion examples/ai-core/src/stream-text/google-fullstream.ts
@@ -8,7 +8,7 @@ dotenv.config();

async function main() {
const result = await experimental_streamText({
model: google.generativeAI('models/gemini-pro'),
model: google.chat('models/gemini-pro'),
tools: {
weather: weatherTool,
cityAttractions: {
2 changes: 1 addition & 1 deletion examples/ai-core/src/stream-text/google.ts
@@ -6,7 +6,7 @@ dotenv.config();

async function main() {
const result = await experimental_streamText({
model: google.generativeAI('models/gemini-pro'),
model: google.chat('models/gemini-pro'),
prompt: 'Invent a new holiday and describe its traditions.',
});

6 changes: 3 additions & 3 deletions packages/anthropic/README.md
@@ -34,19 +34,19 @@ import { anthropic } from '@ai-sdk/anthropic';

## Messages Models

You can create models that call the [Anthropic Messages API](https://docs.anthropic.com/claude/reference/messages_post) using the `.messages()` factory method.
You can create models that call the [Anthropic Messages API](https://docs.anthropic.com/claude/reference/messages_post) using the `.chat()` factory method.
The first argument is the model id, e.g. `claude-3-haiku-20240307`.
Some models have multi-modal capabilities.

```ts
const model = anthropic.messages('claude-3-haiku-20240307');
const model = anthropic.chat('claude-3-haiku-20240307');
```

Anthropic Messages models also support some model-specific settings that are not part of the [standard call settings](/docs/ai-core/settings).
You can pass them as an options argument:

```ts
const model = anthropic.messages('claude-3-haiku-20240307', {
const model = anthropic.chat('claude-3-haiku-20240307', {
topK: 0.2,
});
```
10 changes: 10 additions & 0 deletions packages/anthropic/src/anthropic-facade.ts
@@ -60,9 +60,19 @@ export class Anthropic {
};
}

/**
* @deprecated Use `chat()` instead.
*/
messages(
modelId: AnthropicMessagesModelId,
settings: AnthropicMessagesSettings = {},
) {
return this.chat(modelId, settings);
}

chat(
modelId: AnthropicMessagesModelId,
settings: AnthropicMessagesSettings = {},
) {
return new AnthropicMessagesLanguageModel(modelId, settings, {
provider: 'anthropic.messages',
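The deprecated alias keeps existing call sites working: `messages()` now simply delegates to `chat()`. A small sketch of what that means for downstream code, reusing a model id from the examples above:

```ts
import { anthropic } from '@ai-sdk/anthropic';

// Deprecated, but still works — delegates to chat() under the hood.
const legacy = anthropic.messages('claude-3-haiku-20240307');

// Preferred going forward.
const current = anthropic.chat('claude-3-haiku-20240307');
```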
@@ -15,7 +15,7 @@ const anthropic = new Anthropic({
apiKey: 'test-api-key',
});

const model = anthropic.messages('claude-3-haiku-20240307');
const model = anthropic.chat('claude-3-haiku-20240307');

describe('doGenerate', () => {
const server = new JsonTestServer('https://api.anthropic.com/v1/messages');
@@ -52,7 +52,7 @@ describe('doGenerate', () => {
it('should extract text response', async () => {
prepareJsonResponse({ content: [{ type: 'text', text: 'Hello, World!' }] });

const { text } = await anthropic.messages('gpt-3.5-turbo').doGenerate({
const { text } = await anthropic.chat('gpt-3.5-turbo').doGenerate({
inputFormat: 'prompt',
mode: { type: 'regular' },
prompt: TEST_PROMPT,
@@ -204,7 +204,7 @@ describe('doGenerate', () => {
apiKey: 'test-api-key',
});

await anthropic.messages('claude-3-haiku-20240307').doGenerate({
await anthropic.chat('claude-3-haiku-20240307').doGenerate({
inputFormat: 'prompt',
mode: { type: 'regular' },
prompt: TEST_PROMPT,
@@ -283,7 +283,7 @@ describe('doStream', () => {
apiKey: 'test-api-key',
});

await anthropic.messages('claude-3-haiku-2024').doStream({
await anthropic.chat('claude-3-haiku-2024').doStream({
inputFormat: 'prompt',
mode: { type: 'regular' },
prompt: TEST_PROMPT,
6 changes: 3 additions & 3 deletions packages/google/README.md
@@ -30,19 +30,19 @@ import { google } from '@ai-sdk/google';

## Generative AI Models

You can create models that call the [Google Generative AI API](https://ai.google.dev/api/rest) using the `.generativeAI()` factory method.
You can create models that call the [Google Generative AI API](https://ai.google.dev/api/rest) using the `.chat()` factory method.
The first argument is the model id, e.g. `models/gemini-pro`.
The models support tool calls and some have multi-modal capabilities.

```ts
const model = google.generativeAI('models/gemini-pro');
const model = google.chat('models/gemini-pro');
```

Google Generative AI models also support some model-specific settings that are not part of the [standard call settings](/docs/ai-core/settings).
You can pass them as an options argument:

```ts
const model = google.generativeAI('models/gemini-pro', {
const model = google.chat('models/gemini-pro', {
topK: 0.2,
});
```
10 changes: 10 additions & 0 deletions packages/google/src/google-facade.ts
@@ -65,9 +65,19 @@ export class Google {
};
}

/**
* @deprecated Use `chat()` instead.
*/
generativeAI(
modelId: GoogleGenerativeAIModelId,
settings: GoogleGenerativeAISettings = {},
) {
return this.chat(modelId, settings);
}

chat(
modelId: GoogleGenerativeAIModelId,
settings: GoogleGenerativeAISettings = {},
) {
return new GoogleGenerativeAILanguageModel(modelId, settings, {
provider: 'google.generative-ai',
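The Google facade mirrors the same pattern: `generativeAI()` remains as a deprecated alias that forwards both the model id and the settings object to `chat()`. A brief sketch, reusing the settings example from the README above:

```ts
import { google } from '@ai-sdk/google';

// Deprecated alias — forwards modelId and settings to chat().
const legacy = google.generativeAI('models/gemini-pro', { topK: 0.2 });

// Equivalent call via the standardized factory.
const current = google.chat('models/gemini-pro', { topK: 0.2 });
```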
@@ -33,7 +33,7 @@ const google = new Google({
apiKey: 'test-api-key',
generateId: () => 'test-id',
});
const model = google.generativeAI('models/gemini-pro');
const model = google.chat('models/gemini-pro');

describe('doGenerate', () => {
const server = new JsonTestServer(
@@ -151,7 +151,7 @@ describe('doGenerate', () => {
prepareJsonResponse({ content: '' });

const google = new Google({ apiKey: 'test-api-key' });
const model = google.generativeAI('models/gemini-pro');
const model = google.chat('models/gemini-pro');

await model.doGenerate({
inputFormat: 'prompt',
@@ -230,7 +230,7 @@ describe('doStream', () => {

const google = new Google({ apiKey: 'test-api-key' });

await google.generativeAI('models/gemini-pro').doStream({
await google.chat('models/gemini-pro').doStream({
inputFormat: 'prompt',
mode: { type: 'regular' },
prompt: TEST_PROMPT,
