Commit
Move AI providers to separate packages. (#1326)
lgrammel authored Apr 11, 2024
1 parent 20624a8 commit d544886
Showing 159 changed files with 1,229 additions and 187 deletions.
5 changes: 5 additions & 0 deletions .changeset/twenty-crabs-roll.md
@@ -0,0 +1,5 @@
---
'ai': patch
---

Breaking change: extract experimental AI core provider packages. They can now be imported with e.g. `import { openai } from '@ai-sdk/openai'` after adding them to a project.
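The rename introduced by this commit is mechanical: each `ai/<provider>` subpath import becomes the scoped package `@ai-sdk/<provider>`. As a rough migration aid (not part of the commit itself; the helper below is hypothetical, for illustration only), the rewrite can be sketched like this:

```typescript
// Provider subpaths extracted into scoped packages by this commit.
const PROVIDER_PACKAGES = ['openai', 'anthropic', 'google', 'mistral'];

// Rewrites legacy `ai/<provider>` module specifiers (single- or
// double-quoted) to the new `@ai-sdk/<provider>` packages, leaving the
// core 'ai' import untouched.
function migrateImports(source: string): string {
  return PROVIDER_PACKAGES.reduce(
    (code, name) =>
      code.replace(
        new RegExp(`(['"])ai/${name}\\1`, 'g'),
        `'@ai-sdk/${name}'`,
      ),
    source,
  );
}

console.log(migrateImports(`import { openai } from 'ai/openai';`));
// → import { openai } from '@ai-sdk/openai';
```

Remember to also add the new packages as dependencies (e.g. `pnpm add @ai-sdk/openai`), since they no longer ship inside `ai`.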
34 changes: 31 additions & 3 deletions docs/pages/docs/ai-core/anthropic.mdx
@@ -2,7 +2,7 @@
title: Anthropic Provider
---

import { Callout } from 'nextra-theme-docs';
import { Callout, Tabs, Tab } from 'nextra-theme-docs';

# Anthropic Provider

@@ -11,12 +11,40 @@ import { Callout } from 'nextra-theme-docs';
The Anthropic provider contains language model support for the [Anthropic Messages API](https://docs.anthropic.com/claude/reference/messages_post).
It creates language model objects that can be used with the `generateText` and `streamText` AI functions.

## Setup

The Anthropic provider is available in the `@ai-sdk/anthropic` module. You can install it with:

<Tabs items={['pnpm', 'npm', 'yarn']}>
<Tab>

```bash
pnpm add @ai-sdk/anthropic
```

</Tab>
<Tab>

```bash
npm i @ai-sdk/anthropic
```

</Tab>
<Tab>

```bash
yarn add @ai-sdk/anthropic
```

</Tab>
</Tabs>

## Provider Instance

You can import `Anthropic` from `ai/anthropic` and initialize a provider instance with various settings:

```ts
import { Anthropic } from 'ai/anthropic';
import { Anthropic } from '@ai-sdk/anthropic';

const anthropic = new Anthropic({
baseUrl: '', // optional base URL for proxies etc.
@@ -27,7 +55,7 @@ const anthropic = new Anthropic({
The AI SDK also provides a shorthand `anthropic` import with an Anthropic provider instance that uses defaults:

```ts
import { anthropic } from 'ai/anthropic';
import { anthropic } from '@ai-sdk/anthropic';
```

## Messages Models
22 changes: 13 additions & 9 deletions docs/pages/docs/ai-core/custom-provider.mdx
@@ -14,19 +14,21 @@ import { Callout } from 'nextra-theme-docs';
The AI SDK provides a language model specification.
You can write your own providers that adhere to the AI SDK language model specification and they will be compatible with the AI Core functions.

You can find the Language Model Specification in the [AI SDK repository](https://github.com/vercel/ai/tree/main/packages/specification/src/language-model/v1).
It can be imported from `ai/spec`.
You can find the Language Model Specification in the [AI SDK repository](https://github.com/vercel/ai/tree/main/packages/provider/src/language-model/v1).
It can be imported from `@ai-sdk/provider`.

We provide an [OpenAI reference implementation](https://github.com/vercel/ai/tree/main/packages/core/openai)
and a [Mistral reference implementation](https://github.com/vercel/ai/tree/main/packages/core/mistral).
We also provide utilities that make it easier to implement a custom provider. You can find them in the `@ai-sdk/provider-utils` package ([source code](https://github.com/vercel/ai/tree/main/packages/provider-utils)).

There are several reference implementations, e.g. an [OpenAI reference implementation](https://github.com/vercel/ai/tree/main/packages/openai)
and a [Mistral reference implementation](https://github.com/vercel/ai/tree/main/packages/mistral).

## Provider Facade

A custom provider should follow the pattern of using a provider facade with factory methods for the specific providers.
An instance of the custom provider class with default settings can be exported for convenience.

```ts filename="custom-provider-facade.ts"
import { generateId, loadApiKey } from 'ai/spec';
import { generateId, loadApiKey } from '@ai-sdk/provider-utils';
import { CustomChatLanguageModel } from './custom-chat-language-model';
import { CustomChatModelId, CustomChatSettings } from './custom-chat-settings';

@@ -76,14 +78,16 @@ export const customprovider = new CustomProvider();

## Language Model Implementation

Please refer to the Language Model Specification in the [AI SDK repository](https://github.com/vercel/ai/tree/main/packages/specification/src/language-model/v1).
Please refer to the Language Model Specification in the [AI SDK repository](https://github.com/vercel/ai/tree/main/packages/provider/src/language-model/v1).

We also provide utilities that make it easier to implement a custom provider. You can find them in the `@ai-sdk/provider-utils` package ([source code](https://github.com/vercel/ai/tree/main/packages/provider-utils)).

We provide an [OpenAI reference implementation](https://github.com/vercel/ai/tree/main/packages/core/openai)
and a [Mistral reference implementation](https://github.com/vercel/ai/tree/main/packages/core/mistral).
There are several reference implementations, e.g. an [OpenAI reference implementation](https://github.com/vercel/ai/tree/main/packages/openai)
and a [Mistral reference implementation](https://github.com/vercel/ai/tree/main/packages/mistral).

### Errors

The AI SDK provides [standardized errors](https://github.com/vercel/ai/tree/main/packages/specification/src/errors) that should be used by providers where possible.
The AI SDK provides [standardized errors](https://github.com/vercel/ai/tree/main/packages/provider/src/errors) that should be used by providers where possible.
This makes it easier for users to debug errors.

### Retries, timeouts, and abort signals
34 changes: 31 additions & 3 deletions docs/pages/docs/ai-core/google.mdx
@@ -2,19 +2,47 @@
title: Google Provider
---

import { Callout } from 'nextra-theme-docs';
import { Callout, Tabs, Tab } from 'nextra-theme-docs';

# Google Provider

The Google provider contains language model support for the [Google Generative AI](https://ai.google/discover/generativeai/) APIs.
It creates language model objects that can be used with the `generateText`, `streamText`, `generateObject`, and `streamObject` AI functions.

## Setup

The Google provider is available in the `@ai-sdk/google` module. You can install it with:

<Tabs items={['pnpm', 'npm', 'yarn']}>
<Tab>

```bash
pnpm add @ai-sdk/google
```

</Tab>
<Tab>

```bash
npm i @ai-sdk/google
```

</Tab>
<Tab>

```bash
yarn add @ai-sdk/google
```

</Tab>
</Tabs>

## Provider Instance

You can import `Google` from `ai/google` and initialize a provider instance with various settings:

```ts
import { Google } from 'ai/google';
import { Google } from '@ai-sdk/google';

const google = new Google({
baseUrl: '', // optional base URL for proxies etc.
@@ -25,7 +53,7 @@ const google = new Google({
The AI SDK also provides a shorthand `google` import with a Google provider instance that uses defaults:

```ts
import { google } from 'ai/google';
import { google } from '@ai-sdk/google';
```

## Generative AI Models
2 changes: 1 addition & 1 deletion docs/pages/docs/ai-core/index.mdx
@@ -29,7 +29,7 @@ Here is a simple example for `generateText`:

```ts
import { experimental_generateText } from 'ai';
import { openai } from 'ai/openai';
import { openai } from '@ai-sdk/openai';

const { text } = await experimental_generateText({
model: openai.chat('gpt-3.5-turbo'),
34 changes: 31 additions & 3 deletions docs/pages/docs/ai-core/mistral.mdx
@@ -2,19 +2,47 @@
title: Mistral Provider
---

import { Callout } from 'nextra-theme-docs';
import { Callout, Tabs, Tab } from 'nextra-theme-docs';

# Mistral Provider

The Mistral provider contains language model support for the Mistral chat API.
It creates language model objects that can be used with the `generateText`, `streamText`, `generateObject`, and `streamObject` AI functions.

## Setup

The Mistral provider is available in the `@ai-sdk/mistral` module. You can install it with:

<Tabs items={['pnpm', 'npm', 'yarn']}>
<Tab>

```bash
pnpm add @ai-sdk/mistral
```

</Tab>
<Tab>

```bash
npm i @ai-sdk/mistral
```

</Tab>
<Tab>

```bash
yarn add @ai-sdk/mistral
```

</Tab>
</Tabs>

## Provider Instance

You can import `Mistral` from `ai/mistral` and initialize a provider instance with various settings:

```ts
import { Mistral } from 'ai/mistral';
import { Mistral } from '@ai-sdk/mistral';

const mistral = new Mistral({
baseUrl: '', // optional base URL for proxies etc.
@@ -25,7 +53,7 @@ const mistral = new Mistral({
The AI SDK also provides a shorthand `mistral` import with a Mistral provider instance that uses defaults:

```ts
import { mistral } from 'ai/mistral';
import { mistral } from '@ai-sdk/mistral';
```

## Chat Models
34 changes: 31 additions & 3 deletions docs/pages/docs/ai-core/openai.mdx
@@ -2,19 +2,47 @@
title: OpenAI Provider
---

import { Callout } from 'nextra-theme-docs';
import { Callout, Tabs, Tab } from 'nextra-theme-docs';

# OpenAI Provider

The OpenAI provider contains language model support for the OpenAI chat and completion APIs.
It creates language model objects that can be used with the `generateText`, `streamText`, `generateObject`, and `streamObject` AI functions.

## Setup

The OpenAI provider is available in the `@ai-sdk/openai` module. You can install it with:

<Tabs items={['pnpm', 'npm', 'yarn']}>
<Tab>

```bash
pnpm add @ai-sdk/openai
```

</Tab>
<Tab>

```bash
npm i @ai-sdk/openai
```

</Tab>
<Tab>

```bash
yarn add @ai-sdk/openai
```

</Tab>
</Tabs>

## Provider Instance

You can import `OpenAI` from `ai/openai` and initialize a provider instance with various settings:

```ts
import { OpenAI } from 'ai/openai';
import { OpenAI } from '@ai-sdk/openai';

const openai = new OpenAI({
baseUrl: '', // optional base URL for proxies etc.
@@ -26,7 +54,7 @@ const openai = new OpenAI({
The AI SDK also provides a shorthand `openai` import with an OpenAI provider instance that uses defaults:

```ts
import { openai } from 'ai/openai';
import { openai } from '@ai-sdk/openai';
```

## Chat Models
8 changes: 4 additions & 4 deletions docs/pages/docs/ai-core/stream-text.mdx
@@ -90,7 +90,7 @@ by the [`useCompletion`](/docs/api-reference/use-completion) hook.

```ts
import { StreamingTextResponse, experimental_streamText } from 'ai';
import { openai } from 'ai/openai';
import { openai } from '@ai-sdk/openai';

export const runtime = 'edge';

@@ -114,7 +114,7 @@ by the [`useChat`](/docs/api-reference/use-chat) hook.

```ts
import { StreamingTextResponse, experimental_streamText } from 'ai';
import { openai } from 'ai/openai';
import { openai } from '@ai-sdk/openai';

export const runtime = 'edge';

@@ -135,7 +135,7 @@ export async function POST(req: Request) {

```ts
import { ExperimentalMessage, experimental_streamText } from 'ai';
import { openai } from 'ai/openai';
import { openai } from '@ai-sdk/openai';
import * as readline from 'node:readline/promises';

const terminal = readline.createInterface({
@@ -181,7 +181,7 @@ import {
ToolResultPart,
experimental_streamText,
} from 'ai';
import { openai } from 'ai/openai';
import { openai } from '@ai-sdk/openai';
import * as readline from 'node:readline/promises';

const terminal = readline.createInterface({
10 changes: 9 additions & 1 deletion examples/ai-core/package.json
@@ -3,13 +3,21 @@
"version": "0.0.0",
"private": true,
"dependencies": {
"@ai-sdk/anthropic": "latest",
"@ai-sdk/google": "latest",
"@ai-sdk/mistral": "latest",
"@ai-sdk/openai": "latest",
"ai": "latest",
"dotenv": "16.4.5",
"zod": "3.22.4",
"zod-to-json-schema": "3.22.4"
},
"scripts": {
"type-check": "tsc --noEmit"
},
"devDependencies": {
"@types/node": "20.11.20",
"tsx": "4.7.1"
"tsx": "4.7.1",
"typescript": "5.1.3"
}
}
2 changes: 1 addition & 1 deletion examples/ai-core/src/generate-object/anthropic.ts
@@ -1,5 +1,5 @@
import { experimental_generateObject } from 'ai';
import { anthropic } from 'ai/anthropic';
import { anthropic } from '@ai-sdk/anthropic';
import dotenv from 'dotenv';
import { z } from 'zod';

2 changes: 1 addition & 1 deletion examples/ai-core/src/generate-object/mistral-json.ts
@@ -1,5 +1,5 @@
import { experimental_generateObject } from 'ai';
import { Mistral } from 'ai/mistral';
import { Mistral } from '@ai-sdk/mistral';
import dotenv from 'dotenv';
import { z } from 'zod';

2 changes: 1 addition & 1 deletion examples/ai-core/src/generate-object/mistral-tool.ts
@@ -1,5 +1,5 @@
import { experimental_generateObject } from 'ai';
import { Mistral } from 'ai/mistral';
import { Mistral } from '@ai-sdk/mistral';
import dotenv from 'dotenv';
import { z } from 'zod';

2 changes: 1 addition & 1 deletion examples/ai-core/src/generate-object/mistral.ts
@@ -1,5 +1,5 @@
import { experimental_generateObject } from 'ai';
import { Mistral } from 'ai/mistral';
import { Mistral } from '@ai-sdk/mistral';
import dotenv from 'dotenv';
import { z } from 'zod';

2 changes: 1 addition & 1 deletion examples/ai-core/src/generate-object/openai-json.ts
@@ -1,5 +1,5 @@
import { experimental_generateObject } from 'ai';
import { OpenAI } from 'ai/openai';
import { OpenAI } from '@ai-sdk/openai';
import dotenv from 'dotenv';
import { z } from 'zod';

