feat: Support FastChat OpenAI-compatible API (#444)
davidmigloz authored May 29, 2024
1 parent 1549bbb commit ddaf1f6
Showing 8 changed files with 601 additions and 570 deletions.
2 changes: 1 addition & 1 deletion packages/openai_dart/README.md
@@ -16,7 +16,7 @@ Unofficial Dart client for [OpenAI](https://platform.openai.com/docs/api-referen
- Custom base URL, headers and query params support (e.g. HTTP proxies)
- Custom HTTP client support (e.g. SOCKS5 proxies or advanced use cases)
- Partial Azure OpenAI API support
- It can be used to consume OpenAI-compatible APIs like [TogetherAI](https://www.together.ai/), [Anyscale](https://www.anyscale.com/), [OpenRouter](https://openrouter.ai), [One API](https://github.com/songquanpeng/one-api), [Groq](https://groq.com/), [Llamafile](https://llamafile.ai/), [GPT4All](https://gpt4all.io/), etc.
- It can be used to consume OpenAI-compatible APIs like [TogetherAI](https://www.together.ai/), [Anyscale](https://www.anyscale.com/), [OpenRouter](https://openrouter.ai), [One API](https://github.com/songquanpeng/one-api), [Groq](https://groq.com/), [Llamafile](https://llamafile.ai/), [GPT4All](https://gpt4all.io/), [FastChat](https://github.com/lm-sys/FastChat), etc.

**Supported endpoints:**

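For context (not part of this diff): consuming FastChat, or any of the other OpenAI-compatible APIs listed above, only requires pointing the client at the server's base URL. The sketch below assumes a locally running FastChat OpenAI-compatible server; the base URL, API key, and model name are placeholders that depend on your deployment.

```dart
import 'package:openai_dart/openai_dart.dart';

Future<void> main() async {
  // Hypothetical FastChat deployment: URL, key, and model are placeholders.
  // FastChat's OpenAI-compatible server typically ignores the API key.
  final client = OpenAIClient(
    baseUrl: 'http://localhost:8000/v1',
    apiKey: 'EMPTY',
  );

  final res = await client.createChatCompletion(
    request: CreateChatCompletionRequest(
      model: ChatCompletionModel.modelId('vicuna-7b-v1.5'),
      messages: [
        ChatCompletionMessage.system(content: 'You are a helpful assistant.'),
        ChatCompletionMessage.user(
          content: ChatCompletionUserMessageContent.string('Hello!'),
        ),
      ],
    ),
  );

  print(res.choices.first.message.content);
  client.endSession();
}
```
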
@@ -24,7 +24,7 @@ class CreateChatCompletionStreamResponse
required List<ChatCompletionStreamResponseChoice> choices,

/// The Unix timestamp (in seconds) of when the chat completion was created. Each chunk has the same timestamp.
required int created,
@JsonKey(includeIfNull: false) int? created,

/// The model to generate the completion.
@JsonKey(includeIfNull: false) String? model,
@@ -36,7 +36,7 @@ class CreateChatCompletionStreamResponse
String? systemFingerprint,

/// The object type, which is always `chat.completion.chunk`.
required String object,
@JsonKey(includeIfNull: false) String? object,

/// Usage statistics for the completion request.
@JsonKey(includeIfNull: false) CompletionUsage? usage,
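
For context (not part of this diff): with `created` and `object` relaxed to nullable, code that reads stream chunks should no longer assume those fields are present. A minimal sketch of defensive handling, assuming a hypothetical FastChat endpoint and model name:

```dart
import 'package:openai_dart/openai_dart.dart';

Future<void> main() async {
  // Placeholder endpoint and model; adjust to your own deployment.
  final client = OpenAIClient(
    baseUrl: 'http://localhost:8000/v1',
    apiKey: 'EMPTY',
  );

  final stream = client.createChatCompletionStream(
    request: CreateChatCompletionRequest(
      model: ChatCompletionModel.modelId('vicuna-7b-v1.5'),
      messages: [
        ChatCompletionMessage.user(
          content: ChatCompletionUserMessageContent.string('Tell me a joke.'),
        ),
      ],
    ),
  );

  await for (final chunk in stream) {
    // Some servers (e.g. FastChat with certain models) omit `created` and
    // `object`, so fall back to sensible defaults instead of assuming them.
    final created = chunk.created ?? 0;
    final object = chunk.object ?? 'chat.completion.chunk';
    if (chunk.choices.isEmpty) continue; // e.g. a usage-only final chunk
    print('[$object @ $created] ${chunk.choices.first.delta.content ?? ''}');
  }
  client.endSession();
}
```
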
@@ -21,7 +21,7 @@ class FunctionObject with _$FunctionObject {
/// A description of what the function does, used by the model to choose when and how to call the function.
@JsonKey(includeIfNull: false) String? description,

/// The parameters the functions accepts, described as a JSON Schema object. See the [guide](https://platform.openai.com/docs/guides/text-generation/function-calling) for examples, and the [JSON Schema reference](https://json-schema.org/understanding-json-schema/) for documentation about the format.
/// The parameters the functions accepts, described as a JSON Schema object. See the [guide](https://platform.openai.com/docs/guides/function-calling) for examples, and the [JSON Schema reference](https://json-schema.org/understanding-json-schema/) for documentation about the format.
///
/// Omitting `parameters` defines a function with an empty parameter list.
@JsonKey(includeIfNull: false) FunctionParameters? parameters,
@@ -8,7 +8,7 @@ part of open_a_i_schema;
// TYPE: FunctionParameters
// ==========================================

/// The parameters the functions accepts, described as a JSON Schema object. See the [guide](https://platform.openai.com/docs/guides/text-generation/function-calling) for examples, and the [JSON Schema reference](https://json-schema.org/understanding-json-schema/) for documentation about the format.
/// The parameters the functions accepts, described as a JSON Schema object. See the [guide](https://platform.openai.com/docs/guides/function-calling) for examples, and the [JSON Schema reference](https://json-schema.org/understanding-json-schema/) for documentation about the format.
///
/// Omitting `parameters` defines a function with an empty parameter list.
typedef FunctionParameters = Map<String, dynamic>;
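
For context (not part of this diff): `FunctionParameters` is a plain `Map<String, dynamic>` holding a JSON Schema document, so a tool can be declared inline. A small sketch with an invented function name and schema:

```dart
import 'package:openai_dart/openai_dart.dart';

// Illustrative only: the function name and JSON Schema below are made up.
final weatherTool = ChatCompletionTool(
  type: ChatCompletionToolType.function,
  function: FunctionObject(
    name: 'get_current_weather',
    description: 'Get the current weather in a given location.',
    // `parameters` is JSON Schema expressed as a Map<String, dynamic>.
    parameters: {
      'type': 'object',
      'properties': {
        'location': {
          'type': 'string',
          'description': 'The city and state, e.g. San Francisco, CA',
        },
        'unit': {
          'type': 'string',
          'enum': ['celsius', 'fahrenheit'],
        },
      },
      'required': ['location'],
    },
  ),
);
```

Passing such a tool in the `tools` list of a `CreateChatCompletionRequest` exposes it to the model; omitting `parameters` declares a function with an empty parameter list, as the doc comment above notes.
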
66 changes: 36 additions & 30 deletions packages/openai_dart/lib/src/generated/schema/schema.freezed.dart
@@ -6695,7 +6695,7 @@ mixin _$FunctionObject {
@JsonKey(includeIfNull: false)
String? get description => throw _privateConstructorUsedError;

/// The parameters the functions accepts, described as a JSON Schema object. See the [guide](https://platform.openai.com/docs/guides/text-generation/function-calling) for examples, and the [JSON Schema reference](https://json-schema.org/understanding-json-schema/) for documentation about the format.
/// The parameters the functions accepts, described as a JSON Schema object. See the [guide](https://platform.openai.com/docs/guides/function-calling) for examples, and the [JSON Schema reference](https://json-schema.org/understanding-json-schema/) for documentation about the format.
///
/// Omitting `parameters` defines a function with an empty parameter list.
@JsonKey(includeIfNull: false)
@@ -6821,12 +6821,12 @@ class _$FunctionObjectImpl extends _FunctionObject {
@JsonKey(includeIfNull: false)
final String? description;

/// The parameters the functions accepts, described as a JSON Schema object. See the [guide](https://platform.openai.com/docs/guides/text-generation/function-calling) for examples, and the [JSON Schema reference](https://json-schema.org/understanding-json-schema/) for documentation about the format.
/// The parameters the functions accepts, described as a JSON Schema object. See the [guide](https://platform.openai.com/docs/guides/function-calling) for examples, and the [JSON Schema reference](https://json-schema.org/understanding-json-schema/) for documentation about the format.
///
/// Omitting `parameters` defines a function with an empty parameter list.
final Map<String, dynamic>? _parameters;

/// The parameters the functions accepts, described as a JSON Schema object. See the [guide](https://platform.openai.com/docs/guides/text-generation/function-calling) for examples, and the [JSON Schema reference](https://json-schema.org/understanding-json-schema/) for documentation about the format.
/// The parameters the functions accepts, described as a JSON Schema object. See the [guide](https://platform.openai.com/docs/guides/function-calling) for examples, and the [JSON Schema reference](https://json-schema.org/understanding-json-schema/) for documentation about the format.
///
/// Omitting `parameters` defines a function with an empty parameter list.
@override
@@ -6898,7 +6898,7 @@ abstract class _FunctionObject extends FunctionObject {
String? get description;
@override

/// The parameters the functions accepts, described as a JSON Schema object. See the [guide](https://platform.openai.com/docs/guides/text-generation/function-calling) for examples, and the [JSON Schema reference](https://json-schema.org/understanding-json-schema/) for documentation about the format.
/// The parameters the functions accepts, described as a JSON Schema object. See the [guide](https://platform.openai.com/docs/guides/function-calling) for examples, and the [JSON Schema reference](https://json-schema.org/understanding-json-schema/) for documentation about the format.
///
/// Omitting `parameters` defines a function with an empty parameter list.
@JsonKey(includeIfNull: false)
@@ -9004,7 +9004,8 @@ mixin _$CreateChatCompletionStreamResponse {
throw _privateConstructorUsedError;

/// The Unix timestamp (in seconds) of when the chat completion was created. Each chunk has the same timestamp.
int get created => throw _privateConstructorUsedError;
@JsonKey(includeIfNull: false)
int? get created => throw _privateConstructorUsedError;

/// The model to generate the completion.
@JsonKey(includeIfNull: false)
@@ -9017,7 +9018,8 @@ mixin _$CreateChatCompletionStreamResponse {
String? get systemFingerprint => throw _privateConstructorUsedError;

/// The object type, which is always `chat.completion.chunk`.
String get object => throw _privateConstructorUsedError;
@JsonKey(includeIfNull: false)
String? get object => throw _privateConstructorUsedError;

/// Usage statistics for the completion request.
@JsonKey(includeIfNull: false)
@@ -9041,11 +9043,11 @@ abstract class $CreateChatCompletionStreamResponseCopyWith<$Res> {
$Res call(
{@JsonKey(includeIfNull: false) String? id,
List<ChatCompletionStreamResponseChoice> choices,
int created,
@JsonKey(includeIfNull: false) int? created,
@JsonKey(includeIfNull: false) String? model,
@JsonKey(name: 'system_fingerprint', includeIfNull: false)
String? systemFingerprint,
String object,
@JsonKey(includeIfNull: false) String? object,
@JsonKey(includeIfNull: false) CompletionUsage? usage});

$CompletionUsageCopyWith<$Res>? get usage;
@@ -9067,10 +9069,10 @@ class _$CreateChatCompletionStreamResponseCopyWithImpl<$Res,
$Res call({
Object? id = freezed,
Object? choices = null,
Object? created = null,
Object? created = freezed,
Object? model = freezed,
Object? systemFingerprint = freezed,
Object? object = null,
Object? object = freezed,
Object? usage = freezed,
}) {
return _then(_value.copyWith(
@@ -9082,10 +9084,10 @@ class _$CreateChatCompletionStreamResponseCopyWithImpl<$Res,
? _value.choices
: choices // ignore: cast_nullable_to_non_nullable
as List<ChatCompletionStreamResponseChoice>,
created: null == created
created: freezed == created
? _value.created
: created // ignore: cast_nullable_to_non_nullable
as int,
as int?,
model: freezed == model
? _value.model
: model // ignore: cast_nullable_to_non_nullable
@@ -9094,10 +9096,10 @@ class _$CreateChatCompletionStreamResponseCopyWithImpl<$Res,
? _value.systemFingerprint
: systemFingerprint // ignore: cast_nullable_to_non_nullable
as String?,
object: null == object
object: freezed == object
? _value.object
: object // ignore: cast_nullable_to_non_nullable
as String,
as String?,
usage: freezed == usage
? _value.usage
: usage // ignore: cast_nullable_to_non_nullable
@@ -9130,11 +9132,11 @@ abstract class _$$CreateChatCompletionStreamResponseImplCopyWith<$Res>
$Res call(
{@JsonKey(includeIfNull: false) String? id,
List<ChatCompletionStreamResponseChoice> choices,
int created,
@JsonKey(includeIfNull: false) int? created,
@JsonKey(includeIfNull: false) String? model,
@JsonKey(name: 'system_fingerprint', includeIfNull: false)
String? systemFingerprint,
String object,
@JsonKey(includeIfNull: false) String? object,
@JsonKey(includeIfNull: false) CompletionUsage? usage});

@override
@@ -9156,10 +9158,10 @@ class __$$CreateChatCompletionStreamResponseImplCopyWithImpl<$Res>
$Res call({
Object? id = freezed,
Object? choices = null,
Object? created = null,
Object? created = freezed,
Object? model = freezed,
Object? systemFingerprint = freezed,
Object? object = null,
Object? object = freezed,
Object? usage = freezed,
}) {
return _then(_$CreateChatCompletionStreamResponseImpl(
@@ -9171,10 +9173,10 @@ class __$$CreateChatCompletionStreamResponseImplCopyWithImpl<$Res>
? _value._choices
: choices // ignore: cast_nullable_to_non_nullable
as List<ChatCompletionStreamResponseChoice>,
created: null == created
created: freezed == created
? _value.created
: created // ignore: cast_nullable_to_non_nullable
as int,
as int?,
model: freezed == model
? _value.model
: model // ignore: cast_nullable_to_non_nullable
@@ -9183,10 +9185,10 @@ class __$$CreateChatCompletionStreamResponseImplCopyWithImpl<$Res>
? _value.systemFingerprint
: systemFingerprint // ignore: cast_nullable_to_non_nullable
as String?,
object: null == object
object: freezed == object
? _value.object
: object // ignore: cast_nullable_to_non_nullable
as String,
as String?,
usage: freezed == usage
? _value.usage
: usage // ignore: cast_nullable_to_non_nullable
@@ -9202,11 +9204,11 @@ class _$CreateChatCompletionStreamResponseImpl
const _$CreateChatCompletionStreamResponseImpl(
{@JsonKey(includeIfNull: false) this.id,
required final List<ChatCompletionStreamResponseChoice> choices,
required this.created,
@JsonKey(includeIfNull: false) this.created,
@JsonKey(includeIfNull: false) this.model,
@JsonKey(name: 'system_fingerprint', includeIfNull: false)
this.systemFingerprint,
required this.object,
@JsonKey(includeIfNull: false) this.object,
@JsonKey(includeIfNull: false) this.usage})
: _choices = choices,
super._();
@@ -9235,7 +9237,8 @@ class _$CreateChatCompletionStreamResponseImpl

/// The Unix timestamp (in seconds) of when the chat completion was created. Each chunk has the same timestamp.
@override
final int created;
@JsonKey(includeIfNull: false)
final int? created;

/// The model to generate the completion.
@override
@@ -9251,7 +9254,8 @@ class _$CreateChatCompletionStreamResponseImpl

/// The object type, which is always `chat.completion.chunk`.
@override
final String object;
@JsonKey(includeIfNull: false)
final String? object;

/// Usage statistics for the completion request.
@override
@@ -9311,11 +9315,11 @@ abstract class _CreateChatCompletionStreamResponse
const factory _CreateChatCompletionStreamResponse(
{@JsonKey(includeIfNull: false) final String? id,
required final List<ChatCompletionStreamResponseChoice> choices,
required final int created,
@JsonKey(includeIfNull: false) final int? created,
@JsonKey(includeIfNull: false) final String? model,
@JsonKey(name: 'system_fingerprint', includeIfNull: false)
final String? systemFingerprint,
required final String object,
@JsonKey(includeIfNull: false) final String? object,
@JsonKey(includeIfNull: false) final CompletionUsage? usage}) =
_$CreateChatCompletionStreamResponseImpl;
const _CreateChatCompletionStreamResponse._() : super._();
@@ -9337,7 +9341,8 @@ abstract class _CreateChatCompletionStreamResponse
@override

/// The Unix timestamp (in seconds) of when the chat completion was created. Each chunk has the same timestamp.
int get created;
@JsonKey(includeIfNull: false)
int? get created;
@override

/// The model to generate the completion.
@@ -9353,7 +9358,8 @@ abstract class _CreateChatCompletionStreamResponse
@override

/// The object type, which is always `chat.completion.chunk`.
String get object;
@JsonKey(includeIfNull: false)
String? get object;
@override

/// Usage statistics for the completion request.
8 changes: 4 additions & 4 deletions packages/openai_dart/lib/src/generated/schema/schema.g.dart

Some generated files are not rendered by default.

13 changes: 9 additions & 4 deletions packages/openai_dart/oas/openapi_curated.yaml
@@ -2212,7 +2212,7 @@ components:
- name
FunctionParameters:
type: object
description: "The parameters the functions accepts, described as a JSON Schema object. See the [guide](https://platform.openai.com/docs/guides/text-generation/function-calling) for examples, and the [JSON Schema reference](https://json-schema.org/understanding-json-schema/) for documentation about the format. \n\nOmitting `parameters` defines a function with an empty parameter list."
description: "The parameters the functions accepts, described as a JSON Schema object. See the [guide](https://platform.openai.com/docs/guides/function-calling) for examples, and the [JSON Schema reference](https://json-schema.org/understanding-json-schema/) for documentation about the format. \n\nOmitting `parameters` defines a function with an empty parameter list."
additionalProperties: true
ChatCompletionTool:
type: object
@@ -2426,10 +2426,10 @@ components:
$ref: "#/components/schemas/CompletionUsage"
required:
- choices
- created
# - created # Made nullable to support FastChat API which doesn't return this field with some models
# - id # Made nullable to support OpenRouter API which doesn't return this field with some models
# - model # Made nullable to support TogetherAI API which doesn't return this field with some models
- object
# - object # Made nullable to support FastChat API which doesn't return this field with some models
ChatCompletionStreamResponseChoice:
type: object
description: A choice the model generated for the input prompt.
@@ -6128,7 +6128,12 @@ components:
nullable: true
BatchEndpoint:
type: string
enum: [ "/v1/chat/completions", "/v1/embeddings", "/v1/completions" ]
enum:
[
"/v1/chat/completions",
"/v1/embeddings",
"/v1/completions",
]
description: The endpoint to be used for all requests in the batch. Currently `/v1/chat/completions`, `/v1/embeddings`, and `/v1/completions` are supported. Note that `/v1/embeddings` batches are also restricted to a maximum of 50,000 embedding inputs across all requests in the batch.
BatchCompletionWindow:
type: string