[doc] Fix chat completions input output schema doc (#1778)
xyang16 authored Apr 16, 2024
1 parent fd7479f commit 472e117
Showing 1 changed file with 16 additions and 15 deletions.
31 changes: 16 additions & 15 deletions serving/docs/lmi/user_guides/chat_input_output_schema.md
@@ -96,7 +96,7 @@ Example response:

The Chat Completions API supports streaming, and the streaming response format differs from the non-streaming format.

- To use streaming, set `"stream": true`, or `option.output_formatter=jsonlines`).
+ To use streaming, set `"stream": true`, or `option.output_formatter=jsonlines`.

The response is returned token by token as application/jsonlines content-type:

@@ -110,7 +110,7 @@ The response is returned token by token as application/jsonlines content-type:
Example response:

```
- {"id": "chatcmpl-0", "object": "chat.completion.chunk", "created": 1712792433, "choices": [{"index": 0, "delta": {"content": " Oh"}, "logprobs": [{"content": [{"token": " Oh", "logprob": -4.499478340148926, "bytes": [32, 79, 104], "top_logprobs": [{"token": -4.499478340148926, "logprob": -4.499478340148926, "bytes": [32, 79, 104]}]}]}], "finish_reason": null}]}
+ {"id": "chatcmpl-0", "object": "chat.completion.chunk", "created": 1712792433, "choices": [{"index": 0, "delta": {"content": " Oh", "role": "assistant"}, "logprobs": [{"content": [{"token": " Oh", "logprob": -4.499478340148926, "bytes": [32, 79, 104], "top_logprobs": [{"token": -4.499478340148926, "logprob": -4.499478340148926, "bytes": [32, 79, 104]}]}]}], "finish_reason": null}]}
...
{"id": "chatcmpl-0", "object": "chat.completion.chunk", "created": 1712792436, "choices": [{"index": 0, "delta": {"content": " assist"}, "logprobs": [{"content": [{"token": " assist", "logprob": -1.019672155380249, "bytes": [32, 97, 115, 115, 105, 115, 116], "top_logprobs": [{"token": -1.019672155380249, "logprob": -1.019672155380249, "bytes": [32, 97, 115, 115, 105, 115, 116]}]}]}], "finish_reason": "length"}]}
```
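A client assembles the full reply by concatenating the `delta.content` of each chunk until a chunk arrives with a non-null `finish_reason`. As a sketch (not part of the original doc), using the two chunk lines from the example above with the `logprobs` field omitted for brevity, and assuming each line of the jsonlines body has already been read into a string:

```python
import json

# Chunk lines as they would arrive in the application/jsonlines body
# (abridged from the example response above; logprobs omitted).
chunks = [
    '{"id": "chatcmpl-0", "object": "chat.completion.chunk", "created": 1712792433,'
    ' "choices": [{"index": 0, "delta": {"content": " Oh", "role": "assistant"},'
    ' "finish_reason": null}]}',
    '{"id": "chatcmpl-0", "object": "chat.completion.chunk", "created": 1712792436,'
    ' "choices": [{"index": 0, "delta": {"content": " assist"},'
    ' "finish_reason": "length"}]}',
]

reply = ""
finish_reason = None
for line in chunks:
    chunk = json.loads(line)
    choice = chunk["choices"][0]
    # Each delta carries a fragment of the message content.
    reply += choice["delta"].get("content", "")
    if choice["finish_reason"] is not None:
        finish_reason = choice["finish_reason"]

print(reply)          # " Oh assist"
print(finish_reason)  # "length"
```

A real client would iterate over the HTTP response body line by line instead of a fixed list, but the per-chunk handling is the same.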
@@ -143,12 +143,12 @@ Example:
The choice object represents a chat completion choice.
It contains the following fields:

- | Field Name      | Type                 | Description                                       | Example                                   |
- |-----------------|----------------------|---------------------------------------------------|-------------------------------------------|
- | `index`         | int                  | The index of the choice                           | 0                                         |
- | `message`       | [Message](#message)  | A chat completion message generated by the model. | See the [Message](#message) documentation |
- | `logprobs`      | [Logprobs](#logprob) | The log probability of the token                  | See the [Logprobs](#logprob) documentation |
- | `finish_reason` | string enum          | The reason the model stopped generating tokens    | "length", "eos_token", "stop_sequence"    |
+ | Field Name      | Type                  | Description                                       | Example                                   |
+ |-----------------|-----------------------|---------------------------------------------------|-------------------------------------------|
+ | `index`         | int                   | The index of the choice                           | 0                                         |
+ | `message`       | [Message](#message)   | A chat completion message generated by the model. | See the [Message](#message) documentation |
+ | `logprobs`      | [Logprobs](#logprobs) | The log probability of the token                  | See the [Logprobs](#logprob) documentation |
+ | `finish_reason` | string enum           | The reason the model stopped generating tokens    | "length", "eos_token", "stop_sequence"    |

Example:

@@ -169,17 +169,17 @@ Example:
The choice object represents a chat completion choice.
It contains the following fields:

- | Field Name      | Type                 | Description                                                    | Example                                    |
- |-----------------|----------------------|----------------------------------------------------------------|--------------------------------------------|
- | `index`         | int                  | The index of the choice                                        | 0                                          |
- | `delta`         | [Message](#message)  | A chat completion delta generated by streamed model responses. | See the [Message](#message) documentation  |
- | `logprobs`      | [Logprobs](#logprob) | The log probability of the token                               | See the [Logprobs](#logprob) documentation |
- | `finish_reason` | string enum          | The reason the model stopped generating tokens                 | "length", "eos_token", "stop_sequence"     |
+ | Field Name      | Type                  | Description                                                    | Example                                     |
+ |-----------------|-----------------------|----------------------------------------------------------------|---------------------------------------------|
+ | `index`         | int                   | The index of the choice                                        | 0                                           |
+ | `delta`         | [Message](#message)   | A chat completion delta generated by streamed model responses. | See the [Message](#message) documentation   |
+ | `logprobs`      | [Logprobs](#logprobs) | The log probability of the token                               | See the [Logprobs](#logprobs) documentation |
+ | `finish_reason` | string enum           | The reason the model stopped generating tokens                 | "length", "eos_token", "stop_sequence"      |

Example:

```
- {"index": 0, "delta": {"content": " Oh"}, "logprobs": [{"content": [{"token": " Oh", "logprob": -4.499478340148926, "bytes": [32, 79, 104], "top_logprobs": [{"token": -4.499478340148926, "logprob": -4.499478340148926, "bytes": [32, 79, 104]}]}]}]}
+ {"index": 0, "delta": {"content": " Oh", "role": "assistant"}, "logprobs": [{"content": [{"token": " Oh", "logprob": -4.499478340148926, "bytes": [32, 79, 104], "top_logprobs": [{"token": -4.499478340148926, "logprob": -4.499478340148926, "bytes": [32, 79, 104]}]}]}]}
```
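Per the table above, `logprobs` is a list of objects whose `content` field holds per-token entries. A sketch (not part of the original doc) of pulling the token/log-probability pairs out of a streaming choice object, using a simplified version of the example above with valid JSON and `top_logprobs` left empty:

```python
import json

# A streaming choice object, abridged from the delta example above
# (shortened logprob value, empty top_logprobs, valid JSON).
choice = json.loads(
    '{"index": 0, "delta": {"content": " Oh", "role": "assistant"}, '
    '"logprobs": [{"content": [{"token": " Oh", "logprob": -4.4995, '
    '"bytes": [32, 79, 104], "top_logprobs": []}]}]}'
)

# Flatten the nested logprobs structure into (token, logprob) pairs.
tokens = [
    (tok["token"], tok["logprob"])
    for entry in choice["logprobs"]
    for tok in entry["content"]
]

print(tokens)  # [(' Oh', -4.4995)]
```

The same traversal works for non-streaming choices, since both carry `logprobs` in the same shape.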

### Logprobs
@@ -232,6 +232,7 @@ Example:
### TopLogprob

Top log probability information for the choice.
+ It contains the following fields:

| Field Name | Type | Description | Example |
|----------------|------------------------------------|------------------------------------------------------------|-------------------------------------------------|
