
Commit

chore: update readme
chore: bump to 0.0.9
moesmufti committed Dec 17, 2024
1 parent 83799be commit edc4fdf
Showing 2 changed files with 23 additions and 19 deletions.
README.md: 40 changes (22 additions & 18 deletions)
@@ -65,24 +65,7 @@ print(response_message)

### Optional Parameters

The `invoke` method supports the following optional parameters:

- `frequency_penalty` (float, default=0): Penalty for token frequency
- `logit_bias` (Dict[str, float], default=None): Token bias dictionary
- `logprobs` (bool, default=False): Whether to return log probabilities
- `max_tokens` (int, default=0): Maximum number of tokens to generate
- `n` (int, default=0): Number of completions to generate
- `presence_penalty` (float, default=0): Penalty for token presence
- `response_format` (Dict[str, str], default=None): Format of the response
- `seed` (int, default=0): Random seed for reproducibility
- `stop` (Union[str, List[str]], default=None): Stop sequences
- `stream` (bool, default=False): Whether to stream the response
- `stream_options` (Dict[str, Any], default=None): Options for streaming
- `temperature` (float, default=0): Sampling temperature
- `tool_choice` (Union[str, Dict[str, Any]], default=None): Function calling mode ('auto', 'required', 'none', or specific function)
- `top_logprobs` (int, default=0): Number of top log probabilities to return
- `top_p` (float, default=0): Top-p sampling parameter
- `user` (str, default=""): End-user identifier
The `invoke` method supports various optional parameters to customize the model's behavior. Some commonly used parameters include `max_tokens` to limit response length, `tool_choice` to control function calling behavior ('auto', 'required', 'none'), and others. For a complete list of optional parameters and their descriptions, see the [API Reference](#api-reference) section.
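
For example, a minimal call that uses a couple of these parameters might look like the sketch below. The `XAI` client name, its constructor arguments, the `messages` keyword, and the response attribute access are assumptions for illustration, not confirmed by this excerpt.

```python
# Illustrative sketch only: the client class name, constructor arguments, and
# response attribute access are assumptions; only the optional parameter names
# (max_tokens, temperature, tool_choice) come from the documentation above.
from xai_grok_sdk import XAI  # assumed import path

llm = XAI(api_key="YOUR_XAI_API_KEY", model="grok-2-1212")  # assumed constructor

response = llm.invoke(
    messages=[{"role": "user", "content": "Summarize the plot of Dune in two sentences."}],
    max_tokens=120,      # cap the length of the generated reply
    temperature=0.7,     # allow some randomness in the output
    tool_choice="none",  # disable function calling for this request
)
print(response.choices[0].message.content)  # assumed OpenAI-style response shape
```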

## Function Calling

@@ -209,6 +192,27 @@ def invoke(
) -> ChatCompletionResponse
```

#### Invoke Method Optional Parameters

The `invoke` method supports the following optional parameters (a usage sketch combining several of them follows the list):

- `frequency_penalty` (float, default=0): Penalty for token frequency. Higher values make the model less likely to repeat the same information.
- `logit_bias` (Dict[str, float], default=None): Token bias dictionary to influence token selection.
- `logprobs` (bool, default=False): Whether to return log probabilities of the output tokens.
- `max_tokens` (int, default=0): Maximum number of tokens to generate in the response.
- `n` (int, default=0): Number of chat completion choices to generate.
- `presence_penalty` (float, default=0): Penalty for token presence. Higher values make the model more likely to talk about new topics.
- `response_format` (Dict[str, str], default=None): Format specification for the response output.
- `seed` (int, default=0): Random seed for deterministic outputs.
- `stop` (Union[str, List[str]], default=None): Sequences where the model should stop generating.
- `stream` (bool, default=False): Whether to stream the response tokens as they're generated.
- `stream_options` (Dict[str, Any], default=None): Additional options for streaming responses.
- `temperature` (float, default=0): Sampling temperature. Higher values make output more random.
- `tool_choice` (Union[str, Dict[str, Any]], default=None): Function calling mode ('auto', 'required', 'none', or specific function).
- `top_logprobs` (int, default=0): Number of most likely tokens to return probabilities for.
- `top_p` (float, default=0): Top-p sampling parameter. Lower values make output more focused.
- `user` (str, default=""): End-user identifier for monitoring and rate limiting.
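
As noted above, the sketch below combines several of these parameters in a single call. The client construction and the response attribute access are assumptions; only the keyword arguments and their meanings come from the list above.

```python
# Illustrative sketch: the client name, constructor arguments, and response
# shape are assumptions; the keyword arguments mirror the optional parameters
# documented above.
from xai_grok_sdk import XAI  # assumed import path

llm = XAI(api_key="YOUR_XAI_API_KEY", model="grok-2-1212")  # assumed constructor

response = llm.invoke(
    messages=[{"role": "user", "content": "List three uses for a paperclip."}],
    max_tokens=200,            # cap the generated length
    temperature=0.2,           # keep the output fairly deterministic
    seed=42,                   # request reproducible sampling
    stop=["\n\n"],             # stop generating at the first blank line
    presence_penalty=0.5,      # nudge the model toward new topics
    user="docs-example-user",  # end-user identifier for monitoring
)
print(response.choices[0].message.content)  # assumed OpenAI-style response shape
```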

### Response Models

The SDK uses several dataclasses to represent the API response structure:
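
The concrete dataclass definitions are not shown in this excerpt. Purely as a hypothetical illustration of the kind of structure such response models often have, a sketch follows; the names and fields are assumptions, except `ChatCompletionResponse`, which appears in the `invoke` signature above.

```python
# Hypothetical sketch of OpenAI-style response dataclasses; the real SDK's
# class names and fields may differ. Only ChatCompletionResponse is confirmed
# by the invoke signature shown earlier.
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class Message:
    role: str
    content: Optional[str] = None


@dataclass
class Choice:
    index: int
    message: Message
    finish_reason: Optional[str] = None


@dataclass
class ChatCompletionResponse:
    id: str
    model: str
    choices: List[Choice] = field(default_factory=list)
```
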
pyproject.toml: 2 changes (1 addition & 1 deletion)
@@ -4,7 +4,7 @@ build-backend = "setuptools.build_meta"

[project]
name = "xai-grok-sdk"
version = "0.0.8"
version = "0.0.9"
description = "Lightweight xAI SDK with minimal dependencies"
dependencies = [
"requests>=2.32.3",
