[FEATURE] Add support for Mistral via the mistral Python SDK #374

Merged 32 commits on Nov 6, 2024
Changes from all commits

Commits
ef626e9
add mistral support
the-praxs Aug 30, 2024
ce710cd
linting
the-praxs Aug 30, 2024
85cc64f
fix typo
the-praxs Aug 30, 2024
4ef3726
add tests
the-praxs Aug 30, 2024
9986ce4
add examples notebook
the-praxs Aug 30, 2024
313e445
linting
the-praxs Aug 30, 2024
2f6e6fa
fix langchain typo in pyproject.toml (updated to 0.2.14)
the-praxs Aug 30, 2024
b74200e
fix mistralai import and `undo_override` function
the-praxs Aug 30, 2024
1006c4b
add mistral to readme
the-praxs Aug 30, 2024
2f88740
fix typo
the-praxs Aug 30, 2024
72c4422
Merge branch 'main' into mistral-ops
the-praxs Sep 7, 2024
0735e4b
Merge branch 'main' into mistral-ops
the-praxs Sep 10, 2024
4f11622
Merge branch 'main' into mistral-ops
the-praxs Sep 18, 2024
c42c803
modified self.llm_event to llm_event
the-praxs Sep 18, 2024
2b5bd7a
Merge branch 'main' into mistral-ops
the-praxs Sep 19, 2024
b9b4b18
refactoring
the-praxs Sep 19, 2024
5791058
black
the-praxs Sep 19, 2024
de51dd5
Merge branch 'main' into mistral-ops
the-praxs Sep 20, 2024
e90211f
Merge branch 'main' into mistral-ops
the-praxs Sep 22, 2024
23744e6
Merge branch 'main' into mistral-ops
the-praxs Sep 27, 2024
aa663c7
Merge branch 'main' into mistral-ops
the-praxs Oct 4, 2024
f19e2b3
Merge branch 'main' into mistral-ops
the-praxs Oct 15, 2024
96d3b1e
Merge branch 'main' into mistral-ops
the-praxs Oct 24, 2024
f0bf29d
Merge branch 'main' into mistral-ops
the-praxs Nov 1, 2024
af87cd2
rename examples directory
the-praxs Nov 1, 2024
f8a3fa9
Merge branch 'main' into mistral-ops
the-praxs Nov 4, 2024
0e3dbb9
Merge branch 'main' into mistral-ops
the-praxs Nov 4, 2024
42c818a
Merge branch 'main' into mistral-ops
areibman Nov 5, 2024
32b1562
fix merge
areibman Nov 5, 2024
91136a7
init merge
areibman Nov 5, 2024
a098eb0
updated model name so that tokencost will recognize this as a mistral…
areibman Nov 6, 2024
23fe5c4
black lint
areibman Nov 6, 2024
143 changes: 143 additions & 0 deletions README.md
@@ -370,6 +370,149 @@ async def main() -> None:
print(message.content)


await main()
```
</details>

### Mistral 〽️

Track agents built with the Mistral Python SDK (>=1.0.1).

- [AgentOps integration example](./examples/mistral/mistral_example.ipynb)
- [Official Mistral documentation](https://docs.mistral.ai)

<details>
<summary>Installation</summary>

```bash
pip install mistralai
```
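
Note: the examples below read the Mistral key from the `MISTRAL_API_KEY` environment variable, and the sync and streaming examples also start and end an AgentOps session, so the `agentops` package itself should be installed as well (`pip install agentops`).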

Sync

```python
import os

from mistralai import Mistral
import agentops

# Beginning of program's code (i.e. main.py, __init__.py)
agentops.init(<INSERT YOUR API KEY HERE>)

client = Mistral(
    # This is the default and can be omitted
    api_key=os.environ.get("MISTRAL_API_KEY"),
)

message = client.chat.complete(
    messages=[
        {
            "role": "user",
            "content": "Tell me a cool fact about AgentOps",
        }
    ],
    model="open-mistral-nemo",
)
print(message.choices[0].message.content)

agentops.end_session('Success')
```

Streaming

```python
import os

from mistralai import Mistral
import agentops

# Beginning of program's code (i.e. main.py, __init__.py)
agentops.init(<INSERT YOUR API KEY HERE>)

client = Mistral(
    # This is the default and can be omitted
    api_key=os.environ.get("MISTRAL_API_KEY"),
)

message = client.chat.stream(
    messages=[
        {
            "role": "user",
            "content": "Tell me something cool about streaming agents",
        }
    ],
    model="open-mistral-nemo",
)

response = ""
for event in message:
    if event.data.choices[0].finish_reason == "stop":
        print("\n")
        print(response)
        print("\n")
    else:
        # Each streamed chunk carries the new text in the choice delta.
        response += event.data.choices[0].delta.content

agentops.end_session('Success')
```

Async

```python
import asyncio
import os

from mistralai import Mistral

client = Mistral(
    # This is the default and can be omitted
    api_key=os.environ.get("MISTRAL_API_KEY"),
)


async def main() -> None:
    message = await client.chat.complete_async(
        messages=[
            {
                "role": "user",
                "content": "Tell me something interesting about async agents",
            }
        ],
        model="open-mistral-nemo",
    )
    print(message.choices[0].message.content)


asyncio.run(main())
```

Async Streaming

```python
import asyncio
import os

from mistralai import Mistral

client = Mistral(
    # This is the default and can be omitted
    api_key=os.environ.get("MISTRAL_API_KEY"),
)


async def main() -> None:
    message = await client.chat.stream_async(
        messages=[
            {
                "role": "user",
                "content": "Tell me something interesting about async streaming agents",
            }
        ],
        model="open-mistral-nemo",
    )

    response = ""
    async for event in message:
        if event.data.choices[0].finish_reason == "stop":
            print("\n")
            print(response)
            print("\n")
        else:
            # Each streamed chunk carries the new text in the choice delta.
            response += event.data.choices[0].delta.content


asyncio.run(main())
```
</details>
16 changes: 16 additions & 0 deletions agentops/llms/__init__.py
@@ -13,6 +13,7 @@
from .ollama import OllamaProvider
from .openai import OpenAiProvider
from .anthropic import AnthropicProvider
from .mistral import MistralProvider
from .ai21 import AI21Provider

original_func = {}
@@ -40,6 +41,9 @@ class LlmTracker:
"anthropic": {
"0.32.0": ("completions.create",),
},
"mistralai": {
"1.0.1": ("chat.complete", "chat.stream"),
},
"ai21": {
"2.0.0": (
"chat.completions.create",
@@ -142,6 +146,17 @@ def override_api(self):
f"Only Anthropic>=0.32.0 supported. v{module_version} found."
)

if api == "mistralai":
module_version = version(api)

if Version(module_version) >= parse("1.0.1"):
provider = MistralProvider(self.client)
provider.override()
else:
logger.warning(
f"Only MistralAI>=1.0.1 supported. v{module_version} found."
)

if api == "ai21":
module_version = version(api)

@@ -165,4 +180,5 @@ def stop_instrumenting(self):
        LiteLLMProvider(self.client).undo_override()
        OllamaProvider(self.client).undo_override()
        AnthropicProvider(self.client).undo_override()
        MistralProvider(self.client).undo_override()
        AI21Provider(self.client).undo_override()
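
As a reading aid (not part of this diff): the dispatcher above only version-checks `mistralai` and delegates to `MistralProvider`, whose implementation lives in `agentops/llms/mistral.py` (imported at the top of this file). Below is a minimal, illustrative sketch of the general override/undo pattern such a provider follows; names like `SketchMistralProvider` and `record_event` are hypothetical placeholders, and the real provider differs in detail.

```python
# Illustrative sketch only, not the code added in this PR.
# Pattern: save the original SDK method, wrap it to capture the call,
# and restore it when instrumentation is undone.
import time


class SketchMistralProvider:
    def __init__(self, client):
        self.client = client
        self._original_complete = None

    def override(self):
        # Keep a reference to the bound method so it can be restored later.
        original = self.client.chat.complete
        self._original_complete = original

        def patched_complete(*args, **kwargs):
            start = time.time()
            response = original(*args, **kwargs)
            # Hypothetical hook: hand the prompt/response pair to the tracker.
            self.record_event(kwargs.get("messages"), response, start)
            return response

        # Shadow the method on this client instance only.
        self.client.chat.complete = patched_complete

    def undo_override(self):
        if self._original_complete is not None:
            self.client.chat.complete = self._original_complete
            self._original_complete = None

    def record_event(self, messages, response, started_at):
        # Placeholder: the real provider builds an LLMEvent and records it
        # against the active AgentOps session.
        pass
```

The module-level `original_func` dict visible in the first hunk suggests the actual implementation caches originals globally, which is why `stop_instrumenting` can construct a fresh provider and still call `undo_override`; the per-instance cache here is only to keep the sketch short.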