docs(llamaindex-sdk): add README for LlamaIndex SDK #53

Merged
merged 8 commits into from
Nov 11, 2024
34 changes: 34 additions & 0 deletions DEVELOPER.md
@@ -105,6 +105,40 @@
docker run -d toolbox:dev
```

## Developing Toolbox SDKs

To set up a development environment:

1. Clone the repository:

```bash
git clone https://github.com/googleapis/genai-toolbox.git
```

1. Navigate to the SDK directory:

```bash
cd genai-toolbox/sdks/langchain
```

or

```bash
cd genai-toolbox/sdks/llamaindex
```

1. Install the SDK and test dependencies:

```bash
pip install -e .[test]
```

1. Run tests and/or contribute to the SDK's development.

```bash
pytest
```

## CI/CD Details

Cloud Build is used to run tests against Google Cloud resources in a test project.
23 changes: 22 additions & 1 deletion README.md
@@ -99,9 +99,30 @@ Once you've installed the Toolbox LangChain SDK, you can load tools:

```python
from toolbox_langchain_sdk import ToolboxClient
from aiohttp import ClientSession

# update the url to point to your server
client = ToolboxClient("http://127.0.0.1/")
session = ClientSession()
client = ToolboxClient("http://127.0.0.1:5000", session)

# these tools can be passed to your application!
tools = await client.load_toolset()
```

</details>

<details open>

<summary>LlamaIndex</summary>
Once you've installed the Toolbox LlamaIndex SDK, you can load tools:

```python
from toolbox_llamaindex_sdk import ToolboxClient
from aiohttp import ClientSession

# update the url to point to your server
session = ClientSession()
client = ToolboxClient("http://127.0.0.1:5000", session)

# these tools can be passed to your application!
tools = await client.load_toolset()
```
76 changes: 76 additions & 0 deletions sdks/llamaindex/README.md
@@ -0,0 +1,76 @@
# GenAI Toolbox SDK

This SDK allows you to seamlessly integrate the functionalities of [Toolbox](https://github.com/googleapis/genai-toolbox) into your LLM applications, enabling advanced orchestration and interaction with GenAI models.

<!-- TOC ignore:true -->
## Table of Contents
<!-- TOC -->

- [Installation](#installation)
- [Usage](#usage)
- [Load a toolset](#load-a-toolset)
- [Use with LlamaIndex](#use-with-llamaindex)
- [Manual usage](#manual-usage)

<!-- /TOC -->

## Installation

You can install the Toolbox SDK for LlamaIndex using `pip`.

```bash
pip install toolbox-llamaindex-sdk
```

> [!IMPORTANT]
> This SDK is not yet available on PyPI. For now, install it from source by following these [instructions](/DEVELOPER.md#developing-toolbox-sdks).

## Usage

Import and initialize the toolbox client.

```python
from toolbox_llamaindex_sdk import ToolboxClient
from aiohttp import ClientSession

session = ClientSession()
# Replace with your Toolbox service's URL
toolbox = ToolboxClient("http://127.0.0.1:5000", session)
```

## Load a toolset

You can load toolsets, which are collections of related tools.

```python
# Load all tools
tools = await toolbox.load_toolset()

# Load a specific toolset
tools = await toolbox.load_toolset("my-toolset")
```

## Use with LlamaIndex

LlamaIndex agents can dynamically choose and execute tools based on user input. You can include tools loaded from the Toolbox SDK in the agent's toolkit.

```python
from llama_index.llms.vertex import Vertex
from llama_index.core.agent import ReActAgent

model = Vertex(model="gemini-1.5-pro")

# Initialize agent with tools
agent = ReActAgent.from_tools(tools, llm=model, verbose=True)

# Query the agent
response = agent.query("Get some response from the agent.")
```

## Manual usage

You can also execute a tool manually using the `acall` method.

```python
result = await tools[0].acall({"param1": "value1", "param2": "value2"})
```
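Because `acall` is a coroutine, calling it from a plain script means running it inside an event loop. A minimal, self-contained sketch of that pattern, where the `acall` below is a stand-in stub for illustration (in a real application the coroutine comes from a tool loaded via `toolbox.load_toolset()`):

```python
import asyncio

# Stand-in stub for a loaded tool's async `acall`; the real coroutine
# comes from the tools returned by `await toolbox.load_toolset()`.
async def acall(params: dict) -> dict:
    return {"received": params}

async def main() -> dict:
    # With real tools this would be: await tools[0].acall({...})
    return await acall({"param1": "value1", "param2": "value2"})

result = asyncio.run(main())
print(result)
```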
2 changes: 1 addition & 1 deletion sdks/llamaindex/pyproject.toml
@@ -1,5 +1,5 @@
[project]
name = "toolbox_llamaindex_sdk"
name = "toolbox-llamaindex-sdk"
version="0.0.1"
description = "Python SDK for interacting with the Toolbox service with Llamaindex"
license = {file = "LICENSE"}