Lightweight library demonstrating how to create agentic applications without using any specific framework.
This framework provides the following features:
- Multi-agent chat with several orchestration options
- Dynamic routing (including the option to inspect available tools when deciding)
- Upfront planning with optional repetition via a feedback loop
- Agent state management
- Custom stop conditions
- Interactive or unattended user input
- Chat resumability
- Function calling on agents
- Constrained agent routing
- Sub-workflows
- Simple RAG via function calls
- Image input support
- Ability to run pre and post steps via Sequence
- Conversation context "hidden" variables, which are not displayed to the user but which agents can read and write to share additional information
- Usage metrics tracking per conversation, plus internal log for debuggability
- Multiple strategies for agents to filter conversation messages (all, last N, top K and last N, summarize, etc.)
- LLMLingua (`extras` module) support to compress system prompts via strategies
- LLM support for Structured Output
- Dapr integration
  - Native submodule to host `Askable` and `Workflow` as Dapr Actors
  - Dapr PubSub integration for `Workflow`, enabling:
    - Event sourcing
    - Decoupled communication between workflows
- Multi-agent chat with multiple users
  - Dapr PubSub integration allows moving from one-to-many to many-to-many conversations, with different `User` instances impersonating different user profiles
  - Demo Repo (Coming soon)
- Remoting support (`remote` module), allowing agents to run on a remote server and be accessed elsewhere
  - REST and gRPC channels supported
  - Default implementation to run hosts with agent discovery and registration
- Generated code execution, locally and via ACA Dynamic Sessions
- Streaming support, even over REST or gRPC agents
- Plugins
- Azure AI Search plugin
- DB plugin
- API plugin
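To illustrate one of the features above, "hidden" conversation context variables, here is a minimal framework-free sketch. This is a generic illustration of the idea, not the library's actual API: messages are rendered to the user, while a side key/value store is visible only to agents.

```python
# Generic sketch of a conversation context with "hidden" variables:
# messages are shown to the user, while variables are only visible to agents.
class ConversationContext:
    def __init__(self):
        self.messages = []   # visible chat history
        self.variables = {}  # hidden key/value store agents can read and write

    def add_message(self, role, content):
        self.messages.append({"role": role, "content": content})

    def user_view(self):
        # Only messages are ever rendered to the user; variables stay hidden.
        return [m["content"] for m in self.messages]

ctx = ConversationContext()
ctx.add_message("user", "I'd like to check my order status.")
ctx.variables["customer_id"] = "C-123"     # written by one agent...
ctx.variables["order_status"] = "shipped"  # ...read by another

print(ctx.user_view())  # hidden variables never appear here
print(ctx.variables)
```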
Python 3.11 or later is required to run this project.
```shell
git clone https://github.com/Azure-Samples/vanilla-aiagents
cd vanilla-aiagents

# Create a virtual environment
python -m venv .venv

# Activate the virtual environment
# On Windows
.\.venv\Scripts\activate
# On Unix or macOS
source .venv/bin/activate

# Install the required dependencies
pip install -r requirements.txt

# Copy .env.sample to .env and update the values
cp .env.sample .env
```
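The values in `.env` must end up in the process environment so that `os.getenv` can read them. Projects commonly use python-dotenv for this; as a stdlib-only sketch of what that loading amounts to (an assumption about the setup, not code from this repository):

```python
import os

def load_env_file(path=".env"):
    """Minimal .env parser: KEY=VALUE lines become environment variables.
    (python-dotenv does this more robustly; shown here with stdlib only.)"""
    with open(path) as f:
        for line in f:
            line = line.strip()
            # Skip blanks, comments, and malformed lines
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            os.environ.setdefault(key.strip(), value.strip())

# After loading, values such as AZURE_OPENAI_ENDPOINT are available via os.getenv.
```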
Here is a simple example of how to use the framework:
```python
import os

from vanilla_aiagents.llm import AzureOpenAILLM
from vanilla_aiagents.agent import Agent
from vanilla_aiagents.team import Team
from vanilla_aiagents.workflow import Workflow

llm = AzureOpenAILLM({
    "azure_deployment": os.getenv("AZURE_OPENAI_MODEL"),
    "azure_endpoint": os.getenv("AZURE_OPENAI_ENDPOINT"),
    "api_key": os.getenv("AZURE_OPENAI_KEY"),
    "api_version": os.getenv("AZURE_OPENAI_API_VERSION"),
})

# Initialize agents and team
sales = Agent(id="sales", llm=llm, description="A sales agent", system_message="""
You are a sales assistant. You provide information about our products and services.

# PRODUCTS
- Product 1: $100, description
- Product 2: $200, description
- Product 3: $300, description
""")

support = Agent(id="support", llm=llm, description="A support agent", system_message="""
You are a support assistant. You provide help with technical issues and account management.

# SUPPORT GUIDELINES
- For technical issues, please provide the following information: ...
- For account management, please provide the following information: ...
""")

team = Team(id="team", description="Contoso team", members=[sales, support], llm=llm)

# Create a workflow
workflow = Workflow(askable=team)

# Run the workflow
result = workflow.run("Hello, I'd like to know more about your products.")
print(workflow.conversation.messages)
```
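`workflow.conversation.messages` holds the full chat history. Assuming the messages follow the common role/content dict shape used by chat APIs (an assumption for illustration, since the example above does not show the exact structure), you could print only the agent replies like so:

```python
# Hypothetical message history in the common role/content shape --
# the actual structure in vanilla_aiagents may differ.
messages = [
    {"role": "user", "content": "Hello, I'd like to know more about your products."},
    {"role": "assistant", "name": "sales", "content": "Product 1 costs $100 ..."},
]

# Print only the agent replies, tagged with the responding agent's name.
for m in messages:
    if m["role"] == "assistant":
        print(f'{m.get("name", "agent")}: {m["content"]}')
```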
The `remote` module provides a way to run agents on a remote server and access them elsewhere. It includes support for REST and gRPC channels, as well as a default implementation to run hosts with agent discovery and registration.
Additionally, it features Dapr integration, allowing `Askable` and `Workflow` to be hosted as Dapr Actors, which enables event sourcing and decoupled communication between workflows.
See the Remote Agents documentation and Actors documentation for more information.
The `extras` module provides additional features that enhance the framework, including support for LLMLingua to compress system prompts.
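LLMLingua itself requires a local model, so as a framework-free illustration of the "compression via strategies" idea, here is a generic sketch (not the `extras` module's API): a strategy is just a callable that shortens a prompt, and strategies compose as a pipeline.

```python
import re

def whitespace_strategy(prompt: str) -> str:
    """Trivial compression strategy: collapse runs of whitespace.
    Real strategies (e.g. LLMLingua) drop low-information tokens instead."""
    return re.sub(r"\s+", " ", prompt).strip()

def compress(prompt: str, strategies) -> str:
    # Apply each strategy in order, like a pipeline.
    for strategy in strategies:
        prompt = strategy(prompt)
    return prompt

system_prompt = """
You are a sales assistant.

You provide information   about our products.
"""
print(compress(system_prompt, [whitespace_strategy]))
```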
The `notebooks` folder contains a few demo notebooks that demonstrate how to use the framework in various scenarios.
To run the tests, execute the following command:

```shell
invoke test
```

To run selected tests, execute the following command:

```shell
invoke test --test-case <test_case>
```

Testing also includes code coverage.
To build the project, run the following command:

```shell
invoke build --version <version>
```

The output wheel will be available in the `dist` folder under `vanilla_aiagents`, named `vanilla_aiagents-<version>-py3-none-any.whl`.
This project is licensed under the MIT License - see the LICENSE file for details.
We welcome contributions! Please see CONTRIBUTING.md for details on how to contribute.