This README provides a comprehensive guide for deploying and developing an Accelerator bot. The Accelerator allows organizations to swiftly launch a bot on their website, assisting users in finding answers to questions relating to the website's content.
Follow the steps below in the given order to ensure a smooth deployment. Make sure the following prerequisites are in place before you start:
- Azure subscription
- Azure resource group
- Azure Machine Learning workspace
- Azure OpenAI service with a GPT-4 Turbo model deployed under the deployment name "gpt4-turbo"
- Azure Bing Search resource (How-to create)
- Git clone this repo locally
- Azure CLI (az) installed (how-to)
Follow these steps to deploy a single node online endpoint that will host your LLM application developed using PromptFlow:
- Navigate to Azure Machine Learning workspace studio -> PromptFlow.
- Within the PromptFlow menu, select Connections.
- Create an Azure OpenAI connection and name it "gpt4conn". If you'd like to use an existing connection instead, edit the flow.dag.yaml file and replace "gpt4conn" with your connection name and "gpt4-turbo" with your deployment name.
- Provide the relevant details for the connection, i.e. API_BASE and API_KEY. More details here.
- Create a Custom Connection named "BING_SEARCH" with a secret key named "bingapikey". Provide the Bing subscription key as the value.
- Navigate to the repo folder on your local machine.
- Modify the deployment/endpoint.yaml file and provide an endpoint name.
- Modify the deployment/deployment.yaml file, find the PRT_CONFIG_OVERRIDE variable, and fill in the subscription-id, resource-group, workspace-name, endpoint-name, and endpoint-deployment-name values.
- Execute the following commands, replacing the <> placeholders with your values:
cd ./promptflow/deployment
az ml online-endpoint create --file .\endpoint.yml --resource-group <rg_name> --workspace-name <workspace_name> --subscription <sub_id>
az ml online-deployment create --file .\deployment.yaml --resource-group <rg_name> --workspace-name <workspace_name> --subscription <sub_id>
- Go to the Azure OpenAI (AOAI) resource's IAM settings and assign the OpenAI User role to the managed online endpoint created above.
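Once the endpoint and deployment are created and the role assignment is in place, you can smoke-test the endpoint from Python. This is only a sketch: the scoring URL, key, and payload field name below are placeholders and must be adjusted to match your flow's inputs (the URL and key are shown on the endpoint's Consume tab in the Azure Machine Learning studio).

```python
# Smoke test for the managed online endpoint (sketch; the URL, key, and
# payload field name are placeholders -- adjust them to match your flow).
import requests

ENDPOINT_URL = "https://<endpoint-name>.<region>.inference.ml.azure.com/score"
API_KEY = "<endpoint-key>"  # key-auth value from the endpoint's Consume tab

response = requests.post(
    ENDPOINT_URL,
    headers={
        "Content-Type": "application/json",
        "Authorization": f"Bearer {API_KEY}",
    },
    json={"question": "What services does the organization offer?"},
)
response.raise_for_status()
print(response.json())
```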
Follow these steps to deploy the BOT that will interact with the endpoint you deployed above. The BOT can then be integrated with your website.
Refer to instructions in README.
- Azure OpenAI
- Azure Machine Learning
- Azure Bot Framework
- Azure CosmosDB
- Bot Framework Emulator
- Install dependencies:
python -m venv .venv
.venv\Scripts\activate (Windows) or source .venv/bin/activate (Linux/macOS)
pip install -r requirements.txt
- Create connections:
- BING_SEARCH:
pf connection create -f connection_yaml\bing_search.yaml
- AZURE_OPENAI: edit the azure_openai.yaml file and replace the api_base with the correct value (a quick sanity check for these connection values is sketched after these steps), then run:
pf connection create -f connection_yaml\azure_openai.yaml
- Edit flow.dag.yaml, go to the inputs section, and add values for the following inputs:
- organization: Name of the organization or website
- organization_urls: Comma-separated list of URLs you want the bot to search
- categories: Categories you want the bot to answer questions about, e.g. "PromptFlow, Azure OpenAI" (this is a single string)
- Execute the flow:
pf flow test --flow ./flow.dag.yaml --interactive --multi-modal
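As referenced above, you can optionally sanity-check the two connections outside of PromptFlow. The first sketch assumes the openai Python package v1.x; the endpoint URL, key, and API version are placeholders, and "gpt4-turbo" is the deployment name from the prerequisites.

```python
# Optional sanity check for the Azure OpenAI connection values
# (sketch; the endpoint, key, and API version below are placeholders).
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://<your-aoai-resource>.openai.azure.com",  # same value as api_base
    api_key="<your-aoai-key>",
    api_version="2024-02-01",  # any API version supported by your resource
)

completion = client.chat.completions.create(
    model="gpt4-turbo",  # the deployment name, not the base model name
    messages=[{"role": "user", "content": "Say hello."}],
)
print(completion.choices[0].message.content)
```

Similarly, if the flow's web-search step returns empty results, the Bing key can be checked on its own. This assumes the key stored as bingapikey is a Bing Web Search v7 subscription key:

```python
# Optional check of the Bing subscription key (sketch; assumes the
# Bing Web Search v7 REST endpoint).
import requests

BING_API_KEY = "<your-bing-subscription-key>"

response = requests.get(
    "https://api.bing.microsoft.com/v7.0/search",
    headers={"Ocp-Apim-Subscription-Key": BING_API_KEY},
    params={"q": "site:www.microsoft.com Azure OpenAI", "count": 3},
)
response.raise_for_status()
for page in response.json().get("webPages", {}).get("value", []):
    print(page["name"], "-", page["url"])
```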
- Log in to the Azure Portal and navigate to Azure Machine Learning.
- Create a compute instance.
- Create a promptflow runtime.
- Create a Custom Connection and name it BING_SEARCH. Provide the following key and value:
- bingapikey: your Bing subscription key (please select the secret checkbox)
- Create an Azure OpenAI connection named gpt4conn.
- Navigate to the flow and start testing.
- You will need the promptflow deployed as an endpoint to test the bot locally.
- Set environment variables (an optional pre-flight check for these is sketched at the end of this section):
- COSMOS_DB_URI
- COSMOS_DB_PRIMARY_KEY
- COSMOS_DB_DATABASE_ID
- COSMOS_DB_CONTAINER_ID
- LLM_APP_ENDPOINT : URL of the managed online endpoint where the promptflow is deployed
- CATEGORIES : The categories you want the bot to answer e.g. Adult social care, Council tax
- ORGANIZATION : Name of the organization the bot is designed for
- ORGANIZATION_URLS : Comma-separated string of URLs you want the bot to search to find answers related to categories above. e.g. "www.microsoft.com,www.xbox.com". NOTE: This has to be a string
- LLM_API_KEY : API key to access the LLM endpoint
- ADD_COSMOS_MEMORY : Set to "true" if you would like to use Cosmos DB as memory to store all bot conversations
- WELCOME_MESSAGE : Welcome message to be displayed when the BOT starts
- Execute app.py:
python app.py
- Make a note of the URL; by default it is http://localhost:3978/api/messages
- Open the Bot Framework Emulator, connect to your bot, and start chatting.
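Before launching app.py, it can be useful to confirm that the environment variables listed above are set. A small optional pre-flight check, as a sketch: the variable names simply mirror the list above, and the COSMOS_DB_* values are only checked when ADD_COSMOS_MEMORY is "true".

```python
# Optional pre-flight check: verify the bot's environment variables are set
# before running app.py (sketch; names mirror the list in this README).
import os

required = [
    "LLM_APP_ENDPOINT",
    "LLM_API_KEY",
    "CATEGORIES",
    "ORGANIZATION",
    "ORGANIZATION_URLS",
    "WELCOME_MESSAGE",
]
# Cosmos DB settings only matter when conversation memory is enabled.
if os.getenv("ADD_COSMOS_MEMORY", "false").lower() == "true":
    required += [
        "COSMOS_DB_URI",
        "COSMOS_DB_PRIMARY_KEY",
        "COSMOS_DB_DATABASE_ID",
        "COSMOS_DB_CONTAINER_ID",
    ]

missing = [name for name in required if not os.getenv(name)]
if missing:
    raise SystemExit(f"Missing environment variables: {', '.join(missing)}")
print("All required environment variables are set.")
```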