From bb053f4d00c108f4bd2fc181c213700daf07b7e9 Mon Sep 17 00:00:00 2001
From: Roger Yang
Date: Mon, 5 Feb 2024 09:50:22 -0800
Subject: [PATCH 01/10] chore: add readme for python root

---
 python/README.md | 81 +++++++++++++++++++++++++++++++++++++++++++-----
 1 file changed, 74 insertions(+), 7 deletions(-)

diff --git a/python/README.md b/python/README.md
index 924d144c0..5628bf999 100644
--- a/python/README.md
+++ b/python/README.md
@@ -1,16 +1,83 @@
 # OpenInference Python
 
-This is the Python version of OpenInference, a framework for collecting traces from LLM applications.
+This is the Python version of OpenInference instrumentation, a framework for collecting traces from LLM applications.
 
-## Development
-This project is built using hatch.
+# Getting Started
+
+Instrumentation is the act of adding observability code to an app yourself.
+If you’re instrumenting an app, you need to use the OpenTelemetry SDK for your language. You’ll then use the SDK to initialize OpenTelemetry and the API to instrument your code. This will emit telemetry from your app, and any library you installed that also comes with instrumentation.
 
-## Publishing
+# OpenAI Example
+Install openinference instrumentation for OpenAI:
 
-Publishing to PyPi is done using hatch.
+```shell
+pip install openinference-instrumentation-openai
+```
+
+This assumes that you already have OpenAI installed. If not, you can install it using:
+
+```shell
+pip install "openai>=1.0.0"
+```
+Currently only openai>=1.0.0 is supported.
+
+To export traces from the instrumentor to a collector, we use the OpenTelemetry SDK and the HTTP exporter. Install these using:
+
+```shell
+pip install opentelemetry-sdk opentelemetry-exporter-otlp-proto-http
+```
+
+Below shows a simple application calling chat completions from OpenAI.
+
+Note that the endpoint is set to collector running on `localhost:6006`, but can be changed if you are running a collector on a different location.
+
+```python
+import openai
+from openinference.instrumentation.openai import OpenAIInstrumentor
+from opentelemetry import trace as trace_api
+from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
+from opentelemetry.sdk import trace as trace_sdk
+from opentelemetry.sdk.resources import Resource
+from opentelemetry.sdk.trace.export import SimpleSpanProcessor
+
+# Set up the OpenTelemetry SDK tracer provider with an HTTP exporter.
+# Change the endpoint if the collector is running at a different location.
+endpoint = "http://localhost:6006/v1/traces"
+resource = Resource(attributes={})
+tracer_provider = trace_sdk.TracerProvider(resource=resource)
+span_exporter = OTLPSpanExporter(endpoint=endpoint)
+span_processor = SimpleSpanProcessor(span_exporter=span_exporter)
+tracer_provider.add_span_processor(span_processor=span_processor)
+trace_api.set_tracer_provider(tracer_provider=tracer_provider)
+
+# Call the instrumentor to instrument OpenAI
+OpenAIInstrumentor().instrument()
+
+# Run the OpenAI application.
+# Make sure you have your API key set in the environment variable OPENAI_API_KEY.
+if __name__ == "__main__":
+    response = openai.OpenAI().chat.completions.create(
+        model="gpt-3.5-turbo",
+        messages=[{"role": "user", "content": "Write a haiku."}],
+        max_tokens=20,
+    )
+    print(response.choices[0].message.content)
+```
+
+## Phoenix Collector
+
+If you don't have collector running, you can use the Phoenix collector. Install it using:
 
 ```shell
-hatch build
-hatch publish -u __token__
+pip install arize-phoenix
+```
+
+Then before running the example abov, start the collector using:
+
+```python
+import phoenix as px
+
+px.launch_app()
 ```
+
+By default, the collector will run on `localhost:6006`.
From f8250feca092ec705399f4f0ee7e4aa1fd7f7fad Mon Sep 17 00:00:00 2001
From: Roger Yang
Date: Mon, 5 Feb 2024 09:51:29 -0800
Subject: [PATCH 02/10] clean up

---
 python/README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/python/README.md b/python/README.md
index 5628bf999..64bd0187e 100644
--- a/python/README.md
+++ b/python/README.md
@@ -80,4 +80,4 @@ import phoenix as px
 px.launch_app()
 ```
 
-By default, the collector will run on `localhost:6006`.
+By default, the Phoenix collector is running on `http://localhost:6006/v1/traces`.

From 7eb6442bcea0472cf0f40384a13f4f5438953fc9 Mon Sep 17 00:00:00 2001
From: Roger Yang
Date: Mon, 5 Feb 2024 09:51:50 -0800
Subject: [PATCH 03/10] clean up

---
 python/README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/python/README.md b/python/README.md
index 64bd0187e..c8ddfaacc 100644
--- a/python/README.md
+++ b/python/README.md
@@ -72,7 +72,7 @@ If you don't have collector running, you can use the Phoenix collector. Install
 pip install arize-phoenix
 ```
 
-Then before running the example abov, start the collector using:
+Then before running the example above, start the collector using:
 
 ```python
 import phoenix as px

From c2c69a2e65e55e615338061e595b2f827ae491e7 Mon Sep 17 00:00:00 2001
From: Roger Yang
Date: Mon, 5 Feb 2024 09:52:58 -0800
Subject: [PATCH 04/10] clean up

---
 python/README.md | 2 ++
 1 file changed, 2 insertions(+)

diff --git a/python/README.md b/python/README.md
index c8ddfaacc..964c75da5 100644
--- a/python/README.md
+++ b/python/README.md
@@ -81,3 +81,5 @@ px.launch_app()
 ```
 
 By default, the Phoenix collector is running on `http://localhost:6006/v1/traces`.
+
+Phoenix runs completely locally on your machine and does not collect any data over the internet.
From 0ac4edfec32c93ce8717530454db4c872dbd951b Mon Sep 17 00:00:00 2001
From: Roger Yang
Date: Mon, 5 Feb 2024 10:15:50 -0800
Subject: [PATCH 05/10] clean up

---
 python/README.md | 35 +++++++++++++++++++++--------------
 1 file changed, 21 insertions(+), 14 deletions(-)

diff --git a/python/README.md b/python/README.md
index 964c75da5..cbae07f5d 100644
--- a/python/README.md
+++ b/python/README.md
@@ -8,24 +8,27 @@ Instrumentation is the act of adding observability code to an app yourself.
 If you’re instrumenting an app, you need to use the OpenTelemetry SDK for your language. You’ll then use the SDK to initialize OpenTelemetry and the API to instrument your code. This will emit telemetry from your app, and any library you installed that also comes with instrumentation.
 
 # OpenAI Example
-Install openinference instrumentation for OpenAI:
+
+To export traces from the instrumentor to a collector, install the OpenTelemetry SDK and HTTP exporter using:
 
 ```shell
-pip install openinference-instrumentation-openai
+pip install opentelemetry-sdk opentelemetry-exporter-otlp-proto-http
 ```
 
-This assumes that you already have OpenAI installed. If not, you can install it using:
+Install OpenInference instrumentor for OpenAI:
 
 ```shell
-pip install "openai>=1.0.0"
+pip install openinference-instrumentation-openai
 ```
-Currently only openai>=1.0.0 is supported.
 
-To export traces from the instrumentor to a collector, we use the OpenTelemetry SDK and the HTTP exporter. Install these using:
+This assumes that you already have OpenAI>=1.0.0 installed. If not, install using:
 
 ```shell
-pip install opentelemetry-sdk opentelemetry-exporter-otlp-proto-http
+pip install "openai>=1.0.0"
 ```
+Currently only openai>=1.0.0 is supported.
+
+## Chat Completions
 
 Below shows a simple application calling chat completions from OpenAI.
@@ -66,7 +69,11 @@ if __name__ == "__main__":
 
 ## Phoenix Collector
 
-If you don't have collector running, you can use the Phoenix collector. Install it using:
+If you don't have a collector, you can try Arize Phoenix.
+
+Phoenix runs completely locally on your machine, and does not send any data over the internet.
+
+Install using:
 
 ```shell
 pip install arize-phoenix
@@ -74,12 +81,12 @@ ## Phoenix Collector
 
 Then before running the example above, start the collector using:
 
-```python
-import phoenix as px
-
-px.launch_app()
+```shell
+python -m phoenix.server.main serve
 ```
 
-By default, the Phoenix collector is running on `http://localhost:6006/v1/traces`.
+By default, the Phoenix collector is running on `http://localhost:6006`, so the example above will work without modification.
+
+Here's a screenshot of traces being visualized in the Phoenix UI:
 
-Phoenix runs completely locally on your machine and does not collect any data over the internet.
+![LLM Application Tracing](https://github.com/Arize-ai/phoenix-assets/blob/main/gifs/langchain_rag_stuff_documents_chain_10mb.gif?raw=true)

From f216ec58ef8cc688792a8e57857897c39c0d99aa Mon Sep 17 00:00:00 2001
From: Roger Yang
Date: Mon, 5 Feb 2024 10:18:18 -0800
Subject: [PATCH 06/10] clean up

---
 python/README.md | 6 +++---
 1 file changed, 3 insertions(+), 3 deletions(-)

diff --git a/python/README.md b/python/README.md
index cbae07f5d..8b70835a8 100644
--- a/python/README.md
+++ b/python/README.md
@@ -79,14 +79,14 @@ pip install arize-phoenix
 ```
 
-Then before running the example above, start the collector using:
+Then before running the example above, start Phoenix using:
 
 ```shell
 python -m phoenix.server.main serve
 ```
 
-By default, the Phoenix collector is running on `http://localhost:6006`, so the example above will work without modification.
+By default, the Phoenix collector is running on `http://localhost:6006/v1/traces`, so the example above will work without modification.
 
-Here's a screenshot of traces being visualized in the Phoenix UI:
+Here's a screenshot of traces being visualized in the Phoenix UI.
+Visit `http://localhost:6006` in your browser.
 
 ![LLM Application Tracing](https://github.com/Arize-ai/phoenix-assets/blob/main/gifs/langchain_rag_stuff_documents_chain_10mb.gif?raw=true)

From f567b5c72f76dd4f6e67655e6223d030f9a3c9b2 Mon Sep 17 00:00:00 2001
From: Roger Yang
Date: Mon, 5 Feb 2024 10:20:04 -0800
Subject: [PATCH 07/10] clean up

---
 python/README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/python/README.md b/python/README.md
index 8b70835a8..56b2cdc91 100644
--- a/python/README.md
+++ b/python/README.md
@@ -7,7 +7,7 @@ This is the Python version of OpenInference instrumentation, a framework for col
 Instrumentation is the act of adding observability code to an app yourself.
 If you’re instrumenting an app, you need to use the OpenTelemetry SDK for your language. You’ll then use the SDK to initialize OpenTelemetry and the API to instrument your code. This will emit telemetry from your app, and any library you installed that also comes with instrumentation.
 
-# OpenAI Example
+# Example

From 2a7fb4b809b4f437ee3e886e94f3867a032ea9d3 Mon Sep 17 00:00:00 2001
From: Roger Yang
Date: Mon, 5 Feb 2024 10:26:55 -0800
Subject: [PATCH 08/10] clean up

---
 python/README.md | 12 ++++++------
 1 file changed, 6 insertions(+), 6 deletions(-)

diff --git a/python/README.md b/python/README.md
index 56b2cdc91..b0f1b1c15 100644
--- a/python/README.md
+++ b/python/README.md
@@ -26,13 +26,15 @@ This assumes that you already have OpenAI>=1.0.0 installed. If not, install usin
 ```shell
 pip install "openai>=1.0.0"
 ```
-Currently only openai>=1.0.0 is supported.
+Currently only `openai>=1.0.0` is supported.
 
-## Chat Completions
+## Application
 
 Below shows a simple application calling chat completions from OpenAI.
 
-Note that the endpoint is set to collector running on `localhost:6006`, but can be changed if you are running a collector on a different location.
+Note that the `endpoint` is set to a collector running on `localhost:6006/v1/traces`, but can be changed if you are running your collector at a different location.
+
+The trace collector should be started before running this example. See [Phoenix Collector](#phoenix-collector) below if you don't have a collector.
 
 ```python
 import openai
@@ -69,9 +71,7 @@ if __name__ == "__main__":
 
 ## Phoenix Collector
 
-If you don't have a collector, you can try Arize Phoenix.
-
-Phoenix runs completely locally on your machine, and does not send any data over the internet.
+Phoenix runs locally on your machine and does not send data over the internet.
 
 Install using:

From 44bc3c2af9c2fe73a9f782b220ebeabe565ae399 Mon Sep 17 00:00:00 2001
From: Roger Yang
Date: Mon, 5 Feb 2024 10:30:42 -0800
Subject: [PATCH 09/10] clean up

---
 python/README.md | 3 +--
 1 file changed, 1 insertion(+), 2 deletions(-)

diff --git a/python/README.md b/python/README.md
index b0f1b1c15..9dfc3f751 100644
--- a/python/README.md
+++ b/python/README.md
@@ -4,8 +4,7 @@
 
 # Getting Started
 
-Instrumentation is the act of adding observability code to an app yourself.
-If you’re instrumenting an app, you need to use the OpenTelemetry SDK for your language. You’ll then use the SDK to initialize OpenTelemetry and the API to instrument your code. This will emit telemetry from your app, and any library you installed that also comes with instrumentation.
+Instrumentation is the act of adding observability code to an application. OpenInference provides instrumentors for several popular LLM frameworks and SDKs. The instrumentors emit traces from the LLM applications, and the traces can be collected by a collector, e.g. by the [Phoenix Collector](#phoenix-collector).
# Example From 628b3015fd6821b45a3cc8e47b4eab1fc4b5c8f9 Mon Sep 17 00:00:00 2001 From: Roger Yang <80478925+RogerHYang@users.noreply.github.com> Date: Mon, 5 Feb 2024 10:36:26 -0800 Subject: [PATCH 10/10] add link to list of instrumentors --- python/README.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/python/README.md b/python/README.md index 9dfc3f751..d979ab5ae 100644 --- a/python/README.md +++ b/python/README.md @@ -4,7 +4,7 @@ This is the Python version of OpenInference instrumentation, a framework for col # Getting Started -Instrumentation is the act of adding observability code to an application. OpenInference provides instrumentors for several popular LLM frameworks and SDKs. The instrumentors emit traces from the LLM applications, and the traces can be collected by a collector, e.g. by the [Phoenix Collector](#phoenix-collector). +Instrumentation is the act of adding observability code to an application. OpenInference provides [instrumentors](https://github.com/Arize-ai/openinference?tab=readme-ov-file#python) for several popular LLM frameworks and SDKs. The instrumentors emit traces from the LLM applications, and the traces can be collected by a collector, e.g. by the [Phoenix Collector](#phoenix-collector). # Example