diff --git a/js/examples/llama-index-express/README.md b/js/examples/llama-index-express/README.md
index 98a915e36..82f4b8a51 100644
--- a/js/examples/llama-index-express/README.md
+++ b/js/examples/llama-index-express/README.md
@@ -1,5 +1,9 @@
+# Overview
+
 This is a [LlamaIndex](https://www.llamaindex.ai/) project bootstrapped with [`create-llama`](https://github.com/run-llama/LlamaIndexTS/tree/main/packages/create-llama) and adapted to include OpenInference instrumentation for OpenAI calls.
 
+Our example exports span data simultaneously to the `Console` and to [arize-phoenix](https://github.com/Arize-ai/phoenix); however, you can run your code anywhere and use any exporter that OpenTelemetry supports.
+
 ## Getting Started With Local Development
 
 First, startup the backend as described in the [backend README](./backend/README.md).
diff --git a/js/examples/openai/README.md b/js/examples/openai/README.md
new file mode 100644
index 000000000..c32cacdcc
--- /dev/null
+++ b/js/examples/openai/README.md
@@ -0,0 +1,5 @@
+# Overview
+
+This example shows how to use [@arizeai/openinference-instrumentation-openai](https://github.com/Arize-ai/openinference/tree/main/js/packages/openinference-instrumentation-openai) to instrument a simple Node.js application that uses OpenAI.
+
+Our example exports span data simultaneously to the `Console` and to [arize-phoenix](https://github.com/Arize-ai/phoenix); however, you can run your code anywhere and use any exporter that OpenTelemetry supports.
diff --git a/python/examples/llama-index/README.md b/python/examples/llama-index/README.md
index 9f624ee66..7dba9c729 100644
--- a/python/examples/llama-index/README.md
+++ b/python/examples/llama-index/README.md
@@ -2,6 +2,8 @@
 
 This is a [LlamaIndex](https://www.llamaindex.ai/) project bootstrapped with [`create-llama`](https://github.com/run-llama/LlamaIndexTS/tree/main/packages/create-llama) and instrumented using OpenInference.
 
+Our example exports span data to [arize-phoenix](https://github.com/Arize-ai/phoenix); however, you can run your code anywhere and use any exporter that OpenTelemetry supports.
+
 ## Getting Started with Local Development
 
 First, startup the backend as described in the [backend README](./backend/README.md).
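
For reference, a minimal sketch of the dual-exporter setup the new READMEs describe: registering OpenInference's `OpenAIInstrumentation` and sending spans to both the console and Phoenix. This is an illustration, not the repo's exact setup file; the OTLP exporter package choice and the Phoenix endpoint URL (`http://localhost:6006/v1/traces`, Phoenix's default local collector) are assumptions.

```typescript
// instrumentation.ts — hypothetical setup sketch, not the example's actual file.
import { NodeTracerProvider } from "@opentelemetry/sdk-trace-node";
import {
  ConsoleSpanExporter,
  SimpleSpanProcessor,
} from "@opentelemetry/sdk-trace-base";
import { OTLPTraceExporter } from "@opentelemetry/exporter-trace-otlp-proto";
import { registerInstrumentations } from "@opentelemetry/instrumentation";
import { OpenAIInstrumentation } from "@arizeai/openinference-instrumentation-openai";

const provider = new NodeTracerProvider();

// One span processor per exporter: spans are written to stdout and shipped
// to Phoenix simultaneously.
provider.addSpanProcessor(new SimpleSpanProcessor(new ConsoleSpanExporter()));
provider.addSpanProcessor(
  new SimpleSpanProcessor(
    // Assumed endpoint: a Phoenix instance running locally.
    new OTLPTraceExporter({ url: "http://localhost:6006/v1/traces" })
  )
);

provider.register();

// Patch the `openai` module so each completion/chat call emits an
// OpenInference-annotated span.
registerInstrumentations({
  instrumentations: [new OpenAIInstrumentation()],
});
```

For the patching to take effect, this module must run before the application loads `openai` — for example via `node --require ./instrumentation.js app.js` or by importing it at the top of the entry point.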