
docs: update example readmes (#160)
mikeldking authored Jan 31, 2024
1 parent 407252c commit 37335ca
Showing 3 changed files with 11 additions and 0 deletions.
4 changes: 4 additions & 0 deletions js/examples/llama-index-express/README.md
@@ -1,5 +1,9 @@
# Overview

This is a [LlamaIndex](https://www.llamaindex.ai/) project bootstrapped with [`create-llama`](https://github.com/run-llama/LlamaIndexTS/tree/main/packages/create-llama) and adapted to include OpenInference instrumentation for OpenAI calls.

Our example will export span data simultaneously to the `Console` and to [arize-phoenix](https://github.com/Arize-ai/phoenix); however, you can run your code anywhere and use any exporter that OpenTelemetry supports.

## Getting Started With Local Development

First, startup the backend as described in the [backend README](./backend/README.md).
5 changes: 5 additions & 0 deletions js/examples/openai/README.md
@@ -0,0 +1,5 @@
# Overview

This example shows how to use [@arizeai/openinference-instrumentation-openai](https://github.com/Arize-ai/openinference/tree/main/js/packages/openinference-instrumentation-openai) to instrument a simple Node.js application with OpenAI.

Our example will export span data simultaneously to the `Console` and to [arize-phoenix](https://github.com/Arize-ai/phoenix); however, you can run your code anywhere and use any exporter that OpenTelemetry supports.
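The dual-exporter setup the README describes can be sketched as follows. This is a minimal sketch, not the example's actual instrumentation file: it assumes the standard OpenTelemetry Node SDK packages and Phoenix's default local collector endpoint (`http://localhost:6006/v1/traces`), both of which are assumptions rather than details taken from this commit.

```typescript
import { NodeTracerProvider } from "@opentelemetry/sdk-trace-node";
import {
  ConsoleSpanExporter,
  SimpleSpanProcessor,
} from "@opentelemetry/sdk-trace-base";
import { OTLPTraceExporter } from "@opentelemetry/exporter-trace-otlp-proto";
import { registerInstrumentations } from "@opentelemetry/instrumentation";
import { OpenAIInstrumentation } from "@arizeai/openinference-instrumentation-openai";

const provider = new NodeTracerProvider();

// Export every span to the console...
provider.addSpanProcessor(new SimpleSpanProcessor(new ConsoleSpanExporter()));

// ...and, simultaneously, to a locally running Phoenix collector
// (endpoint is an assumption; adjust to wherever your collector listens).
provider.addSpanProcessor(
  new SimpleSpanProcessor(
    new OTLPTraceExporter({ url: "http://localhost:6006/v1/traces" })
  )
);

provider.register();

// Patch the `openai` module so all OpenAI calls emit OpenInference spans.
registerInstrumentations({
  instrumentations: [new OpenAIInstrumentation()],
});
```

Because both exporters are just span processors on the same provider, swapping in any other OpenTelemetry-compatible exporter is a one-line change.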
2 changes: 2 additions & 0 deletions python/examples/llama-index/README.md
@@ -2,6 +2,8 @@

This is a [LlamaIndex](https://www.llamaindex.ai/) project bootstrapped with [`create-llama`](https://github.com/run-llama/LlamaIndexTS/tree/main/packages/create-llama) and instrumented using OpenInference.

Our example will export span data to [arize-phoenix](https://github.com/Arize-ai/phoenix); however, you can run your code anywhere and use any exporter that OpenTelemetry supports.

## Getting Started with Local Development

First, startup the backend as described in the [backend README](./backend/README.md).
