[Python][DSPy] trace LM Calls #124
Comments
🤖 Sure, I can guide you on how to add instrumentation for the language model calls in Python using DSPy in the OpenInference repository.

Firstly, you need to import the necessary modules. This includes `openai`, the `OpenAIInstrumentor` from OpenInference, and the OpenTelemetry tracing modules.

Next, you need to set up the tracing provider. This involves creating a `TracerProvider`, attaching a span processor with an OTLP exporter, and registering it as the global tracer provider.

After setting up the tracing provider, you need to instrument the OpenAI library. This is done by creating an `OpenAIInstrumentor` instance and calling its `instrument()` method.

Finally, you can make the language model call. This is done by creating an `OpenAI` client and invoking one of its endpoints, for example the embeddings API.

Here is a code snippet that demonstrates these steps:

```python
import openai
from openinference.instrumentation.openai import OpenAIInstrumentor
from opentelemetry import trace as trace_api
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk import trace as trace_sdk
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace.export import SimpleSpanProcessor
resource = Resource(attributes={})
tracer_provider = trace_sdk.TracerProvider(resource=resource)
span_exporter = OTLPSpanExporter(endpoint="http://127.0.0.1:6006/v1/traces")
span_processor = SimpleSpanProcessor(span_exporter=span_exporter)
tracer_provider.add_span_processor(span_processor=span_processor)
trace_api.set_tracer_provider(tracer_provider=tracer_provider)
OpenAIInstrumentor().instrument()
if __name__ == "__main__":
    response = openai.OpenAI().embeddings.create(
        model="text-embedding-ada-002",
        input="hello world",
    )
    print(response.data[0].embedding)
```

This code instruments the OpenAI library and traces the language model call, sending the trace data to the specified endpoint. You can find more details in the embeddings.py example in the OpenInference repository. As for the … I hope this helps! If you have any further questions, feel free to ask.
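Since this issue is specifically about tracing DSPy LM calls, here is a minimal sketch of the same setup using the DSPy instrumentor from this repository. It is a starting point rather than a confirmed recipe: it assumes the `openinference-instrumentation-dspy` package with its `DSPyInstrumentor`, and it uses an older `dspy.OpenAI` / `dspy.settings.configure` style of LM configuration, which may differ in your DSPy version.

```python
import dspy
from openinference.instrumentation.dspy import DSPyInstrumentor
from opentelemetry import trace as trace_api
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk import trace as trace_sdk
from opentelemetry.sdk.trace.export import SimpleSpanProcessor

# Same tracer provider setup as in the OpenAI example above.
tracer_provider = trace_sdk.TracerProvider()
tracer_provider.add_span_processor(
    SimpleSpanProcessor(OTLPSpanExporter(endpoint="http://127.0.0.1:6006/v1/traces"))
)
trace_api.set_tracer_provider(tracer_provider)

# Instrument DSPy so that module executions are traced.
DSPyInstrumentor().instrument()

if __name__ == "__main__":
    # Hypothetical DSPy program; the LM configuration API depends on your DSPy version.
    turbo = dspy.OpenAI(model="gpt-3.5-turbo")
    dspy.settings.configure(lm=turbo)
    predict = dspy.Predict("question -> answer")
    print(predict(question="What is the capital of France?").answer)
```

Whether the underlying LM call shows up as its own span, rather than only the surrounding DSPy module span, depends on the instrumentor version, which is exactly what this issue is asking to add.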
Add instrumentation for the language model calls