diff --git a/.github/CODEOWNERS b/.github/CODEOWNERS
index 7d760a9ef9174..cfcaedeab7410 100644
--- a/.github/CODEOWNERS
+++ b/.github/CODEOWNERS
@@ -200,7 +200,7 @@ datadog_checks_base/tests/**/test_db_statements.py @DataDog/database-monitoring
 # APM Integrations
 /langchain/ @DataDog/ml-observability @DataDog/agent-integrations @DataDog/documentation
 /openai/ @DataDog/ml-observability @DataDog/agent-integrations @DataDog/documentation
-
+/anthropic/ @DataDog/ml-observability @DataDog/agent-integrations @DataDog/documentation
 
 # Windows agent
 datadog_checks_base/datadog_checks/base/checks/win/ @DataDog/windows-agent @DataDog/agent-integrations
diff --git a/.github/workflows/config/labeler.yml b/.github/workflows/config/labeler.yml
index 580742a589edf..157f9d601aebc 100644
--- a/.github/workflows/config/labeler.yml
+++ b/.github/workflows/config/labeler.yml
@@ -317,6 +317,8 @@ integration/kyverno:
 - kyverno/**/*
 integration/langchain:
 - langchain/**/*
+integration/anthropic:
+- anthropic/**/*
 integration/lastpass:
 - lastpass/**/*
 integration/lighttpd:
diff --git a/anthropic/CHANGELOG.md b/anthropic/CHANGELOG.md
new file mode 100644
index 0000000000000..3de181c7e0654
--- /dev/null
+++ b/anthropic/CHANGELOG.md
@@ -0,0 +1,7 @@
# CHANGELOG - Anthropic

## 1.0.0 / 2024-11-08

***Added***:

* Initial Release
diff --git a/anthropic/README.md b/anthropic/README.md
new file mode 100644
index 0000000000000..c15b4d9be6d33
--- /dev/null
+++ b/anthropic/README.md
@@ -0,0 +1,127 @@
# Anthropic

## Overview
Use the Anthropic integration to monitor, troubleshoot, and evaluate your LLM-powered applications, such as chatbots or data extraction tools, using Anthropic's models.

If you are building LLM applications, use LLM Observability to investigate the root cause of issues, monitor operational performance, and evaluate the quality, privacy, and safety of your LLM applications.

See the [LLM Observability tracing view video](https://imgix.datadoghq.com/video/products/llm-observability/expedite-troubleshooting.mp4?fm=webm&fit=max) for an example of how you can investigate a trace.

## Setup

### LLM Observability: Get end-to-end visibility into your LLM application using Anthropic
You can enable LLM Observability in different environments. Follow the appropriate setup based on your scenario:

#### Installation for Python

##### If you do not have the Datadog Agent:
1. Install the `ddtrace` package:

    ```shell
    pip install ddtrace
    ```

2. Start your application using the following command to enable Agentless mode:

    ```shell
    DD_SITE=<YOUR_DATADOG_SITE> DD_API_KEY=<YOUR_DATADOG_API_KEY> DD_LLMOBS_ENABLED=1 DD_LLMOBS_AGENTLESS_ENABLED=1 DD_LLMOBS_ML_APP=<YOUR_ML_APP_NAME> ddtrace-run python <YOUR_APP>.py
    ```

##### If you already have the Datadog Agent installed:
1. Make sure the Agent is running and that APM and StatsD are enabled. For example, use the following command with Docker:

    ```shell
    docker run -d \
      --cgroupns host \
      --pid host \
      -v /var/run/docker.sock:/var/run/docker.sock:ro \
      -v /proc/:/host/proc/:ro \
      -v /sys/fs/cgroup/:/host/sys/fs/cgroup:ro \
      -e DD_API_KEY=<DATADOG_API_KEY> \
      -p 127.0.0.1:8126:8126/tcp \
      -p 127.0.0.1:8125:8125/udp \
      -e DD_DOGSTATSD_NON_LOCAL_TRAFFIC=true \
      -e DD_APM_ENABLED=true \
      gcr.io/datadoghq/agent:latest
    ```

2. If you haven't already, install the `ddtrace` package:

    ```shell
    pip install ddtrace
    ```

3. To automatically enable tracing, start your application using the `ddtrace-run` command:

    ```shell
    DD_SITE=<YOUR_DATADOG_SITE> DD_API_KEY=<YOUR_DATADOG_API_KEY> DD_LLMOBS_ENABLED=1 DD_LLMOBS_ML_APP=<YOUR_ML_APP_NAME> ddtrace-run python <YOUR_APP>.py
    ```

**Note**: If the Agent is running on a custom host or port, set `DD_AGENT_HOST` and `DD_TRACE_AGENT_PORT` accordingly.
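For reference, a minimal sketch of the `<YOUR_APP>.py` launched by the `ddtrace-run` commands above could look like the following. The file layout, model ID, and prompt are illustrative assumptions, not part of this integration; the script only needs the `anthropic` package and an `ANTHROPIC_API_KEY` environment variable.

```python
# Hypothetical example application -- illustrative only, not part of this PR.
# The model ID and prompt below are assumptions; substitute your own values.
from anthropic import Anthropic

client = Anthropic()  # reads ANTHROPIC_API_KEY from the environment


def ask(question: str) -> str:
    # A plain synchronous chat call; when the process is started with
    # `ddtrace-run` and LLM Observability is enabled, calls like this are
    # captured as spans without any code changes.
    response = client.messages.create(
        model="claude-3-5-sonnet-20241022",  # assumed model ID
        max_tokens=256,
        messages=[{"role": "user", "content": question}],
    )
    return response.content[0].text


if __name__ == "__main__":
    print(ask("Summarize what LLM Observability does in one sentence."))
```

Run it with the appropriate `ddtrace-run` command shown above; no changes to the application code are required.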
##### If you are running LLM Observability in a serverless environment (AWS Lambda):
1. Install the **Datadog-Python** and **Datadog-Extension** Lambda layers as part of your AWS Lambda setup.
2. Enable LLM Observability by setting the following environment variables:

    ```shell
    DD_SITE=<YOUR_DATADOG_SITE> DD_API_KEY=<YOUR_DATADOG_API_KEY> DD_LLMOBS_ENABLED=1 DD_LLMOBS_ML_APP=<YOUR_ML_APP_NAME>
    ```

**Note**: In serverless environments, Datadog automatically flushes spans at the end of the Lambda function.

##### Automatic Anthropic tracing

The Anthropic integration allows for automatic tracing of chat message calls made by the Anthropic Python SDK, capturing latency, errors, input/output messages, and token usage during Anthropic operations.

The following methods are traced for both synchronous and asynchronous Anthropic operations:
- Chat messages (including streamed calls): `Anthropic().messages.create()`, `AsyncAnthropic().messages.create()`
- Streamed chat messages: `Anthropic().messages.stream()`, `AsyncAnthropic().messages.stream()`

No additional setup is required for these methods.
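As an illustration of the streamed and asynchronous call patterns listed above, a sketch along these lines could be used. The model ID and prompts are assumptions; only the `anthropic` package and an `ANTHROPIC_API_KEY` environment variable are required.

```python
# Hypothetical sketch of streamed and async Anthropic calls -- illustrative only.
import asyncio

from anthropic import Anthropic, AsyncAnthropic

MODEL = "claude-3-5-sonnet-20241022"  # assumed model ID


def stream_reply(question: str) -> str:
    # Streamed chat messages: Anthropic().messages.stream()
    with Anthropic().messages.stream(
        model=MODEL,
        max_tokens=256,
        messages=[{"role": "user", "content": question}],
    ) as stream:
        for text in stream.text_stream:
            print(text, end="", flush=True)
        return stream.get_final_message().content[0].text


async def async_reply(question: str) -> str:
    # Asynchronous chat messages: AsyncAnthropic().messages.create()
    response = await AsyncAnthropic().messages.create(
        model=MODEL,
        max_tokens=256,
        messages=[{"role": "user", "content": question}],
    )
    return response.content[0].text


if __name__ == "__main__":
    stream_reply("Write a haiku about tracing.")
    print(asyncio.run(async_reply("Answer in one sentence: what is a span?")))
```

Run under `ddtrace-run` with LLM Observability enabled as described above, these calls are captured the same way as the synchronous example.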
##### Validation

Validate that LLM Observability is properly capturing spans by checking your application logs for successful span creation. You can also run the following command to check the status of the `dd-trace` integration:

```shell
ddtrace-run --info
```

Look for the following message to confirm the setup:

```shell
Agent error: None
```

##### Debugging

If you encounter issues during setup, enable debug logging by passing the `--debug` flag:

```shell
ddtrace-run --debug
```

This displays any errors related to data transmission or instrumentation, including issues with Anthropic traces.

## Data Collected

### Metrics

The Anthropic integration does not include any custom metrics.

### Service Checks

The Anthropic integration does not include any service checks.

### Events

The Anthropic integration does not include any events.

## Troubleshooting

Need help? Contact [Datadog support][2].

[1]: https://docs.datadoghq.com/integrations/anthropic/
[2]: https://docs.datadoghq.com/help/
diff --git a/anthropic/assets/service_checks.json b/anthropic/assets/service_checks.json
new file mode 100644
index 0000000000000..fe51488c7066f
--- /dev/null
+++ b/anthropic/assets/service_checks.json
@@ -0,0 +1 @@
[]
diff --git a/anthropic/manifest.json b/anthropic/manifest.json
new file mode 100644
index 0000000000000..082d9d8c046be
--- /dev/null
+++ b/anthropic/manifest.json
@@ -0,0 +1,43 @@
{
  "manifest_version": "2.0.0",
  "app_uuid": "53fe7c3e-57eb-42ca-8e43-ec92c04b6160",
  "app_id": "anthropic",
  "display_on_public_website": true,
  "tile": {
    "overview": "README.md#Overview",
    "configuration": "README.md#Setup",
    "support": "README.md#Support",
    "changelog": "CHANGELOG.md",
    "description": "Monitor Anthropic usage and health at the application level",
    "title": "Anthropic",
    "media": [],
    "classifier_tags": [
      "Category::AI/ML",
      "Category::Metrics",
      "Submitted Data Type::Traces",
      "Supported OS::Linux",
      "Supported OS::Windows",
      "Supported OS::macOS",
      "Offering::Integration"
    ]
  },
  "assets": {
    "integration": {
      "auto_install": false,
      "source_type_id": 31102434,
      "source_type_name": "Anthropic",
      "events": {
        "creates_events": false
      },
      "service_checks": {
        "metadata_path": "assets/service_checks.json"
      }
    }
  },
  "author": {
    "support_email": "help@datadoghq.com",
    "name": "Datadog",
    "homepage": "https://www.datadoghq.com",
    "sales_email": "info@datadoghq.com"
  }
}