
[Bug]: opentelemetry POC not compatible with latest #11993

Closed
1 task done
codefromthecrypt opened this issue Jan 13, 2025 · 1 comment · Fixed by #12007
Labels
bug Something isn't working

Comments

@codefromthecrypt
Contributor

Your current environment

The output of `python collect_env.py`
PyTorch version: 2.5.1
Is debug build: False
CUDA used to build PyTorch: None
ROCM used to build PyTorch: N/A

OS: macOS 15.2 (arm64)
GCC version: Could not collect
Clang version: 16.0.0 (clang-1600.0.26.6)
CMake version: version 3.30.0
Libc version: N/A

Python version: 3.12.8 (main, Dec 27 2024, 10:28:13) [Clang 16.0.0 (clang-1600.0.26.6)] (64-bit runtime)
Python platform: macOS-15.2-arm64-arm-64bit
Is CUDA available: False
CUDA runtime version: No CUDA
CUDA_MODULE_LOADING set to: N/A
GPU models and configuration: No CUDA
Nvidia driver version: No CUDA
cuDNN version: No CUDA
HIP runtime version: N/A
MIOpen runtime version: N/A
Is XNNPACK available: True

CPU:
Apple M3 Pro

Versions of relevant libraries:
[pip3] numpy==1.26.4
[pip3] pyzmq==26.2.0
[pip3] torch==2.5.1
[pip3] torchvision==0.20.1
[pip3] transformers==4.48.0
[conda] numpy                     2.1.3                    pypi_0    pypi
ROCM Version: Could not collect
Neuron SDK Version: N/A
vLLM Version: 0.6.6.post2.dev188+gd14e98d9
vLLM Build Flags:
CUDA Archs: Not Set; ROCm: Disabled; Neuron: Disabled
GPU Topology:
Could not collect

LD_LIBRARY_PATH=/Users/adriancole/oss/vllm/.venv/lib/python3.12/site-packages/cv2/../../lib:

Model Input Dumps

No response

🐛 Describe the bug

I tried to install this with the latest OpenTelemetry. Specifically, I added the otel requirements via EDOT bootstrap instead of the older versions pinned in the POC: https://docs.vllm.ai/en/latest/getting_started/examples/opentelemetry.html

pip install elastic-opentelemetry
edot-bootstrap >> otel-requirements.txt
pip install -r otel-requirements.txt
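As a quick diagnostic (this is a hypothetical helper, not part of vLLM or EDOT), you can check whether the modules vLLM's tracing code imports are actually present in the freshly bootstrapped environment:

```python
import importlib.util


def missing_modules(names):
    """Return the subset of module names that cannot be imported."""
    out = []
    for name in names:
        try:
            # find_spec raises ModuleNotFoundError if a parent package
            # (e.g. "opentelemetry") is itself missing.
            if importlib.util.find_spec(name) is None:
                out.append(name)
        except ModuleNotFoundError:
            out.append(name)
    return out


# vllm/tracing.py imports opentelemetry.semconv_ai, which newer
# OpenTelemetry distributions no longer ship.
print(missing_modules(["opentelemetry.sdk", "opentelemetry.semconv_ai"]))
```

On an environment bootstrapped as above, `opentelemetry.semconv_ai` shows up as missing, which matches the crash later in this report.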

When I run with opentelemetry-instrument, things are OK, except that I am missing the span this project creates:

dotenv run -- opentelemetry-instrument vllm serve Qwen/Qwen2.5-0.5B

To get the span (since our instrumentation isn't configured as an entrypoint..), I need to pass an extra argument. When I do, it crashes like this:

$ dotenv run -- opentelemetry-instrument vllm serve Qwen/Qwen2.5-0.5B --otlp-traces-endpoint=http://localhost:4318/v1/traces
--snip--
ERROR 01-13 15:42:51 engine.py:387] ModuleNotFoundError: No module named 'opentelemetry.semconv_ai'
ERROR 01-13 15:42:51 engine.py:387] 
Process SpawnProcess-1:
Traceback (most recent call last):
  File "/Users/adriancole/.pyenv/versions/3.12.8/lib/python3.12/multiprocessing/process.py", line 314, in _bootstrap
    self.run()
  File "/Users/adriancole/.pyenv/versions/3.12.8/lib/python3.12/multiprocessing/process.py", line 108, in run
    self._target(*self._args, **self._kwargs)
  File "/Users/adriancole/oss/vllm/vllm/engine/multiprocessing/engine.py", line 389, in run_mp_engine
    raise e
  File "/Users/adriancole/oss/vllm/vllm/engine/multiprocessing/engine.py", line 378, in run_mp_engine
    engine = MQLLMEngine.from_engine_args(engine_args=engine_args,
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/adriancole/oss/vllm/vllm/engine/multiprocessing/engine.py", line 116, in from_engine_args
    engine_config = engine_args.create_engine_config(usage_context)
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/adriancole/oss/vllm/vllm/engine/arg_utils.py", line 1236, in create_engine_config
    observability_config = ObservabilityConfig(
                           ^^^^^^^^^^^^^^^^^^^^
  File "<string>", line 6, in __init__
  File "/Users/adriancole/oss/vllm/vllm/config.py", line 2523, in __post_init__
    raise ValueError(
ValueError: OpenTelemetry is not available. Unable to configure 'otlp_traces_endpoint'. Ensure OpenTelemetry packages are installed. Original error:
Traceback (most recent call last):
  File "/Users/adriancole/oss/vllm/vllm/tracing.py", line 19, in <module>
    from opentelemetry.semconv_ai import SpanAttributes as BaseSpanAttributes
ModuleNotFoundError: No module named 'opentelemetry.semconv_ai'

So, I think the main fix is adjusting the code for the latest packages (and updating the instructions as well).
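One possible shape for such an adjustment (a hypothetical sketch, not the actual change in #12007) is to guard the import of the removed `opentelemetry.semconv_ai` module and fall back to locally defined attribute keys, so tracing keeps working on newer OpenTelemetry stacks; the specific attribute names below are illustrative:

```python
# Sketch of a defensive import for vllm/tracing.py (hypothetical fix):
# fall back to local constants when opentelemetry.semconv_ai is absent.
try:
    from opentelemetry.semconv_ai import SpanAttributes as BaseSpanAttributes
except ModuleNotFoundError:
    class BaseSpanAttributes:
        """Minimal stand-in carrying the gen_ai attribute keys used here."""
        LLM_REQUEST_MODEL = "gen_ai.request.model"
        LLM_USAGE_PROMPT_TOKENS = "gen_ai.usage.prompt_tokens"
        LLM_USAGE_COMPLETION_TOKENS = "gen_ai.usage.completion_tokens"


class SpanAttributes(BaseSpanAttributes):
    # Project-specific attributes extend the base set either way.
    LLM_LATENCY_TIME_TO_FIRST_TOKEN = "gen_ai.latency.time_to_first_token"
```

With a guard like this, `--otlp-traces-endpoint` would no longer hard-fail at engine startup just because the legacy semconv package is not installed.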

cc @gyliu513

Before submitting a new issue...

  • Make sure you already searched for relevant issues, and asked the chatbot living at the bottom right corner of the documentation page, which can answer lots of frequently asked questions.
@codefromthecrypt
Contributor Author

#12007 should fix this
