
Exporter metrics (otelcol_exporter_queue_capacity, otelcol_exporter_queue_size) are generated only for a single exporter on otel-contrib 0.96.0 #31645

Closed
tqi-raurora opened this issue Mar 7, 2024 · 3 comments
Labels
bug (Something isn't working), collector-telemetry

Comments


tqi-raurora commented Mar 7, 2024

Component(s)

No response

What happened?

Description

On version 0.96.0, when two different exporters are configured and localhost:8888/metrics is queried, only one exporter is reported in the otelcol_exporter_queue_capacity and otelcol_exporter_queue_size metrics.
The exporter that is reported appears to be chosen at random and can change when the service is restarted.
On version 0.92.0 this worked as expected, with metrics reported for both exporters.

Steps to Reproduce

Create a configuration with 2 exporters (example: 2 otlp exporters)
Query the local telemetry endpoint at localhost:8888/metrics (see the sketch below)
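
For reference, localhost:8888 is the collector's default self-telemetry metrics endpoint; the configuration in this report does not set it explicitly, so the default applies. A minimal sketch of declaring it explicitly (the address shown is the assumed default, not taken from the original report):

service:
  telemetry:
    metrics:
      address: ":8888"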

Expected Result

Metrics for both exporters should be shown.
Example:

tqi_raurora@ni-26829-4p:~$ otelcol-contrib --version
otelcol-contrib version 0.92.0
tqi_raurora@ni-26829-4p:~$ curl -s 0:8888/metrics | grep ^otelcol_exporter | grep queue
otelcol_exporter_queue_capacity{exporter="otlp/jaeger-1",service_instance_id="721c4da6-9e52-4119-99e5-148e88e1dba6",service_name="otelcol-contrib",service_version="0.92.0"} 1000
otelcol_exporter_queue_capacity{exporter="otlp/jaeger-2",service_instance_id="721c4da6-9e52-4119-99e5-148e88e1dba6",service_name="otelcol-contrib",service_version="0.92.0"} 1000
otelcol_exporter_queue_size{exporter="otlp/jaeger-1",service_instance_id="721c4da6-9e52-4119-99e5-148e88e1dba6",service_name="otelcol-contrib",service_version="0.92.0"} 0
otelcol_exporter_queue_size{exporter="otlp/jaeger-2",service_instance_id="721c4da6-9e52-4119-99e5-148e88e1dba6",service_name="otelcol-contrib",service_version="0.92.0"} 0

Actual Result

Only metrics for one of the exporters are shown:

tqi_raurora@ni-26829-4p:~$ otelcol-contrib --version
otelcol-contrib version 0.96.0
tqi_raurora@ni-26829-4p:~$ curl -s 0:8888/metrics | grep ^otelcol_exporter | grep queue
otelcol_exporter_queue_capacity{exporter="otlp/jaeger-1",service_instance_id="c9c9dd6a-a367-42d1-a7c7-633c47b51a0a",service_name="otelcol-contrib",service_version="0.96.0"} 1000
otelcol_exporter_queue_size{exporter="otlp/jaeger-1",service_instance_id="c9c9dd6a-a367-42d1-a7c7-633c47b51a0a",service_name="otelcol-contrib",service_version="0.96.0"} 0

Collector version

0.96.0

Environment information

No response

OpenTelemetry Collector configuration

receivers:
  otlp/receiver:
    protocols:
      grpc:
        endpoint: 0.0.0.0:4317
      http:
        endpoint: 0.0.0.0:4318


exporters:
  otlp/jaeger-1:
    endpoint: 0.0.0.0:14317
    tls:
      insecure: true
  otlp/jaeger-2:
    endpoint: 0.0.0.0:14317
    tls:
      insecure: true


service:
  pipelines:
    traces:
      receivers: [otlp/receiver]
      processors: []
      exporters: [otlp/jaeger-1, otlp/jaeger-2]
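
As an aside on the metric values: the queue_capacity of 1000 seen in the outputs above matches the exporterhelper's default sending_queue size. A rough sketch of setting it explicitly on one of the exporters (the sending_queue values are the assumed defaults, not part of the original configuration):

exporters:
  otlp/jaeger-1:
    endpoint: 0.0.0.0:14317
    tls:
      insecure: true
    sending_queue:
      enabled: true
      queue_size: 1000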

Log output

No response

Additional context

No response

tqi-raurora added the bug and needs triage labels on Mar 7, 2024
@crobert-1 (Member) commented:

Hello @tqi-raurora, thanks for reporting and for including so much information! I was able to reproduce. This is actually an issue in the core collector repository; could you file it there instead?

@tqi-raurora (Author) commented:

@crobert-1
Sure thing, I filed it in the core repository: open-telemetry/opentelemetry-collector#9745

@crobert-1 (Member) commented:

Thank you!

Closing in favor of the core issue.

crobert-1 closed this as not planned on Mar 12, 2024