influxdbexporter creates metric exports that are not accepted by influx #29896
Comments
Pinging code owners: See Adding Labels via Comments if you do not have permissions to add labels yourself.
Encountering the same bug on v0.91.0 and InfluxDB 1.8, with data from the …
As an update: the regression seems to occur in the …
I also encounter the same bug on …
Thanks @padraic-padraic. I have tested it on …
I was able to look at some InfluxDB documentation; it looks like …
The solution on the collector side of things would be to update the write logic in the writer to account for the added …
I'm not familiar with InfluxDB, but is there a requirement to continue using v1? Looking from the outside, there haven't been any …
Related issues: influxdata/influxdb#17781, influxdata/influxdb-client-go#143
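To make the incompatibility concrete: InfluxDB v2 line protocol allows unsigned-integer fields via a `u` suffix (e.g. `flags=0u`), while v1 only accepts signed integers (`i` suffix), floats, strings, and booleans. The sketch below illustrates the kind of write-logic change discussed above, down-converting unsigned values when targeting v1. All names here are illustrative, not the exporter's actual API.

```go
package main

import (
	"fmt"
	"strings"
)

// encodeField renders one field for InfluxDB line protocol.
// v1Compat controls whether unsigned integers are written with the
// "u" suffix (accepted by v2 only) or down-converted to signed "i"
// fields, which InfluxDB 1.x accepts.
func encodeField(key string, value interface{}, v1Compat bool) string {
	switch v := value.(type) {
	case uint64:
		if v1Compat {
			// v1 has no unsigned type; write as a signed integer.
			return fmt.Sprintf("%s=%di", key, int64(v))
		}
		return fmt.Sprintf("%s=%du", key, v)
	case int64:
		return fmt.Sprintf("%s=%di", key, v)
	case float64:
		return fmt.Sprintf("%s=%g", key, v)
	default:
		return fmt.Sprintf("%s=%q", key, fmt.Sprint(v))
	}
}

func main() {
	fields := []string{
		encodeField("count", uint64(3), true),
		encodeField("sum", 1.5, true),
	}
	fmt.Println("prometheus " + strings.Join(fields, ","))
}
```

With `v1Compat` enabled, a point that would have been written as `count=3u` (rejected by v1.8) is written as `count=3i` instead. Note that a real implementation would need to decide whether silently truncating values above `math.MaxInt64` is acceptable.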
This issue has been inactive for 60 days. It will be closed in 60 days if there is no activity. To ping code owners by adding a component label, see Adding Labels via Comments, or if you are unsure of which component this issue relates to, please ping the code owners. See Adding Labels via Comments if you do not have permissions to add labels yourself.
Hi. Is there any update on this issue, or a potential workaround? Is v1 compatibility now totally off the table? Trying to insert some StatsD metrics into InfluxDB v1.8 & seeing a similar problem.
Hello there, …
Also checking in to see if there's any progress on this issue. Despite being an 'old' version, the InfluxDB v1 API is still relevant, especially as InfluxDB themselves are advising use of v1.x in preparation for the transition to the v3 open source release.
Helps open-telemetry/opentelemetry-collector-contrib#29896 The flags field is set as an unsigned integer, which is not compatible with InfluxDB 1.x. Luckily, the field is also not useful, as it stores a value chosen from an enum containing no useful values. In this change, the flags field is removed from conversion. If it needs to be added in the future, we can expect that enough time will have passed that retention policies will have removed the prior `flags` field.
I can see that using unsigned integers in the OpenTelemetry conversion is not a good idea. Fortunately, within the Metrics signal, only the otel …
This update preemptively fixes open-telemetry/opentelemetry-collector-contrib#29896 via influxdata/influxdb-observability#305
Fixes open-telemetry#29896 This change removes the Metric signal `flags` field to/from InfluxDB conversion. The InfluxDB field is type `uint` which is not supported in InfluxDB v1 AND the Metric `flags` field is not particularly useful (yet) in OpenTelemetry.
**Description:** This change removes the Metric signal `flags` field to/from InfluxDB conversion. The InfluxDB field is type `uint`, which is not supported in InfluxDB v1, AND the Metric `flags` field is not particularly useful (yet) in OpenTelemetry.
**Link to tracking Issue:** Fixes #29896
**Testing:** Material code changes and tests in influxdata/influxdb-observability#305
**Documentation:** Small changes to docs in influxdata/influxdb-observability#305
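The shape of the fix described in the PR above can be sketched roughly as follows: during conversion, the `flags` field is simply no longer emitted, and remaining unsigned values are written as signed integers so they are valid on v1 as well. The type and function names here are illustrative; the real conversion lives in influxdata/influxdb-observability.

```go
package main

import "fmt"

// dataPoint loosely mimics the values a metric data point contributes
// to a line-protocol point. Names are illustrative only.
type dataPoint struct {
	Value uint64
	Flags uint32
}

// toFields converts a data point to line-protocol fields. The flags
// field is intentionally omitted: it would otherwise be written as an
// unsigned integer, which InfluxDB v1 rejects, and its enum currently
// carries no useful values.
func toFields(dp dataPoint) map[string]interface{} {
	return map[string]interface{}{
		// Down-convert to int64 so the field is valid on v1 as well.
		"value": int64(dp.Value),
		// "flags" is deliberately not emitted.
	}
}

func main() {
	fmt.Println(toFields(dataPoint{Value: 7, Flags: 1}))
}
```

As the commit message notes, if `flags` ever needs to come back, enough time will likely have passed for retention policies to have expired the old unsigned field, avoiding a field-type conflict.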
Component(s)
exporter/influxdb
What happened?
Description
After #27084 was fixed, I ran into the next issue with the v1 InfluxDB exporter. Every metric exported via the javaagent Micrometer instrumentation fails to be exported by the collector's InfluxDB exporter with a similar message:
Steps to Reproduce
Similar to #27084, but with an SB3.2 app producing the metrics via the OTel javaagent.
Expected Result
In general, I would love it if this just worked. When I put the line from above into a line-protocol parser it fails; it might be related to the escaping. However, I'm somehow not able to properly filter which tags get added. More tags increase the cardinality in InfluxDB, which is generally a bad thing. I can't realistically filter each tag by name, only to accumulate more tags over time. So is there a way to send no tags apart from those that are explicitly whitelisted? I was expecting something similar to span_dimensions and log_record_dimensions to make this possible.
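For reference, the tag allow-listing that the comment above refers to exists for traces and logs via the exporter's `span_dimensions` and `log_record_dimensions` options; the sketch below shows roughly what the requested metrics equivalent would look like. The `metric_dimensions` key is hypothetical, illustrating the feature being asked for, and the endpoint value is a placeholder.

```yaml
exporters:
  influxdb:
    endpoint: http://influxdb:8086  # placeholder
    # Existing allow-lists: only these attributes become InfluxDB tags.
    span_dimensions:
      - service.name
      - host.name
    log_record_dimensions:
      - service.name
    # Hypothetical: no such option exists today for metrics, so every
    # attribute is emitted as a tag, driving up series cardinality.
    # metric_dimensions:
    #   - service.name
```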
Collector version
0.91.0
Environment information
Environment
OS: otel/opentelemetry-collector-contrib:0.91.0
OpenTelemetry Collector configuration
No response
Log output
No response
Additional context
No response