
[project-monitoring] Summary dashboard has no metrics for Disk I/O #59

Closed
eumel8 opened this issue Feb 23, 2023 · 4 comments
Comments


eumel8 commented Feb 23, 2023

Describe the bug
On all Project Monitoring installations the graphs for Disk I/O are empty. The underlying Rancher/Pod dashboard also has no metrics. Prometheus has some metrics for container_fs_writes_bytes_total and container_fs_reads_bytes_total, but it seems there were some changes, linked with this PR, also mentioned in this OpenShift issue

To Reproduce
I couldn't fully reproduce the issue or determine exactly what causes the empty metrics. Maybe it needs a device selector, or optionally a newer version from upstream (prometheus-federator 0.2.1)
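For illustration, a device selector of the kind hinted at above might look like the following PromQL sketch. This is an assumption, not the dashboard's actual query: the metric names are the standard cAdvisor ones mentioned in this issue, but the label names, regex, and grouping are hypothetical.

```promql
# Hypothetical sketch: per-pod disk write throughput, restricted to real
# block devices via a device-label selector. The device regex and the
# "by (pod)" grouping are illustrative assumptions, not the chart's query.
sum(
  rate(container_fs_writes_bytes_total{device=~"/dev/.+"}[5m])
) by (pod)
```

If the metrics carry an empty or virtual `device` label (as the linked upstream changes suggest can happen), a selector like this would match nothing and the panel would render empty.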

Result
Empty Disk I/O graphs

Expected Result
Metrics for disk I/O on all Kubernetes and Rancher dashboards

SURE-6055

@MKlimuszka
Contributor

We can check whether there is an upstream issue and link it, but beyond that it is out of our control.

@bashofmann
Contributor

I think this may be fixed with rancher-monitoring 102.0.0+up40.1.2. I tested it with RKE1 v1.24.10.

@eumel8 Can you check if this is fixed with new monitoring/prometheus-federator version for you as well?

@aiyengar2

Probably a dup of rancher/rancher#38934, which was ported into the latest rancher-monitoring chart (which was brought into prometheus-federator as well).

@eumel8 feel free to open up a new issue if you are still seeing this; closing this out since it is probably fixed.

@eumel8
Author

eumel8 commented Apr 13, 2023

Seems fixed with the updated 102.0.0+up40.1.2 as of today on our clusters. Thx, @aiyengar2
