
[loki-canary] Stream label configuration #1435

Closed
theo-lubert opened this issue Dec 17, 2019 · 4 comments · Fixed by #2259
Labels
good first issue · help wanted · keepalive

Comments

@theo-lubert

Is your feature request related to a problem? Please describe.

We are currently trying to get loki-canary to work with the Docker log driver in a Swarm environment, but it seems they are simply not compatible: loki-canary cannot query for the correct labels (there are already "labelname" and "labelvalue" options, but none for "stream").

Loki-canary expects a "stream" label (stdout/stderr), while the Docker log driver publishes this under the "source" label. Looking at the source code, both appear to be hardcoded.

Example:

loki-canary:
  query={stream="stdout",swarm_service="monitoring_loki-canary"} => fails (empty result)

manual query:
  query={source="stdout",swarm_service="monitoring_loki-canary"} => succeeds

Source:

loki-canary (reader.go #L106)

Docker log driver (loki.go #L60)
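
For illustration only, here is a rough sketch (not the actual reader.go or loki.go code) of the mismatch: each side hardcodes a different label name for the output stream, so the canary's selector can never match entries shipped by the Docker log driver.

```go
package main

import "fmt"

// Sketch of the Docker log driver side: it publishes the container's output
// stream under the label "source" (loki.go).
func dockerDriverLabels(source string) map[string]string {
	return map[string]string{"source": source} // "stdout" or "stderr"
}

// Sketch of the loki-canary side: the reader queries with "stream" hardcoded
// (reader.go), combined with the configurable labelname/labelvalue pair.
func canarySelector(labelName, labelValue string) string {
	return fmt.Sprintf(`{stream="stdout",%s=%q}`, labelName, labelValue)
}

func main() {
	fmt.Println(dockerDriverLabels("stdout"))
	// map[source:stdout]
	fmt.Println(canarySelector("swarm_service", "monitoring_loki-canary"))
	// {stream="stdout",swarm_service="monitoring_loki-canary"}; this never
	// matches the driver's entries, which carry "source" instead of "stream".
}
```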

Describe the solution you'd like

Both projects should allow customization, starting with loki-canary to address this very case.
But I think the Docker log driver should too, in order to play nicely with other tooling that relies on "stream" (the canary, generic Grafana dashboards, etc.).
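
A minimal sketch of what that customization could look like on the loki-canary side, mirroring the existing labelname/labelvalue flags. The streamname/streamvalue flag names and defaults below are assumptions for illustration, not the canary's actual CLI:

```go
package main

import (
	"flag"
	"fmt"
)

func main() {
	// Existing options (defaults here are illustrative).
	labelName := flag.String("labelname", "name", "label name to query for canary entries")
	labelValue := flag.String("labelvalue", "loki-canary", "label value to query for canary entries")
	// Hypothetical new options: let users point the canary at "source"
	// (Docker log driver) instead of the hardcoded "stream".
	streamName := flag.String("streamname", "stream", "label name of the output stream")
	streamValue := flag.String("streamvalue", "stdout", "label value of the output stream")
	flag.Parse()

	// The reader would build its selector from the configured labels
	// instead of hardcoding {stream="stdout", ...}.
	selector := fmt.Sprintf("{%s=%q,%s=%q}", *streamName, *streamValue, *labelName, *labelValue)
	fmt.Println(selector)
}
```

With something like -streamname=source, the canary's query would then match what the Docker log driver actually ships.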

Describe alternatives you've considered

There is no known workaround.
We could of course use promtail, but that would defeat the purpose of loki-canary, since the rest of the swarm ships logs through a different medium (the Docker log driver).

Additional context
None.

@stale

stale bot commented Jan 16, 2020

This issue has been automatically marked as stale because it has not had any activity in the past 30 days. It will be closed in 7 days if no further activity occurs. Thank you for your contributions.

stale bot added the stale label Jan 16, 2020
@theo-lubert
Author

This looks like a potential easy win; is there no one who could take a look at it?
I have no experience in Go.

stale bot removed the stale label Jan 21, 2020
@stale

stale bot commented Feb 20, 2020

This issue has been automatically marked as stale because it has not had any activity in the past 30 days. It will be closed in 7 days if no further activity occurs. Thank you for your contributions.

stale bot added the stale label Feb 20, 2020
stale bot closed this as completed Feb 27, 2020
@rndmh3ro

I'm experiencing the same problem. loki-canary in combination with the docker-driver is sadly unusable because of this bug.

Can we reopen this?

slim-bean reopened this May 7, 2020
stale bot removed the stale label May 7, 2020
slim-bean added the keepalive, help wanted, and good first issue labels May 7, 2020
patrickjahns added a commit to patrickjahns/ansible-role-loki-canary that referenced this issue May 10, 2020
Currently the canary requires a hardcoded "stream" label with the value "stdout" (grafana/loki#1435).

If the canary's messages were forwarded via systemd's journal, this label would be missing, and/or a very specific label set would be required to get the canary to work as expected.

This commit introduces a workaround: the output from the canary is redirected to a file, and promtail's file_sd is used to discover and scrape that file.
Additionally, logrotate is used to prevent the disk from running out of space.