
Loki Regular expression query ending up with error #3322

Closed
mnadeem opened this issue Feb 11, 2021 · 5 comments

mnadeem commented Feb 11, 2021

Describe the bug

I am displaying Loki logs in Grafana over the range of the past two days.

And I am getting the following error:
(screenshot of the error in Grafana)

There are hardly a hundred lines matching the above criteria.

I see the following interesting entry in the log file:

duration=40.4314363s status=200 throughput=439MB total_bytes=18GB

Message in Grafana

t=2021-02-11T08:01:43+0000 lvl=eror msg="Data proxy error" logger=data-proxy-log userId=2 orgId=1 uname=mnadeem path=/api/datasources/proxy/10/loki/api/v1/query_range remote_addr=10.204.120.7 referer="https://grafana-app.org.com/explore?orgId=1&left=%5B%22now-2d%22,%22now%22,%22Loki-oc-proj-Dev%22,%7B%22expr%22:%22%7Bjob!%3D%5C%22oc-app-org-app1-*%5C%22%7D%20%7C%3D%20%5C%22177017268%5C%22%22%7D%5D " error="http: proxy error: EOF"
FileLogWriter("/var/log/grafana/grafana.log"): close /var/log/grafana/grafana.log: file already closed

Message in Loki

level=info ts=2021-02-11T08:01:43.43369066Z caller=metrics.go:83 org_id=oc-project traceID=7508cd62f4f5e0ff latency=slow query="{job!=\"oc-app-org-hsc-*\"} |= \"177017268\"" query_type=filter range_type=range length=48h0m1s step=1m0s duration=40.4314363s status=200 throughput=439MB total_bytes=18GB
2021-02-11 08:01:43.434646 I | http: superfluous response.WriteHeader call from github.com/opentracing-contrib/go-stdlib/nethttp.(*statusCodeTracker).WriteHeader (status-code-tracker.go:17)
level=warn ts=2021-02-11T08:01:43.434732836Z caller=logging.go:60 traceID=7508cd62f4f5e0ff msg="GET /loki/api/v1/query_range?direction=BACKWARD&limit=1000&query=%7Bjob!%3D%22oc-project-hsc-*%22%7D%20%7C%3D%20%22177017268%22&start=1612857660000000000&end=1613030461000000000&step=60 40.433112254s, error: write tcp 10.129.75.67:3100->10.130.72.1:38744: i/o timeout ws: false; Accept: application/json, text/plain, */*; Accept-Encoding: gzip, deflate, br; Accept-Language: en-US,en;q=0.9; Dnt: 1; Forwarded: for=10.204.120.7;host=grafana-app.org.com;proto=https;proto-version=; Sec-Fetch-Dest: empty; Sec-Fetch-Mode: cors; Sec-Fetch-Site: same-origin; User-Agent: Grafana/7.4.0; X-Forwarded-For: 10.204.120.7, 10.129.50.1, 10.129.50.1; X-Grafana-Org-Id: 1; X-Scope-Orgid: oc-project; "

Expected behavior
It should display the logs incrementally, instead of trying to download entire GBs of data.

Environment:

  • Infrastructure: grafana/loki:2.1.0


@cyriltovena (Contributor)

There are hardly a hundred lines matching the above criteria.

That doesn't really matter: Loki does not index log content. What matters is your query selector.

Your query is asking for everything except the jobs matching "oc-app.....". Provide a better label matcher to reduce the data Loki has to process to fulfill your request.

Something like {cluster="foo", job="bar"} would be better. We're actually removing the ability to not provide an equality matcher soon. See #3216
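
For example, a hedged sketch (the namespace label below is only illustrative and assumes your streams carry such a label; the job value is taken from the jobs mentioned later in this thread):

{namespace="oc-project", job="oc-org-app-hsc-a1"} |= "177017268"

With at least one equality matcher, Loki only has to open the streams for that job instead of scanning every stream that is not excluded.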

As you can see in the Loki log above, the original query resulted in 18GB of data to be processed.

The error message is because you reached a timeout.

It should display the logs incrementally, instead of trying to download entire GBs of data.

Yeah, that's a fair point; we're looking at streaming results for the roadmap.

Loki Regular expression query ending up with error

|= is not a regular expression match.
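
For illustration (a minimal sketch, not taken from the issue itself): |= filters lines containing an exact substring, while |~ filters lines matching a regular expression:

{job="oc-org-app-hsc-a1"} |= "177017268"        (line contains the literal text 177017268)
{job="oc-org-app-hsc-a1"} |~ "17701726[0-9]"    (line matches the regular expression)

Only |~ and !~ (and the label matchers =~ / !~) interpret their argument as a regex.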

There's no real issue here; your single-node query did a good job at 400MB/s.


mnadeem commented Feb 11, 2021

I have got lots of jobs starting with, let's say, oc-org-app-hsc:

oc-org-app-hsc-a1
oc-org-app-hsc-a2
oc-org-app-hsc-a2

and some other apps:
oc-org-app-pvc-a2
oc-org-app-pvc-a1

My intent with the following expression was to find 176975405 in all jobs starting with oc-org-app-hsc-:

{job!="oc-org-app-hsc-*"}  |= "176975405"

And interestingly, I have just tried it and it is working like a charm.

(screenshot: query results in Grafana)

If the following is not the right approach:

{job!="oc-org-app-hsc-*"}  |= "176975405"

then please suggest the right approach.


mnadeem commented Feb 11, 2021

Note: the following is not working:

{job=~"oc-org-app-hsc-*"}  |= "176975405"


mnadeem commented Feb 11, 2021

The following is working:

{job=~"oc-org-app-hsc-.+"}  |= "176975405"

However, it is a little slower compared to:

{job!="oc-org-app-hsc-*"}  |= "176975405"

@cyriltovena (Contributor)

{job!="oc-org-app-hsc-*"} |= "176975405" is wrong ! you want to use !~
{job=~"oc-org-app-hsc-.+"} |= "176975405" or even {job!~"oc-org-app-hsc-*"} |= "176975405" is bad, you need at least one equality matcher.
