Data race detected #1918

Closed
mozillazg opened this issue Mar 19, 2022 · 5 comments
Labels
bug (Something isn't working), wontfix (This will not be worked on)

Comments

@mozillazg
Contributor

What steps did you take and what happened:
Found a data race when running the e2e tests with the race detector enabled.

Enable the race detector:

diff --git a/Dockerfile b/Dockerfile
index d2ce8a1f..8ad0c6f2 100644
--- a/Dockerfile
+++ b/Dockerfile
@@ -27,9 +27,10 @@ COPY main.go main.go
 COPY apis/ apis/
 COPY go.mod .

-RUN go build -mod vendor -a -ldflags "${LDFLAGS:--X github.com/open-policy-agent/gatekeeper/pkg/version.Version=latest}" -o manager main.go
+ENV CGO_ENABLED=1
+RUN go build -mod vendor -race -a -ldflags "${LDFLAGS:--X github.com/open-policy-agent/gatekeeper/pkg/version.Version=latest}" -o manager main.go

-FROM $BASEIMAGE
+FROM $BUILDERIMAGE

 WORKDIR /

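(Note on the two extra changes in the diff: Go's -race instrumentation requires cgo, hence the added CGO_ENABLED=1, and the instrumented binary needs a libc at runtime, which is presumably why the final stage switches from $BASEIMAGE to the toolchain-bearing $BUILDERIMAGE.)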
Run the e2e tests:

$ make e2e-build-load-image IMG=gatekeeper-e2e:latest CRD_IMG=gatekeeper-crds:latest
$ make deploy IMG=gatekeeper-e2e:latest USE_LOCAL_IMG=true
$ make test-e2e

Check the logs:

$ for i in $(kubectl -n gatekeeper-system get pod -o name); do \
   kubectl -n gatekeeper-system logs $i | grep RACE && echo $i; done
WARNING: DATA RACE
pod/gatekeeper-controller-manager-6c4dff4955-d52hm

$ kubectl -n gatekeeper-system logs pod/gatekeeper-controller-manager-6c4dff4955-d52hm | \
   grep RACE -A 515 -B 10 > race.log

$ kubectl -n gatekeeper-system logs pod/gatekeeper-controller-manager-6c4dff4955-d52hm > gatekeeper-controller-manager-6c4dff4955-d52hm.log

race.log
gatekeeper-controller-manager-6c4dff4955-d52hm.log

What did you expect to happen:

No data race

Anything else you would like to add:

Environment:

  • Gatekeeper version: master branch (6b10786)
  • Kubernetes version (kubectl version): v1.23.4
@mozillazg added the bug (Something isn't working) label on Mar 19, 2022
@maxsmythe
Contributor

Thanks for this!

@srenatus It looks like these warnings are coming from the webhook calling OPA:

==================
WARNING: DATA RACE
Read at 0x00c000bd8408 by goroutine 304:
  github.com/open-policy-agent/opa/ast.(*Array).Hash()
      /go/src/github.com/open-policy-agent/gatekeeper/vendor/github.com/open-policy-agent/opa/ast/term.go:1182 +0x3b
  github.com/open-policy-agent/opa/ast.(*Term).Hash()
      /go/src/github.com/open-policy-agent/gatekeeper/vendor/github.com/open-policy-agent/opa/ast/term.go:401 +0x62
  github.com/open-policy-agent/opa/topdown.newVirtualCacheHashMap.func2()
      /go/src/github.com/open-policy-agent/gatekeeper/vendor/github.com/open-policy-agent/opa/topdown/cache.go:70 +0x1d
  github.com/open-policy-agent/opa/util.(*HashMap).Get()
      /go/src/github.com/open-policy-agent/gatekeeper/vendor/github.com/open-policy-agent/opa/util/hashmap.go:66 +0x61
  github.com/open-policy-agent/opa/topdown.(*virtualCache).Put()
      /go/src/github.com/open-policy-agent/gatekeeper/vendor/github.com/open-policy-agent/opa/topdown/cache.go:50 +0x137
  github.com/open-policy-agent/opa/topdown.evalFunc.evalOneRule.func1.1()
      /go/src/github.com/open-policy-agent/gatekeeper/vendor/github.com/open-policy-agent/opa/topdown/eval.go:1762 +0x38e
  github.com/open-policy-agent/opa/topdown.(*eval).evalExpr()
      /go/src/github.com/open-policy-agent/gatekeeper/vendor/github.com/open-policy-agent/opa/topdown/eval.go:305 +0x119
  github.com/open-policy-agent/opa/topdown.(*eval).next()
      /go/src/github.com/open-policy-agent/gatekeeper/vendor/github.com/open-policy-agent/opa/topdown/eval.go:162 +0x84
  github.com/open-policy-agent/opa/topdown.(*eval).evalExpr.func1()
      /go/src/github.com/open-policy-agent/gatekeeper/vendor/github.com/open-policy-agent/opa/topdown/eval.go:331 +0x85
==================
Is OPA affected by them?
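(For context, the trace points at a lazily memoized hash being read and written concurrently. Below is a minimal sketch of that pattern, illustrative only and not OPA's actual code, with all names invented, which go run -race flags in the same way:)

package main

import "sync"

// Illustrative sketch only -- not OPA's actual code; all names are
// invented. It reproduces the pattern the detector flags: a lazily
// memoized hash read and written by multiple goroutines without
// synchronization. Even though every writer stores the same value,
// the unsynchronized read/write pair is still a data race under the
// Go memory model.
type lazyHashed struct {
	elems []int
	hash  int // 0 means "not computed yet"; written without a lock
}

func (l *lazyHashed) Hash() int {
	if l.hash == 0 { // racy read
		h := 1
		for _, e := range l.elems {
			h = h*31 + e
		}
		l.hash = h // racy write, even though the value is deterministic
	}
	return l.hash
}

func main() {
	a := &lazyHashed{elems: []int{1, 2, 3}}
	var wg sync.WaitGroup
	for i := 0; i < 2; i++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			_ = a.Hash() // go run -race . reports a race on a.hash
		}()
	}
	wg.Wait()
}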

@srenatus
Contributor

@maxsmythe That looks like an issue fixed in 0.38.0. Both calls to array.Hash() will yield the same hash, but it's a race nonetheless. I'd first try updating OPA to resolve this 🤞
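(For anyone applying that suggestion: in a vendored setup like Gatekeeper's, the bump would normally be the following, a sketch assuming the standard Go modules workflow; check OPA's release notes for the exact version carrying the fix:)

$ go get github.com/open-policy-agent/opa@v0.38.0
$ go mod tidy
$ go mod vendor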

@maxsmythe
Contributor

Thanks!

@stale

stale bot commented Jul 23, 2022

This issue has been automatically marked as stale because it has not had recent activity. It will be closed in 14 days if no further activity occurs. Thank you for your contributions.

stale bot added the wontfix (This will not be worked on) label on Jul 23, 2022
@sozercan
Member

fixed in #1944
