
groupbytraceprocessor: TestErrorOnProcessResourceSpansContinuesProcessing is unstable #1923

Closed · tigrannajaryan opened this issue on Oct 8, 2020 · 3 comments · Fixed by open-telemetry/opentelemetry-collector-contrib#1363

@tigrannajaryan (Member) commented:

See https://app.circleci.com/pipelines/github/open-telemetry/opentelemetry-collector/3868/workflows/6cbedc3f-98dd-439d-ae9f-987aa9e3e8e2/jobs/42433

2020-10-08T16:32:15.916Z	WARN	groupbytraceprocessor/processor.go:133	failed to process batch	{"error": "couldn't add spans to new trace: some unexpected error", "traceID": "01020304"}
go.opentelemetry.io/collector/processor/groupbytraceprocessor.(*groupByTraceProcessor).processResourceSpans
	C:/Users/circleci/project/processor/groupbytraceprocessor/processor.go:133
go.opentelemetry.io/collector/processor/groupbytraceprocessor.TestErrorOnProcessResourceSpansContinuesProcessing
	C:/Users/circleci/project/processor/groupbytraceprocessor/processor_test.go:708
testing.tRunner
	c:/go/src/testing/testing.go:1108
FAIL
FAIL	go.opentelemetry.io/collector/processor/groupbytraceprocessor	0.728s
tigrannajaryan added the bug label on Oct 8, 2020
@tigrannajaryan (Member, Author) commented:

@jpkrohling can you please have a look?

tigrannajaryan pushed a commit to tigrannajaryan/opentelemetry-collector that referenced this issue on Oct 8, 2020:
This test is very unstable on Windows.

An issue for the proper fix has been filed:
open-telemetry#1923
pjanotti pushed a commit that referenced this issue on Oct 8, 2020:
This test is very unstable on Windows.

An issue for the proper fix has been filed:
#1923
@jpkrohling (Member) commented:

This one should also be moved to -contrib, please.

@jpkrohling (Member) commented:

Actually, this should be closed, as it's a duplicate of #1927. The error in the test log isn't unexpected; it's part of the test:

2020-10-08T16:32:15.916Z	WARN	groupbytraceprocessor/processor.go:133	failed to process batch	{"error": "couldn't add spans to new trace: some unexpected error", "traceID": "01020304"}
go.opentelemetry.io/collector/processor/groupbytraceprocessor.(*groupByTraceProcessor).processResourceSpans
	C:/Users/circleci/project/processor/groupbytraceprocessor/processor.go:133
go.opentelemetry.io/collector/processor/groupbytraceprocessor.TestErrorOnProcessResourceSpansContinuesProcessing
	C:/Users/circleci/project/processor/groupbytraceprocessor/processor_test.go:708
testing.tRunner
	c:/go/src/testing/testing.go:1108
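
That WARN is emitted on purpose: the test injects a failing call and then checks that the remaining batches are still processed. A minimal sketch of that pattern, with hypothetical names (the real test in processor_test.go uses the processor's own mocks):

package groupbytraceprocessor_sketch

import (
	"errors"
	"testing"
)

// processAll mimics the behavior under test: a batch that fails to process is
// counted (the processor logs a WARN) but does not stop the remaining batches.
func processAll(batches []int, process func(int) error) (processed, failed int) {
	for _, b := range batches {
		if err := process(b); err != nil {
			failed++
			continue
		}
		processed++
	}
	return processed, failed
}

func TestErrorOnFirstBatchContinuesProcessing(t *testing.T) {
	injected := errors.New("some unexpected error") // the message seen in the WARN above

	process := func(batch int) error {
		if batch == 0 {
			return injected // fail only the first batch
		}
		return nil
	}

	processed, failed := processAll([]int{0, 1, 2}, process)
	if failed != 1 || processed != 2 {
		t.Fatalf("expected 1 failed and 2 processed batches, got %d and %d", failed, processed)
	}
}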

If you scroll up the test logs from the link in the issue description, you'll see this:

--- FAIL: TestPeriodicMetrics (0.04s)
    event_test.go:411: 
        	Error Trace:	event_test.go:411
        	            				event_test.go:366
        	Error:      	Not equal: 
        	            	expected: int(0)
        	            	actual  : float64(1)
        	Test:       	TestPeriodicMetrics

So, the actual action to take on this issue is to re-enable the test :-)
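
For reference, the temporary disabling mentioned in the commits above is typically a skip guard at the top of the test, and re-enabling means deleting it. This is a sketch only; the exact wording and placement in processor_test.go may differ:

package groupbytraceprocessor_sketch

import (
	"runtime"
	"testing"
)

func TestErrorOnProcessResourceSpansContinuesProcessing(t *testing.T) {
	// Hypothetical guard of the kind added while this issue was open;
	// re-enabling the test means removing these lines.
	if runtime.GOOS == "windows" {
		t.Skip("flaky on Windows; see open-telemetry/opentelemetry-collector#1923")
	}

	// ... actual test body lives in processor_test.go ...
}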
