groupbytraceprocessor: TestPeriodicMetrics is unstable #1927

Closed
tigrannajaryan opened this issue Oct 8, 2020 · 4 comments · Fixed by open-telemetry/opentelemetry-collector-contrib#1364
Labels: bug (Something isn't working)

Comments

@tigrannajaryan (Member):

See https://app.circleci.com/pipelines/github/open-telemetry/opentelemetry-collector/3878/workflows/16e17361-807e-43fb-995a-3e5a517bdef2/jobs/42647

2020-10-08T22:10:41.128Z	DEBUG	groupbytraceprocessor/event.go:93	recording current state of the queue	{"num-events": 0}
--- FAIL: TestPeriodicMetrics (0.05s)
    event_test.go:411: 
        	Error Trace:	event_test.go:411
        	            				event_test.go:360
        	Error:      	Not equal: 
        	            	expected: int(1)
        	            	actual  : float64(2)
        	Test:       	TestPeriodicMetrics
    event_test.go:411: 
        	Error Trace:	event_test.go:411
        	            				event_test.go:366
        	Error:      	Not equal: 
        	            	expected: int(0)
        	            	actual  : float64(1)
        	Test:       	TestPeriodicMetrics
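
For context: both failed assertions compare a fixed expected count against a metric value read back at a single instant, and in each case the observed value is one higher than expected, which is consistent with a timing race (the event machine recorded one more event before the check ran) rather than a wrong metric. Below is a minimal sketch, not the actual test code, assuming a stand-in atomic counter in place of the real "num-events" metric, of how a check like this can be made tolerant of scheduling delay by polling with testify's assert.Eventually:

```go
package groupbytraceprocessor_sketch

import (
	"sync/atomic"
	"testing"
	"time"

	"github.com/stretchr/testify/assert"
)

// numEvents stands in for the real "num-events" metric; the actual test reads
// it back from the collector's metrics machinery.
var numEvents int64

// recordEvent simulates the event machine recording an event asynchronously,
// with an artificial delay such as a slow CI runner might introduce.
func recordEvent() {
	go func() {
		time.Sleep(20 * time.Millisecond)
		atomic.AddInt64(&numEvents, 1)
	}()
}

func TestPeriodicMetricsSketch(t *testing.T) {
	recordEvent()

	// Polling until the expected value is observed (or the timeout expires)
	// tolerates scheduling delays, unlike a one-shot assertion at a fixed point.
	assert.Eventually(t, func() bool {
		return atomic.LoadInt64(&numEvents) == 1
	}, 2*time.Second, 10*time.Millisecond, "num-events should eventually reach 1")
}
```

The real fix landed via open-telemetry/opentelemetry-collector-contrib#1364; the sketch only illustrates the class of flakiness being discussed.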
@tigrannajaryan added the bug label on Oct 8, 2020
@tigrannajaryan (Member, Author):

@jpkrohling sorry, one more unstable test, assigning it to you.

@jpkrohling (Member):

Windows again? Are the tests slower on Windows? I'll work on it, but this might be a good hint that running perf tests on Windows would yield interesting results :-)

@tigrannajaryan (Member, Author):

Yes, Windows. And yes, it could be caused by a perf difference.

@jpkrohling (Member):

@tigrannajaryan could you please move this to -contrib? I'm working on it right now.

hughesjj pushed a commit to hughesjj/opentelemetry-collector that referenced this issue Apr 27, 2023
…y#1927)

Bumps [boto3](https://github.com/boto/boto3) from 1.24.63 to 1.24.64.
- [Release notes](https://github.com/boto/boto3/releases)
- [Changelog](https://github.com/boto/boto3/blob/develop/CHANGELOG.rst)
- [Commits](boto/boto3@1.24.63...1.24.64)

---
updated-dependencies:
- dependency-name: boto3
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Troels51 pushed a commit to Troels51/opentelemetry-collector that referenced this issue Jul 5, 2024
* Upgrade semantic convention to spec 1.17.0

* Add CHANGELOG.