EMF exporter performance: send EMF logs in batches #2572

Merged 1 commit into open-telemetry:main on Mar 11, 2021

Conversation

@bjrara (Contributor) commented Mar 7, 2021

Why do we need it?
This PR is a follow-up to #2571 and fixes the performance issue described in aws-observability/aws-otel-collector#388.

Instead of pushing every log entry immediately, the exporter now pushes logs in batches: a logEventBatch is sent when 1) it is full or has expired, or 2) all of the received metrics have been processed (sketched below).
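
For illustration, here is a minimal, self-contained Go sketch of that batching behavior. The type and method names (logEvent, logEventBatch, pusher, AddLogEntry, and the size/age limits) are assumptions for the sketch, not the exporter's actual API.

```go
package main

import (
	"fmt"
	"time"
)

// logEvent and logEventBatch are illustrative stand-ins for the EMF
// exporter's internal types; the real implementation differs.
type logEvent struct {
	message   string
	timestamp time.Time
}

type logEventBatch struct {
	events    []logEvent
	createdAt time.Time
	maxEvents int
	maxAge    time.Duration
}

func (b *logEventBatch) append(e logEvent) { b.events = append(b.events, e) }

func (b *logEventBatch) full() bool { return len(b.events) >= b.maxEvents }

func (b *logEventBatch) expired(now time.Time) bool {
	return now.Sub(b.createdAt) >= b.maxAge
}

// drain returns the buffered events and resets the batch.
func (b *logEventBatch) drain() []logEvent {
	out := b.events
	b.events = nil
	b.createdAt = time.Now()
	return out
}

type pusher struct {
	batch logEventBatch
}

// AddLogEntry buffers an event and only sends the batch when it is full
// or has expired, instead of sending on every event.
func (p *pusher) AddLogEntry(e logEvent) {
	p.batch.append(e)
	if p.batch.full() || p.batch.expired(time.Now()) {
		p.send(p.batch.drain())
	}
}

// ForceFlush sends whatever is still buffered; the exporter would call it
// once all metrics received in a ConsumeMetrics call have been processed.
func (p *pusher) ForceFlush() {
	if len(p.batch.events) > 0 {
		p.send(p.batch.drain())
	}
}

func (p *pusher) send(events []logEvent) {
	// In the real exporter this would be a call to CloudWatch Logs.
	fmt.Printf("sending %d log events\n", len(events))
}

func main() {
	p := &pusher{batch: logEventBatch{createdAt: time.Now(), maxEvents: 3, maxAge: time.Minute}}
	for i := 0; i < 5; i++ {
		p.AddLogEntry(logEvent{message: fmt.Sprintf("event %d", i), timestamp: time.Now()})
	}
	p.ForceFlush() // flushes the remaining 2 buffered events
}
```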

@codecov (bot) commented Mar 7, 2021

Codecov Report

Merging #2572 (86f843d) into main (6566113) will decrease coverage by 0.02%.
The diff coverage is 100.00%.

Impacted file tree graph

@@            Coverage Diff             @@
##             main    #2572      +/-   ##
==========================================
- Coverage   91.37%   91.34%   -0.03%     
==========================================
  Files         431      431              
  Lines       21461    21473      +12     
==========================================
+ Hits        19609    19615       +6     
- Misses       1387     1391       +4     
- Partials      465      467       +2     
Flag          Coverage Δ
integration   69.24% <ø> (ø)
unit          90.25% <100.00%> (-0.03%) ⬇️

Flags with carried forward coverage won't be shown.

Impacted Files                                           Coverage Δ
exporter/awsemfexporter/emf_exporter.go                  100.00% <100.00%> (ø)
internal/stanza/receiver.go                              93.93% <0.00%> (-6.07%) ⬇️
...rometheusexecreceiver/subprocessmanager/manager.go    70.00% <0.00%> (-6.00%) ⬇️
processor/groupbytraceprocessor/event.go                 95.96% <0.00%> (-0.81%) ⬇️

Continue to review full report at Codecov.

Legend:
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 6566113...86f843d.

@bjrara bjrara force-pushed the emf_batch branch 2 times, most recently from d8c575c to 1e5934d Compare March 8, 2021 18:25
@tigrannajaryan tigrannajaryan assigned anuraaga and unassigned owais Mar 8, 2021
Comment on lines +116 to +117
// Copy each attribute's string value into the labels map used for logging.
if am.Len() > 0 {
	am.ForEach(func(k string, v pdata.AttributeValue) {
		labels[k] = v.StringVal()
	})
}
@mxiamxia (Member) commented Mar 9, 2021

q: why can't we directly take the labels from groupedMetrics.Labels?

@bjrara (Contributor, Author) replied Mar 9, 2021

The labels only serve the purpose of logging, so they should be simple and timely. I separated translateOTelToGroupedMetric from label extraction in the latest commit. Another concern is that groupedMetrics.Labels contains extra labels from datapoints, which would blow up the info log.
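
As a hypothetical illustration of the kind of logging this refers to: only the resource-level attributes end up in labels, so the info line stays small. The zap-based snippet below is an assumption for the sketch (the Collector does use zap for its loggers), not the PR's actual code; the message string and label keys are made up.

```go
package main

import "go.uber.org/zap"

func main() {
	logger, _ := zap.NewProduction()
	defer logger.Sync()

	// Resource-level labels only; datapoint labels are deliberately left out
	// so they do not blow up the size of the info log.
	labels := map[string]string{
		"service.name": "my-service",
		"cloud.region": "us-west-2",
	}
	logger.Info("Start processing resource metrics", zap.Any("labels", labels))
}
```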

Comment on lines +147 to +151
for _, pusher := range emf.listPushers() {
	returnError := pusher.ForceFlush()
	if returnError != nil {
		// error handling elided in the quoted diff
	}
}
A Member commented:

Thanks for catching the performance issue here.
Could you combine this code block with the code in func (emf *emfExporter) Shutdown()?

@bjrara (Contributor, Author) replied:

Done
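
For readers, a minimal sketch of what sharing that flush logic between the metrics path and Shutdown might look like. The pusher interface, flushAll helper, and stub types below are assumptions that only mirror the names in the quoted diff, not the PR's actual refactor.

```go
package main

import (
	"context"
	"errors"
	"fmt"
)

// Stub types standing in for the exporter's real pusher and emfExporter;
// only the flush-related pieces from the quoted diff are modeled here.
type pusher interface {
	ForceFlush() error
}

type emfExporter struct {
	pushers []pusher
}

func (emf *emfExporter) listPushers() []pusher { return emf.pushers }

// flushAll is the shared helper the reviewer asks for: one loop that
// force-flushes every pusher and reports the first error it sees. Both the
// metrics path and Shutdown can call it instead of duplicating the loop.
func (emf *emfExporter) flushAll() error {
	var firstErr error
	for _, p := range emf.listPushers() {
		if err := p.ForceFlush(); err != nil && firstErr == nil {
			firstErr = err
		}
	}
	return firstErr
}

func (emf *emfExporter) Shutdown(ctx context.Context) error {
	return emf.flushAll()
}

type fakePusher struct{ name string }

func (f fakePusher) ForceFlush() error {
	fmt.Println("flushing", f.name)
	return errors.New("example flush error from " + f.name)
}

func main() {
	emf := &emfExporter{pushers: []pusher{fakePusher{"a"}, fakePusher{"b"}}}
	fmt.Println("shutdown error:", emf.Shutdown(context.Background()))
}
```

Collecting the loop into one helper also means any future change to the flush error handling happens in a single place.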

@mxiamxia (Member) left a comment:

Thanks!

@bjrara (Contributor, Author) commented Mar 9, 2021

Looks like the tests failed due to other components, and there's no way to trigger /retest.

@bjrara bjrara force-pushed the emf_batch branch 4 times, most recently from b8d60d2 to c775ce3 Compare March 10, 2021 07:36
@bogdandrutu bogdandrutu merged commit baaf5f7 into open-telemetry:main Mar 11, 2021
kisieland referenced this pull request in kisieland/opentelemetry-collector-contrib Mar 16, 2021
@bjrara bjrara deleted the emf_batch branch March 18, 2021 23:25
pmatyjasek-sumo referenced this pull request in pmatyjasek-sumo/opentelemetry-collector-contrib Apr 28, 2021
ljmsc referenced this pull request in ljmsc/opentelemetry-collector-contrib Feb 21, 2022