
Allow to tune the read/write buffers for gRPC server #1218

Merged
tigrannajaryan merged 1 commit into open-telemetry:master from bogdandrutu:serverbuf on Jun 29, 2020

Conversation

bogdandrutu (Member) commented:

No description provided.

Signed-off-by: Bogdan Drutu <bogdandrutu@gmail.com>

codecov bot commented Jun 27, 2020

Codecov Report

Merging #1218 into master will increase coverage by 0.00%.
The diff coverage is 100.00%.


@@           Coverage Diff           @@
##           master    #1218   +/-   ##
=======================================
  Coverage   88.07%   88.07%           
=======================================
  Files         203      203           
  Lines       14658    14664    +6     
=======================================
+ Hits        12910    12916    +6     
  Misses       1310     1310           
  Partials      438      438           
Impacted Files                               Coverage Δ
config/configgrpc/configgrpc.go              100.00% <100.00%> (+3.38%) ⬆️
receiver/opencensusreceiver/factory.go       95.00% <100.00%> (+0.12%) ⬆️
receiver/otlpreceiver/factory.go             95.00% <100.00%> (+0.12%) ⬆️
translator/internaldata/resource_to_oc.go    83.72% <0.00%> (-2.33%) ⬇️

Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Last update 12b3d9c...fb2342e.

@tigrannajaryan (Member) left a comment:

I am curious: do you notice any performance difference when you increase these buffers?


// ReadBufferSize for the gRPC server. See grpc.ReadBufferSize
// (https://godoc.org/google.golang.org/grpc#ReadBufferSize).
ReadBufferSize int `mapstructure:"read_buffer_size"`
@tigrannajaryan (Member) commented:

Does this make any difference? In my similar experiments with WebSockets, read buffers did not make any difference since we are reading in big chunks. But perhaps gRPC is different; it will be interesting to know whether increasing this buffer size makes a difference.

@bogdandrutu (Member, Author) replied:

I saw some difference in gRPC on my Mac; not as much as on the client side, but still something.
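For context, these settings map directly onto options exposed by the grpc-go package. Below is a minimal sketch, not the collector's actual wiring, of how the configured sizes could be turned into grpc.ServerOption values; toServerOptions is a hypothetical helper name:

package main

import "google.golang.org/grpc"

// toServerOptions is a hypothetical helper: it translates configured buffer
// sizes into grpc.ServerOption values, leaving gRPC's defaults in place
// when a size is zero.
func toServerOptions(readBufferSize, writeBufferSize int) []grpc.ServerOption {
	var opts []grpc.ServerOption
	if readBufferSize > 0 {
		// grpc.ReadBufferSize bounds how much data can be read at most
		// for one read syscall on each connection.
		opts = append(opts, grpc.ReadBufferSize(readBufferSize))
	}
	if writeBufferSize > 0 {
		// grpc.WriteBufferSize determines how much data can be batched
		// before it is written on the wire.
		opts = append(opts, grpc.WriteBufferSize(writeBufferSize))
	}
	return opts
}

func main() {
	// Example: 512 KiB buffers in both directions.
	_ = grpc.NewServer(toServerOptions(512*1024, 512*1024)...)
}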


// WriteBufferSize for the gRPC server. See grpc.WriteBufferSize
// (https://godoc.org/google.golang.org/grpc#WriteBufferSize).
WriteBufferSize int `mapstructure:"write_buffer_size"`
@tigrannajaryan (Member) commented:
This should not matter. We are not sending anything big from our servers, only tiny responses.

@bogdandrutu (Member, Author) replied:

Correct, but I want to have them anyway for completeness, and also because it may matter in a pull-based protocol.
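Given the mapstructure tags above, these options surface in the collector's YAML configuration as read_buffer_size and write_buffer_size. A minimal sketch of how a user might tune them on a gRPC receiver; the exact nesting and endpoint depend on the receiver and collector version, and the sizes here are arbitrary illustrations:

receivers:
  otlp:
    endpoint: 0.0.0.0:55680
    # Sizes are in bytes; leaving them unset keeps gRPC's defaults.
    read_buffer_size: 524288   # 512 KiB
    write_buffer_size: 524288  # 512 KiB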

@tigrannajaryan (Member) left a comment:
LGTM

@tigrannajaryan tigrannajaryan merged commit 7ef6c53 into open-telemetry:master Jun 29, 2020
@bogdandrutu bogdandrutu deleted the serverbuf branch June 29, 2020 16:34
wyTrivail pushed a commit to mxiamxia/opentelemetry-collector that referenced this pull request Jul 13, 2020
hughesjj pushed a commit to hughesjj/opentelemetry-collector that referenced this pull request Apr 27, 2023
swiatekm pushed a commit to swiatekm/opentelemetry-collector that referenced this pull request Oct 9, 2024