
feat!: restrict MaxBytes to values at or below that of MaxSquareSize #1743

Closed
wants to merge 4 commits

Conversation

evan-forbes
Member

Overview

Leaving as a draft for now until we merge celestiaorg/cosmos-sdk#317 and decide whether this is actually the approach we want to use to limit the square size.

closes #1592 closes #1737

Checklist

  • New and updated code has appropriate documentation
  • New and updated code has new and/or updated testing
  • Required CI checks are passing
  • Visual proof for any user facing features like CLI or documentation updates
  • Linked issues closed with keywords

@evan-forbes evan-forbes added this to the Mainnet milestone May 12, 2023
@evan-forbes evan-forbes self-assigned this May 12, 2023
Comment on lines +71 to +74
maxBlockBytes := square.EstimateMaxBlockBytes(squareSize)
if v.MaxBytes > maxBlockBytes {
return fmt.Errorf("block maximum bytes must be less than or equal to %d: %d", maxBlockBytes, v.MaxBytes)
}
Member Author

this is consensus breaking, and honestly I'm not even sure we should include it, since the only real value it provides is throwing an error when someone submits a param proposal to raise the block size beyond what fits in a 128 x 128 square. By doing so, until celestiaorg/celestia-core#1007, we could easily end up producing less efficient squares in some scenarios.
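For context, the check above compares the proposed `MaxBytes` against an estimate derived from the square size. A rough, hypothetical sketch of how such an estimate could be computed (assuming celestia-app's 512-byte share size and ignoring encoding overhead; `estimateMaxBlockBytes` here is illustrative, not the actual `square.EstimateMaxBlockBytes`):

```go
package main

import "fmt"

// estimateMaxBlockBytes is a hypothetical sketch, NOT the real
// square.EstimateMaxBlockBytes. It assumes each share is 512 bytes,
// so a squareSize x squareSize square holds at most
// squareSize^2 * 512 bytes of raw data.
func estimateMaxBlockBytes(squareSize uint64) int64 {
	const shareSize = 512 // celestia-app share size in bytes (assumption)
	return int64(squareSize * squareSize * shareSize)
}

func main() {
	// For a 128 x 128 square: 128 * 128 * 512 = 8388608 bytes (8 MiB).
	fmt.Println(estimateMaxBlockBytes(128))
}
```

Under this sketch, any `MaxBytes` above the estimate for the maximum square size would be rejected by the validation shown in the diff.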

@evan-forbes evan-forbes changed the title feat!: limit the block size param to limit square size feat!: restrict MaxBytes to values at or below that of MaxSquareSize May 12, 2023
@evan-forbes
Member Author

while some of the changes here are OK, this branch mainly introduces a lot of code simply to stop proposals that exceed the max square size. Perhaps it's nice to throw an error in that scenario, but it's far from necessary and adds complexity, so I'm closing this for now.

if that's something we decide we want, then we can reopen

Comment on lines +30 to +52
// using lots of individual small blobs will result in a large amount of
// overhead added to the square
seqs := txsim.NewBlobSequence(
txsim.NewRange(1, 10000),
txsim.NewRange(1, 50),
).Clone(25)

ctx, cancel := context.WithTimeout(context.Background(), time.Second*30)
defer cancel()

_ = txsim.Run(
ctx,
[]string{rpcAddr},
[]string{grpcAddr},
kr,
rand.Int63(),
time.Second,
seqs...,
)

// check the block sizes
blocks, err := testnode.ReadBlockchain(context.Background(), rpcAddr)
require.NoError(t, err)
Member Author

txsim is cool tho
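The assertion the test above is building toward (every produced block staying at or below the configured limit) can be sketched in plain Go. `block` and `allWithinLimit` are hypothetical stand-ins; the real test reads full blocks via `testnode.ReadBlockchain` and measures their encoded size:

```go
package main

import "fmt"

// block is a hypothetical stand-in for the blocks returned by
// testnode.ReadBlockchain, reduced to the fields this check needs.
type block struct {
	Height int64
	Bytes  int64 // encoded size of the block in bytes
}

// allWithinLimit reports whether every block's size is at or
// below maxBytes.
func allWithinLimit(blocks []block, maxBytes int64) bool {
	for _, b := range blocks {
		if b.Bytes > maxBytes {
			return false
		}
	}
	return true
}

func main() {
	blocks := []block{{Height: 1, Bytes: 100_000}, {Height: 2, Bytes: 7_900_000}}
	// 8388608 = hypothetical cap for a 128 x 128 square (see sketch above).
	fmt.Println(allWithinLimit(blocks, 8_388_608))
}
```

In the actual test this would be a `require` assertion inside a loop over the blocks returned by `testnode.ReadBlockchain`.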

Successfully merging this pull request may close these issues:

  • make default block size limit be 1.8mb
  • EPIC: Hard capped blocksize