
chore(scripts): pass higher memory limit in buildup and builddown #24905

Merged: 3 commits merged into aws:main on Apr 5, 2023

Conversation

@laurelmay (Contributor) commented Apr 2, 2023

This follows the pattern in #24425; the failures it addresses seem to happen especially after #24376 when running these scripts.

Closes #24932


By submitting this pull request, I confirm that my contribution is made under the terms of the Apache-2.0 license

Commits:
  • This follows the pattern in aws#24425, which seems to especially happen after aws#24376 when running these scripts.
  • This follows the pattern in `scripts/gen.sh` or `/build.sh`.
@laurelmay laurelmay changed the title from "chore(scipts): pass higher memory limit to buildup and builddown" to "chore(scripts): pass higher memory limit to buildup and builddown" on Apr 2, 2023
@github-actions github-actions bot added the p2 label Apr 2, 2023
@aws-cdk-automation aws-cdk-automation requested a review from a team April 2, 2023 01:19
@github-actions github-actions bot added the admired-contributor label ([Pilot] contributed between 13-24 PRs to the CDK) Apr 2, 2023
@laurelmay laurelmay changed the title from "chore(scripts): pass higher memory limit to buildup and builddown" to "chore(scripts): pass higher memory limit in buildup and builddown" on Apr 2, 2023
@hoegertn (Contributor) commented Apr 2, 2023

Shouldn't this be 8192?

Also, this means that Gitpod no longer works.

@mrgrain (Contributor) commented Apr 2, 2023

> Shouldn't this be 8192?
>
> Also, this means that Gitpod no longer works.

@hoegertn Do you know how increasing the memory limit "breaks" Gitpod? Is there a safe number?

@hoegertn (Contributor) commented Apr 3, 2023

Breaking might not be the "right" word, but a default GitPod env has 8GB of RAM (https://www.gitpod.io/docs/configure/workspaces/workspace-classes).

So maxing out on that limit might not be a good idea.

"No longer works" was too strong; I remembered GitPod having lower limits, my fault.

@mrgrain (Contributor) commented Apr 3, 2023

> Breaking might not be the "right" word, but a default GitPod env has 8GB of RAM (https://www.gitpod.io/docs/configure/workspaces/workspace-classes).
>
> So maxing out on that limit might not be a good idea.
>
> "No longer works" was too strong; I remembered GitPod having lower limits, my fault.

Thanks for the clarification! I meant to look at the Gitpod setup this week and will try to see if there's a better value for it.
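
(As a side note for anyone revisiting this value later: a small sketch, assuming a Linux workspace with Node installed, of how to check how much memory a workspace actually has and what heap ceiling Node currently applies. This is not part of the PR.)

```sh
# Total and available RAM for the workspace, in MiB.
free -m

# Node's current old-space heap ceiling in MiB (reflects --max-old-space-size
# when it is set via NODE_OPTIONS in the environment).
node -e 'console.log(require("v8").getHeapStatistics().heap_size_limit / 1024 / 1024)'
```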

@mrgrain (Contributor) commented Apr 3, 2023

6GB (i.e. 6144) seems to work as well.

@laurelmay (Contributor, Author) commented

Thanks for the comments @hoegertn and @mrgrain! I've set this to 6144 for buildup and builddown and left the other scripts unchanged (they probably wouldn't typically be used in Gitpod).
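
For context, a minimal sketch of what that setting plausibly looks like in the two scripts, assuming they pass the limit through NODE_OPTIONS the same way `scripts/gen.sh` and `build.sh` do (the exact diff is not shown in this conversation):

```sh
#!/bin/bash
# Hypothetical excerpt from scripts/buildup (builddown would mirror it):
# raise Node's old-space heap limit to 6144 MiB for this invocation,
# while keeping any NODE_OPTIONS the caller already exported.
export NODE_OPTIONS="--max-old-space-size=6144 ${NODE_OPTIONS:-}"
```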

@github-actions github-actions bot added the bug This issue is a bug. label Apr 5, 2023
@TheRealAmazonKendra (Contributor) left a comment


Looks good! Thanks!

mergify bot commented Apr 5, 2023

Thank you for contributing! Your pull request will be updated from main and then merged automatically (do not update manually, and be sure to allow changes to be pushed to your fork).

@TheRealAmazonKendra (Contributor) commented

I also just want to provide an FYI in case anyone checking out this PR is unaware: if you don't need a stateful buildup or builddown, you can use `lerna exec --stream --include-dependencies --scope=<package-name> yarn build` for buildup, or `lerna exec --stream --include-dependents --scope=<package-name> yarn build` for builddown. It's fast enough on my machine that I don't typically care about losing the ability to restart the build from the failed package.
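
Spelled out as a usage example (the `aws-cdk-lib` scope below is only an illustration; substitute the package you are working on):

```sh
# Stateless "buildup": build the package and everything it depends on.
lerna exec --stream --include-dependencies --scope=aws-cdk-lib yarn build

# Stateless "builddown": build the package and everything that depends on it.
lerna exec --stream --include-dependents --scope=aws-cdk-lib yarn build
```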

@aws-cdk-automation (Collaborator)

AWS CodeBuild CI Report

  • CodeBuild project: AutoBuildv2Project1C6BFA3F-wQm2hXv2jqQv
  • Commit ID: 6a9c983
  • Result: SUCCEEDED
  • Build Logs (available for 30 days)

Powered by github-codebuild-logs, available on the AWS Serverless Application Repository

@mergify mergify bot merged commit 6e77287 into aws:main Apr 5, 2023
mergify bot commented Apr 5, 2023

Thank you for contributing! Your pull request will be updated from main and then merged automatically (do not update manually, and be sure to allow changes to be pushed to your fork).

Labels: admired-contributor ([Pilot] contributed between 13-24 PRs to the CDK), bug (This issue is a bug.), p2
Projects: None yet

Successfully merging this pull request may close these issues:
  • contributing: buildup failing on Gitpod

5 participants