
Release 0.23 - February 2019 #6495

Closed
laurentlb opened this issue Oct 24, 2018 · 31 comments

@laurentlb
Contributor

Target RC date: February 1st

@philwo
Member

philwo commented Jan 17, 2019

@katre Would you mind swapping with me? I'm assigned to the 0.24 release in March, but I'm on vacation the whole month.

@katre
Member

katre commented Jan 17, 2019

Sure, I'm fine to make that change.

@katre katre assigned philwo and unassigned katre Jan 17, 2019
@philwo philwo pinned this issue Jan 28, 2019
@aiuto aiuto unpinned this issue Feb 4, 2019
@cushon cushon pinned this issue Feb 5, 2019
@laurentlb
Contributor Author

Any update @philwo?

@philwo
Member

philwo commented Feb 5, 2019

Marcel and I spent today fixing breakages in our downstream pipelines and getting important last-minute fixes in, without which we can't judge whether a given commit is a good baseline for a release.

I think I'll cut 0.23.0rc1 either this evening or tomorrow, as we've sorted most things out now.

@thundergolfer
Contributor

@philwo Apologies if this is a stupid question, but how do you discover what will get included in 0.23? I'm particularly interested in a92347e

@philwo
Member

philwo commented Feb 6, 2019

@thundergolfer Not stupid at all. Our official release playbook is here: https://github.com/bazelbuild/continuous-integration/blob/master/docs/release-playbook.md

Basically, the release manager takes a good look at the last nightly downstream pipeline and decides on a good commit. I'll pick one today, probably just the current HEAD, as the pipeline looks pretty good. The last run only had three failures.

So your linked commit will definitely be included in the next release. :)

@thundergolfer
Contributor

@philwo Thanks for the detailed explanation 🙇

@philwo
Member

philwo commented Feb 6, 2019

Looks like we're good to go. I picked this commit as the baseline: 441fd75

Log:

scripts/release/release.sh create 0.23.0 441fd75d0047f8a998d784c557736ab9075db893
scripts/release/release.sh push

@philwo
Member

philwo commented Feb 6, 2019

Bazel 0.23.0rc1 was pushed to https://releases.bazel.build/0.23.0/rc1/index.html.

@philwo
Member

philwo commented Feb 6, 2019

Bazel 0.23.0rc1 passed all of its own tests: https://buildkite.com/bazel/bazel-bazel/builds/6638

I will run the full downstream pipeline later, but don't expect any issues, as the last run was completely green apart from known issues that have been addressed.

@ulfjack
Contributor

ulfjack commented Feb 13, 2019

We should make sure that 0.23.0 contains a fix for #7410. I suspect it already does, so that may not require any additional work.

@philwo
Member

philwo commented Feb 14, 2019

Creating Bazel 0.23.0rc2 with a fix for #7397:

scripts/release/release.sh create --force_rc=2 0.23.0 441fd75d0047f8a998d784c557736ab9075db893 6ca7763669728253606578a56a205bca3ea883e9
scripts/release/release.sh push

@philwo
Member

philwo commented Feb 19, 2019

@ulfjack I think this release is fine (I deliberately cut it after your flag flip went in, in order to reap its benefits for users and our CI ^^).

Everyone: I'm not aware of any release-blocking issues for Bazel 0.23.0. If that doesn't change, I'll release Bazel 0.23.0 tomorrow (14 days will have passed since rc1 by then).

@ulfjack
Contributor

ulfjack commented Feb 19, 2019

There is one corner case: if someone disables the flag and uses Bash 5, they'll see the error. There's a pending CL (in the process of being submitted) to fix that as well; maybe we want to cherry-pick it? (It's a two-line CL.)

@philwo philwo closed this as completed Feb 19, 2019
@philwo philwo reopened this Feb 19, 2019
@philwo
Member

philwo commented Feb 19, 2019

I don't think it's worth doing a cherry-pick (or even a patch release for 0.22.0) for that, for the following reasons:

  • The cherry-pick would delay the release until Thursday (the day I'm leaving for vacation) because of our "2 business days must pass since the last RC" policy.
  • Bash 5 is an unsupported / untested setup. We don't test it on CI, and none of our officially supported OSes uses Bash 5 yet, so if it breaks, that's not great, but it's not like we promised that it works. (I'm fine with adding a more modern Linux that uses Bash 5 to our CI to test this in the future.)
  • Running Bazel with a disabled feature flag also isn't really a supported configuration. If that then breaks (and only in combination with an unsupported Bash version), I'd say that's just bad luck.
  • It's also not a regression compared to earlier Bazel releases.

@meteorcloudy
Member

meteorcloudy commented Feb 19, 2019

This one looks like a release blocker for 0.23.0: #7459. It also happens on our CI, reported at #7464.

@ulfjack
Contributor

ulfjack commented Feb 19, 2019

For the record, commit is 2310b1c.

@philwo
Member

philwo commented Feb 19, 2019

Thanks, Ulf. If we need another RC anyway due to that other breakage, let's also cherry pick that one. 👍

@meisterT
Member

f9eb1b5 needs to be cherry-picked.
cc @laurentlb

@laurentlb
Contributor Author

laurentlb commented Feb 22, 2019

I wanted to help with the release, but there are too many problems to tackle on a Friday evening.

Commands I've run (the previous cherry-pick + two new cherry-picks):

Issues:

Edit: Philipp gave me the rights. I've triggered the downstream-projects pipeline and rerun the Windows tests. I've deployed the artifacts.

@laurentlb
Contributor Author

rc3 is available: https://releases.bazel.build/0.23.0/rc3/index.html

Issues:

I've deployed the artifacts, as both issues seem unrelated to Bazel itself; they look like infrastructure bugs.

@philwo
Member

philwo commented Feb 23, 2019

This is a known issue (reported by some Googlers internally as b/125831768; it was supposedly, but not actually, fixed by bazelbuild/continuous-integration#496). The test is broken: instead of running the Bazel from its runfiles, it accidentally runs the Bazel installed on the system, so it doesn't test the Bazel from your workspace at all.

We can ignore this for now. It only started failing due to infra changes on CI (we installed Bazelisk instead of Bazel, and Bazelisk crashes when $HOME or %LocalAppDir% is not set in the environment), but it has probably always been broken.

@philwo
Member

philwo commented Feb 24, 2019

  • could not create directory /Users/buildkite/Library/Caches/bazelisk: mkdir /Users/buildkite/Library/Caches/bazelisk: operation not permitted

@fweikert This looks like Tulsi is running integration tests (?) that run Bazel (= Bazelisk) in a sandbox, which means they don't have access to $HOME/Library/Caches/bazelisk.

One way to fix this is to add --sandbox_writable_path=/Users/buildkite/Library/Caches/bazelisk to the test_flags of their bazelci.yml config for macOS (or maybe just add this in general to all tests running on macOS via bazelci.py here: https://github.com/bazelbuild/continuous-integration/blob/master/buildkite/bazelci.py#L1106).

Another way would be to add no-sandbox to the tags of the failing tests.
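For illustration, a minimal sketch of what the first option might look like in a project's bazelci.yml. The platform key, target list, and surrounding structure here are assumptions for the example, not taken from Tulsi's actual config:

```yaml
# Hypothetical bazelci.yml fragment (structure assumed, not verified):
platforms:
  macos:
    test_flags:
      # Allow sandboxed tests to write to the Bazelisk cache directory.
      - "--sandbox_writable_path=/Users/buildkite/Library/Caches/bazelisk"
    test_targets:
      - "//..."
```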

@philwo
Member

philwo commented Feb 24, 2019

@laurentlb Would you be OK with formally taking over this release?

I think there isn't much left to do except cherry-picking fixes for whatever regressions still come up and then eventually releasing it.

I already manually prettified the release notes for rc1, and they are LGTM'd by Serge.

@laurentlb
Contributor Author

I think I can release later today. Let me know if I missed anything.

There's still some cleaning to do in the release notes, in case someone wants to help (https://docs.google.com/document/d/1wDvulLlj4NAlPZamdlEVFORks3YXJonCjyuQMUQEmB0/edit#heading=h.gpn2s9t6bosr).

@laurentlb
Contributor Author

@vbatts

vbatts commented Feb 26, 2019 via email

@thundergolfer
Contributor

thundergolfer commented Feb 26, 2019

Nice!

I would have thought java_binary.deploy_env would have made it under the "Important changes:" section.

Edit: Originally said java_library, which was wrong.

@laurentlb
Contributor Author

I'm not familiar with this specific change, but here's our process:

For the release notes, we first gather all commits with a RELNOTES tag in the description. Then we ask developers to review the announcement. If we missed something, you can still send a PR on https://github.com/bazelbuild/bazel-blog
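As a sketch of that convention, a commit description carrying a release note might look like the following. The wording is illustrative only, not an actual Bazel commit message:

```
Add deploy_env attribute to java_binary.

RELNOTES: java_binary now supports the deploy_env attribute.
```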

@thundergolfer
Contributor

It's a change that is relevant to people looking to get Bazel support for the equivalent of Maven's provided scope or Gradle's compileOnly configuration.

Thanks for that info. I'll send a PR if I get some time.

@meisterT
Member

meisterT commented Feb 27, 2019

Laurent: f0a1597 needs to be cherry-picked into a patch release to fix #7555. Thanks!
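Following the release.sh invocations shown earlier in this thread, cutting such a patch release would presumably look something like this. The version number and the baseline placeholder are assumptions; in practice the full commit hashes would be used:

```shell
# Hypothetical sketch based on the release.sh pattern used for 0.23.0 above.
# <baseline> is the commit the patch release is based on; f0a1597 stands in
# for the full hash of the cherry-pick requested in this comment.
scripts/release/release.sh create 0.23.1 <baseline> f0a1597
scripts/release/release.sh push
```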

@laurentlb laurentlb unpinned this issue Feb 28, 2019