Add trusted publisher release workflow #13048
base: main
Changes from 2 commits
@@ -0,0 +1,24 @@

```yaml
on:
  push:
    tags:
      - "*"
```
This is unnecessary, it's the same by default:

Suggested change
```yaml
name: Release

jobs:
  pypi:
    name: Upload release to PyPI
    runs-on: ubuntu-latest
    environment: release
```
Are we going to include any deployment protection rules for the release environment? I realize that means requiring a second person to briefly review any release deployments, but that seems prudent given the rise of supply-chain attacks. For example, if I were to be a maintainer, I don't think I'd need the ability to cut a release completely solo.

What would we ask a second reviewer to verify before approving the release?

The steps in the release process that this replaces are all ones that happen locally, so there's nothing visible that a second reviewer could check. It's still perfectly fine for someone doing a release to stop just before pushing the tag and ask for a review of the commit that will become the release, if they feel the need to do so.

I was more so suggesting that we require a second pair of eyes for any release as a defence-in-depth mechanism against account takeover and other supply-chain attacks. Realistically, we would be compromised in other ways anyway and it isn't worth the hassle.

Hmm, surprisingly I can't mark my own review comment as resolved? Either way, I have no further comments.

Disclaimer: I've spent an hour writing this. I tried to present a logically consistent argument, but if it starts to run off course, you know why.
Fair enough. I brought them up as even if we introduced stricter protection rules and dropped the base pip committer permission, they would still retain full administrator permissions. Thus, they would logically be the starting point for deciding who would be in the "project owner" group. We definitely don't have to tie project ownership to PyPA ownership. And yeah, GitHub's permission model for organisation repositories is not exactly flexible. I chose Maintain simply as it's the highest access role w/o administrator (i.e. bypass everything) permissions.
Agreed.
I don't think having granular committer/owner roles is antithetical to the "team effort" approach to pip. When I used to maintain Black, I only retained committer access to the project, but I still felt fully able to contribute.1 Actually, I used to be effectively the lead maintainer when I had so much free time. Sure, technically I couldn't do as much as the "project owners", but 99% of what I did only required committer access anyway. More importantly, we still treated each other's opinions and suggestions on a level playing field. Yes, there was a technical imbalance, but socially, no. I suppose this is a difference of perspective, but I consider the division of committer access a reasonable compromise between ease of maintenance and security. If everyone has administrator access, then, yes, when we need to update a sensitive setting (say to land #13107), it's easier as we don't have to bother a specific person (or group of people). OTOH, it's rare that we actually need to do something that requires administrator access (editing protection rules, GHA settings, webhooks, etc). For the vast majority of the time when we don't need admin access, those accounts represent a larger-than-necessary security risk (if they were to be compromised somehow). The small additional friction to do X thing is IMO outweighed by the security benefits.
I trust everyone else on the pip committer team to be a RM and administrator on the project as well. If a pip committer needed to be an administrator for a legitimate reason, I have no objections to extending that permission to them. I also trust everyone to take proper measures to secure their accounts. Adding "security levels" doesn't change that for me. The problem is that mistakes happen. Access tokens get mistakenly leaked, credentials get phished or bruteforced, and a variety of other creative attacks occur. We should be prepared for that.
I'm actually not entirely in favour of requiring reviews for similar reasons. I do think it's worth it to drop the base permission and require reviews for releases as they're infrequent events. Of course, if we didn't have to worry about security then none of this would need to be discussed, but for better or worse, we don't live in such a world anymore :( Also, I haven't been a RM before (and I don't have any plans to be one soon, for lack of time). If you, one of our regular RMs, find these suggestions to be too onerous, then I'm fine with maintaining the status quo. Footnotes
I don't disagree with that. I think part of my "hidden agenda" here is that if we're going to have distinct project owners, then I'd prefer not to be one, simply because I don't want to be a bottleneck on actions that require an admin1. I also don't want to be characterised as "owning" pip, because I'm very conscious of our shortcomings as a project, and I could do without feeling more responsible for that. But I do want to remain a PyPA owner, as I feel that I have a useful role in that context. That's all very muddled, and not really actionable, but not having an "owner vs committer" distinction brushes the problem under the carpet, which works for me 🙂
I think I might. My usual release workflow is that I set aside a Saturday morning for doing the release. I'll manage all the outstanding PRs and milestones in the week or two before the release, then on the day I'll go through the release process and I'm done. I keep track of the release in my head, which is fine as it's a single piece of work with no interruptions. Adding a review to that process would introduce a delay where I'd need someone else to be available, and approve the release - there's no guarantee that would happen in the sort of timescale (an hour or so max) that I'm working to. So I think I'd need to see the Release Process section of the docs split into two parts in order to incorporate a review. Do part 1, request a review, then do part 2. And I wouldn't be able to guarantee when I could assign time for part 2 in advance, because I don't know when I'll get an approval. So the git repo needs to be (at least in some sense) frozen between parts 1 and 2, which could be days apart. Am I overthinking this? It feels like requiring a review during the release process necessitates splitting the process into two parts like I describe above, which is bad for scheduling. But maybe I've misunderstood how getting a review would work in practice?
Agreed. But conversely, admin around security is not what I want to spend my volunteer open source time on, so keeping things streamlined is important to me. Footnotes
@pradyunsg @ichard26 hint: it is possible to disallow self-approvals in the environment protections, FYI. Also, it's possible to add up to 6 entries into the required reviewers list there. Not only users, but teams — you can let more people approve if you use teams. Although, I tend to enable required reviews even on projects where I'm the only committer. This allows me to have more control over the process, and it pauses the workflow run just before starting the job. So when the build job produces the wheels, I could even download them locally and inspect them if I wanted to. @pfmoore similarly, to address the "delay" concern — setting up required reviews with just 1 reviewer required and self-reviews allowed would let you have just enough control over the last action which is immutable (the actual PyPI upload) w/o contributing to any delay meaningfully. This is what I tend to configure.

I would like to come back to my initial question: what would we ask a second reviewer to verify before approving the release? When we set up a review process, I think it is important to explain what is expected from the reviewer. When reviewing or merging a PR, this is implicit but I think generally accepted: the person who approves or merges communicates that they have looked at, and agree with, the change, unless explicitly mentioned otherwise. Currently, as a RM preparing a release, I do not re-review nor look at everything that was merged in the last quarter to ensure no malicious code was merged on main. In effect I assume that the review process was effective and guarded against malicious intent. If we introduce an approval step in the release process, what would the reviewer need to do in that step? Downloading the built wheel and sdist and inspecting them certainly does not look like something that would be very practical to me. This is a genuine question. So if we want to guard against compromise of a maintainer GitHub account, I'd think a second approval on release is good, but only effective if we also protect

As a side note, I also think it would somewhat complicate the release process, which I usually do when I have time on a weekend, without coordinating with the availability of another maintainer. But I'll adapt if we reach the conclusion that this is important, of course.

Agreed. My RM process is very similar, and in particular the assumption that what is merged on main is correct, valid and ready for release. If we were to require the RM to (in effect) validate all code that went into the release, I'm pretty certain I wouldn't have the time to be a RM.

I've recently fixed the suggestion of a misleading environment name in Warehouse. And William and I made sure that all the PyPUG refs and the official GitHub docs suggest `pypi`.
Suggested change
GitHub Environments actually represent deployment targets/platforms; they do not represent a process. You're uploading to PyPI, so it makes sense to call the environment `pypi`. The URLs will show up @ https://github.com/pypa/pip/deployments. Since the version is available in the workflow, they can easily be versioned. Here's an example: https://github.com/cherrypy/cheroot/deployments/pypi. It will also show up on the GitHub Actions workflow run, in the graph.
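As a sketch of what that could look like in the job definition, assuming the pushed tag is the release version (the `name` and `url` values below are illustrative, not part of this PR):

```yaml
    environment:
      name: pypi
      # Assumption: the tag that triggered the run is the version, so the
      # deployment entry links straight to the corresponding page on PyPI.
      url: https://pypi.org/project/pip/${{ github.ref_name }}/
```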
```yaml
    permissions:
      # Used to authenticate to PyPI via OIDC.
      id-token: write
    steps:
      - uses: actions/checkout@v4
```
Pin all the action steps to commit SHAs instead of git tags to avoid a source of mutability. You can use frizbee to do this for you if you'd like.

There's also https://github.com/davidism/gha-update. And Dependabot knows to update the hashes too (also bumping the human-readable tag in a comment on the same line).
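A sketch of the pinning style being suggested; the SHA below is a placeholder, not a real checkout commit, and the same pattern would apply to every `uses:` step, including the publish action:

```yaml
      # Placeholder SHA for illustration only; pin to the audited commit behind
      # the tag you intend to use, keeping the human-readable tag as a comment.
      - uses: actions/checkout@0123456789abcdef0123456789abcdef01234567  # v4
```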
```yaml
      - uses: actions/setup-python@v5
        with:
          python-version: "3.x"
      - name: Build sdist and wheel
        run: pipx run build
```
Please, DO NOT run build scripts in the same job as publishing. Giving OIDC to (transitive) build deps allows them to impersonate the identity of your repo / workflow, leading to attack surfaces beyond infecting the dist — this can be used for privilege escalation in systems beyond PyPI. I had to spell out that this is not supported the other day: https://github.com/marketplace/actions/pypi-publish#Non-goals. This is essentially the only workflow/job shape I'm willing to recommend and call supported: https://packaging.python.org/en/latest/guides/publishing-package-distribution-releases-using-github-actions-ci-cd-workflows/#publishing-the-distribution-to-pypi.

Are you sure the syntax is correct? IIRC

FWIW I tend to invoke this wrapped with tox. So if there's anything in nox already — use that here, so the same command+env is used everywhere. Additionally, I'd like to point out that this command is unversioned. With @sethmlarson's suggestions to pin the actions, it feels like a missing reproducibility bit when this is uncontrolled. Perhaps put the dependencies into constraint files? And I just remembered that I like setting the
Yes,

Ack. Though, I'm still asking you to consider #13048 (comment).
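For reference, a rough sketch of the split job shape described in the linked PyPUG guide, where OIDC is granted only to the publish job; the artifact and environment names here are illustrative, not taken from this PR:

```yaml
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Build sdist and wheel
        run: pipx run build
      - uses: actions/upload-artifact@v4
        with:
          name: release-dists
          path: dist/

  pypi:
    needs: build
    runs-on: ubuntu-latest
    environment: pypi
    permissions:
      # OIDC is only available to the publish job; build deps never see it.
      id-token: write
    steps:
      - uses: actions/download-artifact@v4
        with:
          name: release-dists
          path: dist/
      - uses: pypa/gh-action-pypi-publish@release/v1
```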
```yaml
      - name: Publish to PyPI
        uses: pypa/gh-action-pypi-publish@release/v1
```
@@ -2,7 +2,6 @@

```python
"""

import argparse
import glob
import os
import shutil
import sys
```
@@ -315,94 +314,3 @@ def prepare_release(session: nox.Session) -> None:

```python
    next_dev_version = release.get_next_development_version(version)
    release.update_version_file(next_dev_version, VERSION_FILE)
    release.commit_file(session, VERSION_FILE, message="Bump for development")


@nox.session(name="build-release")
```
Can we keep this env so nox is still at the center of the process and the manual ways remain in the repo? Plus, consider implementing pinning of the build env on this level per my other comments.
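A minimal sketch of what pinning the build environment at the nox level could look like; the constraints file path below is hypothetical, not something that exists in this PR:

```python
# Inside the (removed) build-release session, the install step could pin its
# tooling against a constraints file tracked in the repo (hypothetical path):
session.install("build", "twine", "-c", "tools/release-constraints.txt")
```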
```python
def build_release(session: nox.Session) -> None:
    version = release.get_version_from_arguments(session)
    if not version:
        session.error("Usage: nox -s build-release -- YY.N[.P]")

    session.log("# Ensure no files in dist/")
    if release.have_files_in_folder("dist"):
        session.error(
            "There are files in dist/. Remove them and try again. "
            "You can use `git clean -fxdi -- dist` command to do this"
        )

    session.log("# Install dependencies")
    session.install("build", "twine")

    with release.isolated_temporary_checkout(session, version) as build_dir:
        session.log(
            "# Start the build in an isolated, "
            f"temporary Git checkout at {build_dir!s}",
        )
        with release.workdir(session, build_dir):
            tmp_dists = build_dists(session)

        tmp_dist_paths = (build_dir / p for p in tmp_dists)
        session.log(f"# Copying dists from {build_dir}")
        os.makedirs("dist", exist_ok=True)
        for dist, final in zip(tmp_dist_paths, tmp_dists):
            session.log(f"# Copying {dist} to {final}")
            shutil.copy(dist, final)


def build_dists(session: nox.Session) -> List[str]:
    """Return dists with valid metadata."""
    session.log(
        "# Check if there's any Git-untracked files before building the wheel",
    )

    has_forbidden_git_untracked_files = any(
        # Don't report the environment this session is running in
        not untracked_file.startswith(".nox/build-release/")
        for untracked_file in release.get_git_untracked_files()
    )
    if has_forbidden_git_untracked_files:
        session.error(
            "There are untracked files in the working directory. "
            "Remove them and try again",
        )

    session.log("# Build distributions")
    session.run("python", "-m", "build", silent=True)
    produced_dists = glob.glob("dist/*")

    session.log(f"# Verify distributions: {', '.join(produced_dists)}")
    session.run("twine", "check", *produced_dists, silent=True)

    return produced_dists


@nox.session(name="upload-release")
def upload_release(session: nox.Session) -> None:
    version = release.get_version_from_arguments(session)
    if not version:
        session.error("Usage: nox -s upload-release -- YY.N[.P]")

    session.log("# Install dependencies")
    session.install("twine")

    distribution_files = glob.glob("dist/*")
    session.log(f"# Distribution files: {distribution_files}")

    # Sanity check: Make sure there's 2 distribution files.
    count = len(distribution_files)
    if count != 2:
        session.error(
            f"Expected 2 distribution files for upload, got {count}. "
            f"Remove dist/ and run 'nox -s build-release -- {version}'"
        )
    # Sanity check: Make sure the files are correctly named.
    distfile_names = (os.path.basename(fn) for fn in distribution_files)
    expected_distribution_files = [
        f"pip-{version}-py3-none-any.whl",
        f"pip-{version}.tar.gz",
    ]
    if sorted(distfile_names) != sorted(expected_distribution_files):
        session.error(f"Distribution files do not seem to be for {version} release.")

    session.log("# Upload distributions")
    session.run("twine", "upload", *distribution_files)
```
It's best to catch this early:

I think `pypi-publish` does a strict check, so having it less strict might reveal unpleasant surprises too late in the process.
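Assuming the intent is to make the local `twine check` as strict as the check `pypi-publish` performs, a sketch of one possible form (the use of `--strict` here is an assumption, not necessarily the author's exact suggestion):

```python
# --strict turns twine's warnings into errors, so metadata problems surface
# during the build step instead of at upload time.
session.run("twine", "check", "--strict", *produced_dists, silent=True)
```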