Added trusted PyPI publishing #7616

Merged: 10 commits into python-pillow:main on Dec 26, 2023

Conversation

radarhere (Member)

Helps #7390

@hugovk (Member) commented Dec 14, 2023

Very nice!

Looks like:

[screenshot]


I've been using trusted publishing in my other projects (although not with environments).

I recommend we also set up trusted publishing to TestPyPI, so we know the whole pipeline is working before the big release day.

I've already registered Pillow at https://test.pypi.org/project/pillow/ and have set up trusted publishers on that side:

[screenshot]


One sticking point:

You can't publish a file to [Test]PyPI that has already been published there, unless the version number changes. But we have a hardcoded value in src/PIL/_version.py.

One way is to use https://github.com/pypa/setuptools_scm/ to generate a version number based on the "distance"/number of commits since the last tag. We'd also need local_scheme = "no-local-version", which removes the +abc123 local-version suffix that PyPI doesn't accept, and some juggling to get the version to the C code. This could be a separate PR.

hugovk added the Release label Dec 14, 2023
Co-authored-by: Hugo van Kemenade <hugovk@users.noreply.github.com>
@radarhere (Member Author)

You can't publish a file to [Test]PyPI that has already been published there, unless the version number changes. But we have a hardcoded value in src/PIL/_version.py.

To make sure I'm following - this is only a problem with test.pypi.org, because we may want to submit the same version more than once, yes? You're not suggesting that anything about the normal version mechanism needs to be changed.

@hugovk (Member) commented Dec 15, 2023

You can't publish a file to [Test]PyPI that has already been published there, unless the version number changes. But we have a hardcoded value in src/PIL/_version.py.

To make sure I'm following - this is only a problem with test.pypi.org, because we may want to submit the same version more than once, yes?

Yes and no :)

No, both TestPyPI and PyPI behave the same in this regard.

The restriction really is that filenames cannot be re-used. So we can re-use version numbers, but we need the filenames to be unique.

For example, we have a Pillow-10.1.0-cp312-cp312-macosx_10_10_x86_64.whl at https://pypi.org/project/Pillow/10.1.0/#files. If there was something wrong with that file and we needed to upload a new one, we couldn't upload a file with the same name; PyPI would reject it with an error. That's why we've needed to use wheel build tags in the past, so we have a unique file we can upload.

And so yes, we may submit the same version more than once, but need to make it unique in other ways.


You're not suggesting that anything about the normal version mechanism needs to be changed.

Correct.

For normal releases, we still have the same version numbers.

This would give us new version numbers for testing uploads of intra-release builds to TestPyPI.

For example, https://github.com/hugovk/norwegianblue now uses Hatchling + hatch-vcs, but used to use setuptools + setuptools_scm; both work in the same way (I think hatch-vcs uses setuptools_scm under the hood).


You're not suggesting that anything about the normal version mechanism needs to be changed.

Although there would be one change:

Right now, after we release 10.2.0, we bump the hardcoded version to 10.3.0.dev0.

With setuptools_scm, we'd still be getting a version like 10.3.0.dev1, 10.3.0.dev200, etc, because it's based on the last tag.

I think that's fine, it's an internal change.

This does however bring to mind a potential blocker: we do compile the version into the C layer, and error if they're different. With setuptools_scm, the devX is incremented for every commit, which could result in mismatches or needing to rebuild the C extension more often. This could be a blocker, and we may need another solution to give a unique filename for TestPyPI...

@radarhere (Member Author)

we do compile the version into the C layer, and error if they're different. With setuptools_scm, the devX is incremented for every commit, which could result in mismatches

We could relax the condition to allow 10.3.0.dev1 or 10.3.0.dev200, by changing

if __version__ != getattr(core, "PILLOW_VERSION", None):

to

    if not hasattr(core, "PILLOW_VERSION") or re.sub(
        "dev[0-9]+", "dev0", __version__
    ) != re.sub("dev[0-9]+", "dev0", core.PILLOW_VERSION):

@hugovk (Member) commented Dec 15, 2023

Yep, that could work.

@radarhere (Member Author)

I've just registered for test.pypi.org with the same username. Would you mind adding me as a maintainer to Pillow?

@hugovk (Member) commented Dec 18, 2023

Invited!

@radarhere (Member Author)

Thanks.

I'm wondering if the use of setuptools-scm to generate unique version numbers each time is overkill? Once we have it working, isn't it unlikely to break in the future? Meaning that for whatever infrequent testing we need to do, we could just branch off from main and manually increment the version to dev1?

@hugovk (Member) commented Dec 18, 2023

I'd prefer to automate testing where possible rather than do it manually. Doing it manually means we have to foresee that a change might cause problems in the release pipeline. And as said, we can consider that in a separate PR, and consider other approaches too.

Comment on lines 216 to 229
      - uses: actions/download-artifact@v4
      - name: Merge artifacts
        run: |
          for artifact in macOS_x86_64 \
                          macOS_arm64 \
                          manylinux2014_musllinux \
                          manylinux_2_28 \
                          windows_x86 \
                          windows_x64 \
                          windows_arm64 \
                          sdist; do
            mv $artifact/* .
            rmdir $artifact
          done
Member

Let's try something like scikit-build/cmake-python-distributions#438 / scientific-python/cookie#335 to avoid repeating the names:

Suggested change (replacing the merge loop quoted above):

      - uses: actions/download-artifact@v4
        with:
          path: all
      - name: Merge files
        run: |
          mkdir dist
          mv all/*/* dist

Member Author

Ok, I've pushed something similar.

      with:
-       name: dist
+       name: ${{ matrix.artifact_name }}
Member

Can we re-use existing matrix variables instead of introducing a new artifact_name?

The tricky bit might be differentiating between these two:

          - name: "manylinux2014 and musllinux x86_64"
            os: ubuntu-latest
            archs: x86_64
            artifact_name: "manylinux2014_musllinux"
          - name: "manylinux_2_28 x86_64"
            os: ubuntu-latest
            archs: x86_64
            build: "*manylinux*"
            manylinux: "manylinux_2_28"
            artifact_name: "manylinux_2_28"

If we do wheel-${{ matrix.os }}-${{ matrix.archs }}-${{ matrix.manylinux }}, does that error or do we get something like this?

  • wheel-ubuntu-latest-x86_64
  • wheel-ubuntu-latest-x86_64-manylinux_2_28

Member Author

I've pushed a commit to use ${{ matrix.os }}-${{ matrix.archs }}-${{ matrix.manylinux }}.

We get

  • macos-latest-x86_64-
  • macos-latest-arm64-
  • ubuntu-latest-x86_64-
  • ubuntu-latest-x86_64-manylinux_2_28

Member

Thanks!

Since my last suggestion, and based on lots of feedback about the breaking @v4 changes, GitHub have just added pattern: and merge-multiple: inputs:

actions/upload-artifact#472 (comment)

Shall we try that? We'll need a common prefix/pattern for the sdist and all the wheel artifacts.
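
For illustration, a minimal sketch of that approach; the dist-* prefix is just an example of a common prefix, and the final workflow excerpt near the end of this thread uses the same two inputs:

      - uses: actions/download-artifact@v4
        with:
          pattern: dist-*        # common prefix shared by the sdist and wheel artifacts
          path: dist
          merge-multiple: true   # unpack all matching artifacts into a single dist/ directory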

Member Author

Thanks for finding that. I've updated the PR.

@webknjaz

Moreover, the "gate" was implemented incorrectly: https://github.com/marketplace/actions/alls-green#why.

      with:
-       name: fribidi
+       name: fribidi_${{ matrix.arch }}
        path: fribidi\*

  sdist:


May I suggest building wheels from sdist? This is what pip does — it doesn't build wheels straight from Git source, but from the tarballs. And this is what pypa/build does if you don't pass any args. Doing so will ensure that sdists are actually usable, as a smoke test.

Here's an example: https://github.com/aio-libs/yarl/actions/runs/7258064074/workflow#L124.

P.S. cibuildwheel accepts a path to the sdist tarball natively (in place of a directory path) so integrating this into the wheel build jobs is as easy as downloading the artifact and feeding the sdist from it into the cibuildwheel action.
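
A rough sketch of that idea, not part of this PR; the artifact name, the plain pip install of cibuildwheel, and the output directory are placeholders:

      - uses: actions/download-artifact@v4
        with:
          name: sdist                     # hypothetical name of the artifact produced by the sdist job
          path: sdist
      - name: Build wheels from the sdist
        run: |
          python3 -m pip install cibuildwheel
          # cibuildwheel accepts an sdist tarball in place of a source directory
          python3 -m cibuildwheel --output-dir wheelhouse sdist/*.tar.gz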


Oh, and that repo also features the wheel build jobs moved into a reusable workflow and called with different params.

Member Author

May I suggest building wheels from sdist?

We're currently using the .ci directory in the build process, which isn't included in the sdist

      - name: Build wheels
        run: |
          python3 -m pip install -r .ci/requirements-cibw.txt

Sure, we could checkout the repository AND use sdist - but that doesn't sound very neat to me.

I know you said that cibuildwheel can simply accept an sdist path, but we're also currently adding in extra test images, which wouldn't be quite so simple when using a compressed sdist.

      - name: Checkout extra test images
        uses: actions/checkout@v4
        with:
          repository: python-pillow/test-images
          path: Tests\test-images

Oh, and that repo also features the wheel build jobs moved into a reusable workflow

If wheels.yml gets much longer, I could see that being nice, but at the moment, I don't think it's problematic.

Also, neither of these ideas depend on this PR. If you would like to see them become part of Pillow, you could create a separate PR.


We're currently using the .ci directory in the build process, which isn't included in the sdist

And this is a reason to include it. Downstreams might want to run tests using the sdist as the source.

If wheels.yml gets much longer, I could see that being nice, but at the moment, I don't think it's problematic.

For me, it's a structural thing that helps layer components. Not necessarily related to how big the YAML files are (although, that's also important).

Also, neither of these ideas depend on this PR. If you would like to see them become part of Pillow, you could create a separate PR.

Sure. These are mostly drive-by comments I made having seen some familiar faces and tech involved.

@nulano (Contributor) commented Dec 19, 2023

Moreover, the "gate" was implemented incorrectly: https://github.com/marketplace/actions/alls-green#why.

The argument being that a "skipped" status is incorrectly accepted, yes?

This does not appear to be the case for Mergify, which is the only use of these gates here. A recent PR, #7608, only updated documentation and so most of the gates were skipped. However, the Mergify report did not accept these skips as a success: https://github.com/python-pillow/Pillow/runs/19444375622

@webknjaz commented Dec 19, 2023

Yeah, you're right. Mergify treats these statuses better. But overall, on the GH platform level, it might still be reasonable to implement them correctly so that somebody doesn't trip over this implementation quirk in the future.

      with:
-       name: dist
+       name: ${{ matrix.os }}-${{ matrix.archs }}-${{ matrix.manylinux }}
Contributor

Suggested change (replacing the name: line above):

          name: ${{ matrix.os }}-${{ matrix.archs }}${{ matrix.manylinux && '-' }}${{ matrix.manylinux }}

This would get rid of the extra trailing -; not sure if it's worth sacrificing a bit of readability to make the output names cleaner?


Could also be something like

Suggested change (replacing the same name: line):

          name: >-
            ${{
              format(
                '{0}-{1}{2}{3}',
                matrix.os,
                matrix.archs,
                (matrix.manylinux && '-' || ''),
                matrix.manylinux
              )
            }}

(not tested)

Member Author

My preference would be in the middle, with

${{ matrix.os }}-${{ matrix.archs }}${{ matrix.manylinux && format('-{0}', matrix.manylinux) }}

but I have a feeling that would make no one happy.

@nulano (Contributor) Dec 24, 2023

My preference would be in the middle, with [...] but I have a feeling that would make no one happy.

Actually, that looks like the best option to me also.

Member Author

Cool, thanks, I've pushed a commit for that.

radarhere and others added 2 commits December 24, 2023 13:37
Co-authored-by: Hugo van Kemenade <hugovk@users.noreply.github.com>
Co-authored-by: Hugo van Kemenade <hugovk@users.noreply.github.com>
@hugovk (Member) commented Dec 25, 2023

Notice: Attempting to perform trusted publishing exchange to retrieve a temporary short-lived API token for authentication against https://test.pypi.org/legacy/ due to __token__ username with no supplied password field
Error: Trusted publishing exchange failure: 
OpenID Connect token retrieval failed: GitHub: missing or insufficient OIDC token permissions, the ACTIONS_ID_TOKEN_REQUEST_TOKEN environment variable was unset

This generally indicates a workflow configuration error, such as insufficient
permissions. Make sure that your workflow has `id-token: write` configured
at the job level, e.g.:

permissions:
  id-token: write

Learn more at https://docs.github.com/en/actions/deployment/security-hardening-your-deployments/about-security-hardening-with-openid-connect#adding-permissions-settings.

https://github.com/python-pillow/Pillow/actions/runs/7324548352/job/19948625585

But we have that at the job level. Maybe it's because we have an env configured at TestPyPI. I'll remove that, and restart the job.

@hugovk (Member) commented Dec 25, 2023

Added the second one, and left the first one:

[screenshot]

And restarted...

@hugovk (Member) commented Dec 25, 2023

Hmm, same thing. Removed the first one and restarted...

@hugovk (Member) commented Dec 25, 2023

Nope, same thing. Perhaps it's as @webknjaz said in one thread, you can't publish from a PR.

@nulano (Contributor) commented Dec 25, 2023

PRs don't have access to most secrets (otherwise they could be trivially extracted by a malicious PR). From https://docs.github.com/en/actions/security-guides/using-secrets-in-github-actions#using-secrets-in-a-workflow:

With the exception of GITHUB_TOKEN, secrets are not passed to the runner when a workflow is triggered from a forked repository.

And GITHUB_TOKEN cannot have write permissions by default. From https://docs.github.com/en/actions/using-workflows/workflow-syntax-for-github-actions#changing-the-permissions-in-a-forked-repository:

You can use the permissions key to add and remove read permissions for forked repositories, but typically you can't grant write access. The exception to this behavior is where an admin user has selected the Send write tokens to workflows from pull requests option in the GitHub Actions settings. For more information, see "Managing GitHub Actions settings for a repository."

If I understand this correctly, the required setting can only be enabled for private repositories (where only members of the organization can create PRs).

@webknjaz

Nope, same thing. Perhaps it's as @webknjaz said in one thread, you can't publish from a PR.

Yes, PR privileges are lower. At least, when the PRs are coming from forks. Some of them can be addressed through pull_request_target. But that's its own can of worms.

@webknjaz

Although the publish action uses GITHUB_TOKEN, that token cannot have write permissions by default.

No, my action doesn't access that token. GHA sets two special env vars for OIDC when id-token: write is set and is allowed in the given context.

@webknjaz

Hmm, same thing. Removed the first one and restarted...

I recommend keeping the environment. Just make sure it matches what you set up on the PyPI side. I suggest using pypi for PyPI and testpypi for TestPyPI in my guides and tutorials. These represent the targets rather accurately.
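
For illustration, the job-level blocks being discussed would look something like this; the environment name here is an example and has to match whatever is registered with the trusted publisher on the [Test]PyPI side:

    permissions:
      id-token: write        # required for the OIDC token exchange
    environment:
      name: testpypi         # must match the environment name configured on TestPyPI
      url: https://test.pypi.org/p/Pillow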

@webknjaz

Perhaps we should spell it out in the error message that GH does allow OIDC in untrusted event contexts, like pull_request.

cc @woodruffw

@nulano (Contributor) commented Dec 26, 2023

No, my action doesn't access that token. GHA sets two special env vars for OIDC when id: write is set and is allowed in the given context.

That's what I thought, but I got confused by the GHA documentation.
Not that it matters, but I guess those envvars are automatically extracted(?) from the GITHUB_TOKEN by the runner?

@woodruffw commented Dec 26, 2023

Just to provide some disambiguation:

  • Trusted publishing allows publishing from a pull_request event. More generally: as long as a workflow has id-token: write, trusted publishing will work from it.
  • GitHub itself does not give id-token: write to pull_request events run from forks, even if the workflow explicitly adds that permission. This isn't super well documented anywhere, but makes sense from a security perspective (I believe this is the same for all other write permissions, to prevent 3p forks from making destructive changes to the "target" repo by default).

So TL;DR: you can publish to PyPI from a pull_request event, but only if the pull_request event is coming from a first-party branch (not from a fork). This limitation persists even if the fork is controlled or pushed to by an owner of the upstream repository.

Edit: For some additional context on this GitHub behavior, see sigstore/sigstore-conformance#55 -- it's a different project, but the outcome (can't get OIDC tokens from 3p PRs) is the same.
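
As a practical illustration, publish jobs are typically gated so they only ever run for tag pushes on the upstream repository; the workflow in this PR carries the same guard, temporarily commented out for testing, in the excerpt near the end of this thread:

  pypi-publish:
    # only run for tag pushes on the upstream repo, never for pull requests from forks
    if: github.event_name == 'push' && startsWith(github.ref, 'refs/tags')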

@woodruffw

Perhaps we should spell it out in the error message that GH does allow OIDC in untrusted event contexts, like pull_request.

IMO this is likely to cause some confusion, although I think there's a variant we can chip out per #7616 (comment): if the publish is running in a pull_request event and we detect that it's a branch from a fork, then we can produce a more useful error message, like:

This error indicates that you're attempting to obtain an OpenID Connect token from a fork of a repository, which GitHub does not permit.

<link to another troubleshooting section>

@hugovk (Member) commented Dec 26, 2023

Thanks for clearing that up! That's all fine, we don't want to deploy from PRs in normal use, it was for testing.

So I think we can merge this pretty much as-is, test from main, then cleanup TestPyPI -> PyPI later.

Another option: @radarhere, if you like, we could try things out from your fork. Feel free to add a new publisher for that at: https://test.pypi.org/manage/project/pillow/settings/publishing/

Our next quarterly release is in just a week, 2nd January, so I'd prefer to just get something merged and working first to minimise risk. This will also be the first release with cibuildwheel, plus replacing setup.cfg with pyproject.toml, so we have a few big changes.

@radarhere (Member Author) commented Dec 26, 2023

Ok, sure. I added a publisher, and merged this to my main.

There was already a partial release for 10.2.0.dev0, so I updated the version to 10.2.0.dev1 and it worked - https://github.com/radarhere/Pillow/deployments / https://test.pypi.org/manage/project/pillow/release/10.2.0.dev1/

@hugovk (Member) commented Dec 26, 2023

Great! Did you test environments?

Let's merge this, and can you arrange with a repo admin to set up what is needed in this repo, and on PyPI? Or for them to give you and me access to do it.

Co-authored-by: Ondrej Baranovič <nulano@nulano.eu>
@radarhere (Member Author)

I tested it with an environment, yes.

https://github.com/radarhere/Pillow/blob/9078f85a55090a7c4bca6c304b4b0d480de72f13/.github/workflows/wheels.yml#L187-L207

  pypi-publish:
    # TEMP for testing
    # if: github.event_name == 'push' && startsWith(github.ref, 'refs/tags')
    needs: [build, windows, sdist]
    runs-on: ubuntu-latest
    name: Upload release to PyPI
    permissions:
      id-token: write
    environment:
      name: radarhere-test-pypi
      url: https://test.pypi.org/p/Pillow
    steps:
      - uses: actions/download-artifact@v4
        with:
          pattern: dist-*
          path: dist
          merge-multiple: true
      - name: Publish to PyPI
        uses: pypa/gh-action-pypi-publish@release/v1
        with:
          repository-url: https://test.pypi.org/legacy/
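
For comparison, a sketch of how the same job might look once it is pointed at the real PyPI after testing; the pypi environment name is an assumption, and this block is not part of the merged PR:

  pypi-publish:
    if: github.event_name == 'push' && startsWith(github.ref, 'refs/tags')
    needs: [build, windows, sdist]
    runs-on: ubuntu-latest
    name: Upload release to PyPI
    permissions:
      id-token: write
    environment:
      name: pypi                       # assumed name; must match the trusted publisher on pypi.org
      url: https://pypi.org/p/Pillow
    steps:
      - uses: actions/download-artifact@v4
        with:
          pattern: dist-*
          path: dist
          merge-multiple: true
      - name: Publish to PyPI
        # with no repository-url, gh-action-pypi-publish defaults to uploading to PyPI
        uses: pypa/gh-action-pypi-publish@release/v1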

@radarhere merged commit 265ee32 into python-pillow:main, Dec 26, 2023
15 of 16 checks passed
@radarhere deleted the pypi branch, December 26, 2023 22:37
@webknjaz

@woodruffw that sounds reasonable!

webknjaz added a commit to pypa/gh-action-pypi-publish that referenced this pull request Feb 27, 2024
This specializes the token retrieval error handling, providing an
alternative error message when the error cause is something
that we know can't possibly work due to GitHub's own restrictions
on PRs from forks.

PR #203
Closes #202
Ref python-pillow/Pillow#7616

Co-authored-by: Sviatoslav Sydorenko <webknjaz@redhat.com>