How many of you are publishing Python packages to registries other than PyPI? I'm curious, as I've only ever published to PyPI.
At my company we use GCP Artifact Registry both as an internal PyPI proxy and as the registry we publish our (proprietary) Python wheels to.
I've used https://github.com/gorilla-co/s3pypi or some variant of it in a few places. Basically CloudFront + S3 instead of a dedicated artifact registry.
I've been publishing to Azure's private package repo, since their instructions were OK. I'd like some of these registries to publish instructions for uv as well, instead of just pip/twine.
At my job, we use AWS CodeArtifact to host the couple dozen internal libraries our Python and TypeScript projects depend on. I suspect this is a common use case for these kinds of artifact repositories.
What's your experience with AWS CodeArtifact? We're migrating to AWS and are unsure whether to use it or stick with our internal Nexus server.
To access a private CodeArtifact repository, you first have to fetch a short-lived token, then supply it as the password when you access the repo via npm/yarn, poetry, etc. It's an inconvenience, but one that can mostly be paved over with the AWS CLI or a shell alias.
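Roughly, the dance looks like this (the domain, account ID, region, and repo name below are placeholders, not anything real):

```sh
# 1. Fetch the short-lived token:
export CODEARTIFACT_AUTH_TOKEN=$(aws codeartifact get-authorization-token \
  --domain my-domain --domain-owner 111122223333 \
  --query authorizationToken --output text)

# 2. Use "aws" as the username and the token as the password in the index URL:
pip config set global.index-url \
  "https://aws:${CODEARTIFACT_AUTH_TOKEN}@my-domain-111122223333.d.codeartifact.us-east-1.amazonaws.com/pypi/my-repo/simple/"

# Or let the CLI do the wiring for you (the "paved over with the AWS CLI" part):
aws codeartifact login --tool pip --domain my-domain \
  --domain-owner 111122223333 --repository my-repo
```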
This quickly gets messy, though. We use AWS CDK and build our assets in a Docker container. Each time the token changes, Docker invalidates a bunch of layers and rebuilds the image. AWS CDK sees that and uploads a new .zip to S3 or a new image to ECR. Then Security Hub sees a new Lambda function or image, scans it, and carpet bombs my email whenever a CVE is found.
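The failure mode is roughly this (again with made-up names; a sketch, not our actual build script):

```sh
# The token is minted fresh for every build...
TOKEN=$(aws codeartifact get-authorization-token \
  --domain my-domain --domain-owner 111122223333 \
  --query authorizationToken --output text)

# ...and passed in as a build arg, so its value differs on every run. Docker
# invalidates every layer after the point where the ARG is used, the image
# digest changes, and CDK dutifully uploads a "new" asset even when nothing
# in the application code actually changed.
docker build --build-arg CODEARTIFACT_AUTH_TOKEN="$TOKEN" -t my-internal-service .
```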
It's ... not ideal.
I've published them to Artifactory's internal PyPI repo, and also to Gemfury. It all works decently.