Makefile updates and tools for requirements.txt and wheels #23
Conversation
Please remember that the
Rebasing on latest master (5411e3f) to review.
Makefile
Outdated
.PHONY: build-wheels
build-wheels: syncwheels ## Builds the wheels and syncs to the localwheels directory
./scripts/build-sync-wheels -r ${PKG_DIR}/requirements.txt
./scripts/build-sync-wheels -r ${PKG_DIR}/requirements-dev.txt
we don't need this right?
The build-wheels action is to help us build all the dependencies into wheels. We do need this.
sorry for lack of clarity, i meant we don't need the dev requirements files specifically, since we're not using them in the build
Yup, and I removed that file. I forgot to update the comment here :(
Makefile
Outdated
.PHONY: build-wheels
build-wheels: syncwheels ## Builds the wheels and syncs to the localwheels directory
./scripts/build-sync-wheels -r ${PKG_DIR}/requirements.txt
so we don't need the requirements*.txt files in any of the proxy or client repos if this is merged, correct?
or do we use this script to generate those files and then commit them in the upstream repository? we should document this in the README if so
The requirements*.txt files will be stored in git and will also be part of the source release. pip will use these files to do the final install for the Debian package. I will update the guide and README for this.
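As a rough sketch of what that final install step could look like, installing strictly from the hash-pinned requirements file and the local wheels (the paths here are assumptions, not taken from this PR):

```bash
# Sketch only: install from the committed, hash-pinned requirements file
# using the local wheels directory instead of upstream PyPI.
# Paths are assumptions, not the exact ones used by the packaging scripts.
pip install --no-index --find-links ./localwheels/ \
    --require-hashes -r requirements.txt
```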
scripts/create-requirements
Outdated
# Now create both of the files
pipenv lock -r > requirements.txt
pipenv lock -d -r > requirements-dev.txt
remove this line?
We are keeping both of these files. But as I now understand it, you don't want to keep/track those development dependencies in our PyPI/setup at all, so I can remove these.
scripts/sync-sha256sums
Outdated
set -o pipefail

cd ./localwheels/
sha256sum * > ../sha256sums.txt
let's document in the README that this file should be signed
Yup, I will do that.
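For the README, a hedged example of what generating and signing the sums file might look like; the key ID below is a placeholder, not a real project key:

```bash
# Sketch only: generate the sums file, then create a detached,
# ASCII-armored signature next to it. MAINTAINER_KEY_ID is a placeholder.
cd ./localwheels/
sha256sum * > ../sha256sums.txt
cd ..
gpg --armor --detach-sign --local-user MAINTAINER_KEY_ID sha256sums.txt
# produces sha256sums.txt.asc alongside sha256sums.txt
```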
scripts/update-requirements
Outdated
# First remove index line and any PyQt or sip dependency
cleanup(req_path)
cleanup(reqdev_path)
remove
Okay.
scripts/update-requirements
Outdated
# Now let us update the files along with the sha256sums from localwheels
add_sha256sums(req_path)
add_sha256sums(reqdev_path)
remove (this applies to any dev requirements file)
Will be done.
Those steps should absolutely be automated. The "syncwheels" terminology presently in the repo should perhaps be called "fetch-wheels", since it doesn't actually synchronize anything. Detailed instructions for syncing the wheels can be found in freedomofpress/securedrop-debian-packaging-guide#6; those AWS S3 commands should absolutely be pulled into this repo's README, and ideally into tooling, to bridge gaps in the dev workflow.
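A hypothetical sketch of the two one-way operations; the bucket name is a placeholder, and the authoritative commands live in the linked guide issue:

```bash
# Sketch only: bucket name is a placeholder.
# Push newly built wheels up to the shared bucket:
aws s3 sync ./localwheels/ s3://example-securedrop-wheels/localwheels/
# Fetch the published wheels back down into this packaging repo:
aws s3 sync s3://example-securedrop-wheels/localwheels/ ./localwheels/
```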
scripts/create-requirements
Outdated
# Now create both of the files
pipenv lock -r > requirements.txt
pipenv lock -d -r > requirements-dev.txt
Having the requirements.txt files generated (and later committed) in a separate repo strikes me as a bad pattern. Let's instead pull those generated reqs files into this repo, and commit them here.
Those reqs files also must have hashes included (from the Pipfile.lock is fine).
The `requirements.txt` file is part of the actual Python project, and all Python projects should include one. This repository only provides tooling and files related to Debian packaging.
So the Python wheel builder steps are:
Important to add, as discussed this morning: this should then bail if the SHA256 hashes (of the tarballs only) do not match what is in Pipfile.lock in the project to be packaged.
These steps, if you agree (if not, please correct them when you add them), we should add to the README.
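As a sketch of that bail-out check, assuming the standard Pipfile.lock layout and that the downloaded tarballs sit in a hypothetical ./tarballs/ directory (both are assumptions, not paths from this PR):

```bash
# Sketch only: bail if a downloaded tarball's SHA256 is not listed in
# Pipfile.lock. ./tarballs/ is a placeholder path; jq is assumed available.
set -e
for tarball in ./tarballs/*.tar.gz; do
    digest="sha256:$(sha256sum "$tarball" | cut -d ' ' -f 1)"
    if ! jq -r '.default[].hashes[]' Pipfile.lock | grep -qx "$digest"; then
        echo "ERROR: $tarball does not match any hash in Pipfile.lock" >&2
        exit 1
    fi
done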
The other item we need to add is logic in scripts/build-debianpackage to bail if the signature on the document containing the SHA256sums of the built wheels does not verify.
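That could be as small as something like the following near the top of scripts/build-debianpackage; the verification script name below is a stand-in, not the final name:

```bash
# Sketch only: abort the Debian package build early if the signature check
# fails. ./scripts/verify-sha256sums-signature is a stand-in name.
if ! ./scripts/verify-sha256sums-signature; then
    echo "ERROR: signature on sha256sums.txt did not verify; refusing to build." >&2
    exit 1
fi
```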
Force-pushed from 967102f to 0f7edbd.
These scripts will help a project maintainer generate the right requirements.txt and requirements-dev.txt files.
There are a few new commands: `make requirements` creates the requirements files for a project, and `make build-wheels` creates new wheels and syncs them locally.
Force-pushed from 0f7edbd to 13d5907.
We first download and verify the source tarballs based on the upstream PyPI hashes (from Pipfile.lock). Then we verify the sha256sums.txt file (gpg verify), and using that file, we update the hashes in the requirements.txt file. Finally, before the Debian package build, we again gpg verify the sha256sums.txt file and check that each hash value inside the requirements.txt file is present in sha256sums.txt.
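The last check described there might look roughly like this, assuming the `--hash=sha256:...` format that pip uses in requirements files:

```bash
# Sketch only: every sha256 digest referenced in requirements.txt must be
# present in the (already gpg-verified) sha256sums.txt.
for digest in $(grep -oE 'sha256:[0-9a-f]{64}' requirements.txt | sed 's/^sha256://'); do
    if ! grep -q "$digest" sha256sums.txt; then
        echo "ERROR: $digest from requirements.txt not found in sha256sums.txt" >&2
        exit 1
    fi
done
```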
Force-pushed from 13d5907 to ebbfa90.
Makefile
Outdated
./scripts/update-requirements

.PHONY: build-wheels
build-wheels: syncwheels ## Builds the wheels and syncs to the localwheels directory
at this step we need to dump the current shasums of what we just downloaded from S3 into a new sha256sums-downloaded.txt file, then verify the sig on the existing sha256sums.txt, and diff the two files. Otherwise we could end up signing the hash of a compromised wheel from S3.
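In shell terms, roughly the following; the .asc filename is an assumption, and a stricter keyring setup for the verify step is discussed further down:

```bash
# Sketch of the check described above: hash what was just fetched from S3,
# verify the signature on the committed sums file, then diff the two.
cd ./localwheels/
sha256sum * > ../sha256sums-downloaded.txt
cd ..
gpg --verify sha256sums.txt.asc sha256sums.txt
diff sha256sums.txt sha256sums-downloaded.txt || {
    echo "ERROR: wheels fetched from S3 do not match the signed sha256sums.txt" >&2
    exit 1
}
```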
The operation is strictly one-way, downloading the remote wheels from S3 and storing them in the `localwheels/` dir inside the packaging repository. For clarity, renaming the action to distinguish it from the *other* one-way action, pushing new wheels up to S3 (not yet codified). Also implements an optimization on the fetching operation, via the `--continue-at` flag to curl. Essentially the files will not be redownloaded if they exist locally. It's OK to skip a full download, since we'll be verifying integrity of the contents of the S3 buckets via a SHA256SUMS file, with detached sig, inside this packaging repo.
This option is used by default if hashes are present for the dependencies in the requirements.txt file. We specify it anyway to guard against use of a requirements.txt file without hashes.
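For reference, a hash-pinned entry in a requirements file looks like this; the package name and digest below are made-up placeholders:

```
# Example requirements.txt entry; name and digest are placeholders.
somepackage==1.2.3 --hash=sha256:aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa
```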
The sha256sums file is used to verify the integrity of assets pulled in from S3; in order to trust the contents of the sha256sums file itself, we'll verify it via a detached, ASCII-armored GPG signature, stored in this packaging repository. We're creating a temporary keyring to perform the verification, otherwise gpg will report success if *any* key in the local keyring was used to create the detached sig; that's not what we want. Instead, only a small list of pubkeys should be considered valid, so we'll store those pubkeys here. Sprinkles the new script logic in a few locations, where we want to check the integrity; pending review, we may revise the locations of these calls out to the detached sig verification.
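A sketch of that verification under the assumption that the pubkeys and the detached signature live in this repo; the exact paths and filenames are placeholders:

```bash
# Sketch only: verify against a temporary keyring containing just the
# pubkeys shipped in this repo. Paths and filenames are placeholders.
TMP_GPG_HOME="$(mktemp -d)"
gpg --homedir "$TMP_GPG_HOME" --import ./pubkeys/*.asc
gpg --homedir "$TMP_GPG_HOME" --verify sha256sums.txt.asc sha256sums.txt
result=$?
rm -rf "$TMP_GPG_HOME"
exit $result
```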
Manually checking for the local file and skipping if found. As comment indicates, `--continue-at -` doesn't work with S3.
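Roughly along these lines; the URL and filename variable are placeholders:

```bash
# Sketch of the skip-if-present logic, since `--continue-at -` is not
# reliable against S3. URL and ${filename} are placeholders.
if [ -f "./localwheels/${filename}" ]; then
    echo "Skipping ${filename}: already present locally."
else
    curl -L -o "./localwheels/${filename}" \
        "https://example-bucket.s3.amazonaws.com/${filename}"
fi
```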
Since we have a detached sig file on the SHA256SUMS list, we can check the assets fetched from S3 to ensure they match expectations.
Must come after the `--no-binary` option, since `:all:` is a `format_control` arg to `--no-binary`. Simple reorder in the command concatenation fixes.
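For example, the working order looks like this; a sketch only, and the destination directory is an assumption rather than the exact command in the script:

```bash
# Sketch: `:all:` must immediately follow `--no-binary`, since it is that
# option's format-control argument. ./tarballs/ is a placeholder.
pip download --no-binary :all: -r requirements.txt -d ./tarballs/
```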
Force-pushed from ae79e1c to 2e5220c.
Tested locally and approving based on these latest changes
This PR has a few new scripts and related Makefile updates. @conorsch I am guessing you will update the Makefile bit to make it a bit more professional :)
These scripts and Makefile targets will help our package maintainers to create `requirements.txt` and `requirements-dev.txt` files for the actual release. Those files will contain the sha256sums for the binary wheels from our PyPI (the s3 bucket).

How to test?
First, let us create/update the files; this requires the `PKG_DIR` environment variable (the full flow is sketched below).

This command will update the requirements/versions from the upstream PyPI. If there is any new update, or if any of the needed binary wheels are missing, this will fail. It will also print the next command a developer can use to generate the missing wheels.
If such missing wheels are there (you can delete any related wheel to test this), the above-mentioned command will create the new wheels.
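Based on the Makefile targets described earlier in this PR, the flow is roughly the following; the project path is a placeholder:

```bash
# Sketch of the test flow; the project path is a placeholder.
export PKG_DIR=/path/to/project-to-package
make requirements     # create/update requirements files; fails if wheels are missing
make build-wheels     # build the missing wheels and sync them into ./localwheels/
```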
After this, remember to sync the `localwheels` directory to the s3 bucket, and also update the HTML index files and the `sha256sums.txt` file as required. Also remember to deploy any new wheel+index to the s3 bucket.

The `sha256sums.txt` file should always get new entries, but there should not be any update to any existing wheels or hashes. If there is any such change, it means someone rebuilt a wheel which is already in our s3/PyPI index. So, from the security aspect, we will have to check for that in CI on each commit.
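A CI guard along these lines could catch that; a sketch only, and the base branch name is an assumption:

```bash
# Sketch: fail CI if any existing line in sha256sums.txt was modified or
# removed; only additions are allowed. Base branch name is an assumption.
git fetch origin master
if git diff origin/master -- sha256sums.txt | grep -E '^-[^-]'; then
    echo "ERROR: an existing entry in sha256sums.txt was modified or removed" >&2
    exit 1
fi
```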