
reduce client runtime #193

Open
jku opened this issue Jan 24, 2025 · 4 comments
Labels
enhancement New feature or request

Comments

@jku
Member

jku commented Jan 24, 2025

Based on #188 I started to look at why the test suite feels so slow:

  • I'm looking at test_verify_cpython_release_bundles (just because it contains most tests) using the selftest client on my laptop
  • each subtest takes ~0.9 seconds (there's significant variance though)
  • I've not profiled sigstore-python startup, but I estimate that about 0.4 seconds of that is module imports, since sigstore --help alone takes > 0.4 seconds.
  • The remaining ~0.5 seconds seems related to TUF: most of it disappears when the client uses --offline, which makes some sense, since without --offline there is a minimum of one TLS handshake and two HTTP requests per run.

So a significant issue is that we start 1700 client processes during the test: this likely means 1700 TLS handshakes and 3400 HTTP requests. There seem to be a couple of possible improvements:

  • sigstore client improvement: clients (like sigstore-python) could start keeping track of TUF update times and only update trusted_root.json if it's been more than (for example) 30 mins since last update
  • conformance client improvements: e.g. sigstore-python-conformance could use --offline argument (we would just have to make sure the client TUF cache is hot enough, this could be done by running the client once before the test suite runs)
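The first idea above, tracking TUF update times client-side, could be as simple as checking the age of the cached trusted_root.json before refreshing. A minimal sketch, assuming a file-based cache; the names CACHE_PATH and needs_refresh are illustrative and not part of sigstore-python's real API:

```python
import os
import time

# Hypothetical cache location; sigstore-python's actual cache layout differs.
CACHE_PATH = "trusted_root.json"
MAX_AGE_SECONDS = 30 * 60  # refresh at most every 30 minutes

def needs_refresh(path: str = CACHE_PATH, max_age: float = MAX_AGE_SECONDS) -> bool:
    """Return True if the cached file is missing or older than max_age."""
    try:
        mtime = os.path.getmtime(path)
    except OSError:
        return True  # no cache yet: a full TUF update is required
    return (time.time() - mtime) > max_age
```

A client would call needs_refresh() at startup and skip the TUF round-trips entirely when it returns False, which is what removes the ~0.5 seconds of network time per invocation.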
@jku jku added the enhancement New feature or request label Jan 24, 2025
@jku
Member Author

jku commented Jan 24, 2025

Any further improvement, in other words getting the subtest runtime below 0.4 seconds, would require some sort of IPC setup that avoids paying the startup cost (Python module imports, in the selftest case) 1700 times, but this seems like a lot of work.

@segiddins
Member

I think moving to IPC is actually the right move.
It's how, for example, the protobuf conformance suite works.

The test harness could feed in one command per line, for example, and a client could even do an exec per command if it wanted to. But it would open up the possibility of keeping a warmed-up, potentially JITed process running, instead of the high overhead inherent in the process spawning we're locked into right now.
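The line-per-line protocol described here can be sketched from the harness side: spawn one long-lived process, write a command per line, and read a result line back. This is a hypothetical illustration; the WORKER stand-in just echoes, where a real conformance client would parse each line as a sign/verify command:

```python
import subprocess
import sys

# Stand-in worker: reads one command per line, replies with one line.
# A real client would keep its (JITed / imported) runtime warm between commands.
WORKER = (
    "import sys\n"
    "for line in sys.stdin:\n"
    "    sys.stdout.write('ok: ' + line)\n"
    "    sys.stdout.flush()\n"
)

class PersistentClient:
    """Keeps one warmed-up client process alive for the whole test run."""

    def __init__(self):
        self.proc = subprocess.Popen(
            [sys.executable, "-u", "-c", WORKER],
            stdin=subprocess.PIPE,
            stdout=subprocess.PIPE,
            text=True,
        )

    def run(self, command: str) -> str:
        """Send one command line; block for the single-line reply."""
        self.proc.stdin.write(command + "\n")
        self.proc.stdin.flush()
        return self.proc.stdout.readline().rstrip("\n")

    def close(self):
        self.proc.stdin.close()
        self.proc.wait()
```

The startup cost (module imports, JIT warm-up) is then paid once per suite instead of once per subtest, which is the point being made above.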

@woodruffw
Member

Yep, agreed about IPC being a long-term requirement (at least for clients that have beefy runtimes, e.g. anything on the JVM).

Another thing we can do here for the runtime itself (but not necessarily the resources needed) is use xdist to parallelize the tests. That should result in faster runs, although potentially at the cost of a lot more parallel HTTPS traffic.

(One blocker for parallelization is that we currently use subtests for the CPython bundle tests, and subtests don't compose with xdist. But it wouldn't be very hard to remove subtests in favor of parametrization and parallelize from there.)
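The subtests-to-parametrization conversion mentioned here would look roughly like the following pytest sketch; BUNDLES and verify_bundle are illustrative stand-ins for the real CPython bundle list and verification call. Each bundle becomes its own collected test item, which pytest-xdist can then schedule across workers:

```python
import pytest

# Hypothetical inputs: the real suite would enumerate CPython release bundles.
BUNDLES = ["Python-3.12.0.tgz", "Python-3.12.1.tgz"]

def verify_bundle(name: str) -> bool:
    # Placeholder for the real sigstore verification of a release bundle.
    return name.endswith(".tgz")

# One test item per bundle, instead of one test containing many subtests.
@pytest.mark.parametrize("bundle", BUNDLES)
def test_cpython_bundle(bundle):
    assert verify_bundle(bundle)
```

Unlike subtests, parametrized items are independent in pytest's collection, so `pytest -n auto` (xdist) can distribute them, at the cost of the extra parallel HTTPS traffic noted above.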

@jku
Member Author

jku commented Feb 25, 2025

So a significant issue is that we start 1700 client processes during the test: this likely means 1700 TLS handshakes and 3400 HTTP requests. There seem to be a couple of possible improvements:

As an update on this:

  • the analysis was partly flawed: more than 50% of the 1700 subtests were no-ops: CPython artifacts without sigstore signatures were counted as subtests but did not lead to any requests
  • refactor test_verify_cpython_release_bundles #195 reduced the number of actual subtests to ~70. This number will keep growing slowly as CPython releases are made (but we can start testing only every other release, or something similar, if this is still an issue)
