uv with pytest and filterwarnings triggers a host of SyntaxWarnings as errors; pip-tools does not #1559
Comments
One thing I notice is that in Test 4, you're using a different Python version (Python 3.11.7, while the rest use Python 3.10.13). That might be why it's an error, and not a warning?
I've gone and cleaned up my tests. It is reproducible now. Please see But for full disclosure, here is my full run starting with
Is this a duplicate of #1928?
It looks like the same thing to me.
Add a `--compile` option to `pip install` and `pip sync`.

I chose to implement this as a separate pass over the entire venv. If we wanted to compile during installation, we'd have to make sure that writing is exclusive, to avoid concurrent processes writing broken `.pyc` files. Additionally, this ensures that the entire site-packages directory is bytecode-compiled, even if there are packages that aren't from this `uv` invocation. The disadvantage is that we do not update RECORD and rely on this comment from [PEP 491](https://peps.python.org/pep-0491/):

> Uninstallers should be smart enough to remove .pyc even if it is not mentioned in RECORD.

If this is a problem, we can change it to run during installation and write RECORD entries.

Internally, this is implemented as an async work-stealing subprocess worker pool. The producer is a directory traversal over site-packages, sending each `.py` file to a bounded async FIFO queue/channel. Each worker has a long-running Python process. It pops the queue to get a single path (or exits if the channel is closed), sends it to stdin, waits until it's informed that the compilation is done through a line on stdout, and repeats. This is fast: e.g. installing `jupyter plotly` on Python 3.12, it processes 15876 files in 319ms with 32 threads (vs. 3.8s with a single core). The Python process internally calls `compileall.compile_file`, the same as pip. Like pip, we ignore and silence all compilation errors (#1559). There are 10s and 1s timeouts to ensure we don't get stuck when the Python subprocess doesn't work properly or there was a panic breaking tokio.

For the reviewers, please check if I missed any spots where we could deadlock; this is the hardest part of this PR. I still have to check that Windows works.

I don't think we want an option to compile the seed packages in the `venv` subcommand, do we?

Fixes #1788 Closes #1559 Closes #1928
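To make the shape of that pipeline concrete, here is an illustrative asyncio sketch of the producer/worker layout. uv's real implementation is in Rust on tokio; the script name, worker count, and queue size below are made-up placeholders, and the work distribution here is a simple shared queue rather than true work-stealing.

```python
import asyncio
import sys
from pathlib import Path

WORKER_SCRIPT = "compile_worker.py"  # hypothetical helper: reads paths on stdin, replies one line per file


async def worker(python: str, queue: "asyncio.Queue[str | None]") -> None:
    # Each worker owns one long-running Python subprocess.
    proc = await asyncio.create_subprocess_exec(
        python, WORKER_SCRIPT,
        stdin=asyncio.subprocess.PIPE,
        stdout=asyncio.subprocess.PIPE,
    )
    while (path := await queue.get()) is not None:
        proc.stdin.write(f"{path}\n".encode())  # send one path
        await proc.stdin.drain()
        await proc.stdout.readline()            # wait for the "done" line
    proc.stdin.close()
    await proc.wait()


async def compile_site_packages(site_packages: Path, n_workers: int = 8) -> None:
    queue: asyncio.Queue = asyncio.Queue(maxsize=64)  # bounded FIFO channel
    workers = [asyncio.create_task(worker(sys.executable, queue)) for _ in range(n_workers)]
    for py_file in site_packages.rglob("*.py"):       # producer: walk site-packages
        await queue.put(str(py_file))
    for _ in workers:
        await queue.put(None)                         # one "channel closed" sentinel per worker
    await asyncio.gather(*workers)


# Example usage (path is illustrative):
# asyncio.run(compile_site_packages(Path(".venv/lib/python3.12/site-packages")))
```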
Gonna merge into that issue.
Add a `--compile` option to `pip install` and `pip sync`.

I chose to implement this as a separate pass over the entire venv. If we wanted to compile during installation, we'd have to make sure that writing is exclusive, to avoid concurrent processes writing broken `.pyc` files. Additionally, this ensures that the entire site-packages directory is bytecode-compiled, even if there are packages that aren't from this `uv` invocation. The disadvantage is that we do not update RECORD and rely on this comment from [PEP 491](https://peps.python.org/pep-0491/):

> Uninstallers should be smart enough to remove .pyc even if it is not mentioned in RECORD.

If this is a problem, we can change it to run during installation and write RECORD entries.

Internally, this is implemented as an async work-stealing subprocess worker pool. The producer is a directory traversal over site-packages, sending each `.py` file to a bounded async FIFO queue/channel. Each worker has a long-running Python process. It pops the queue to get a single path (or exits if the channel is closed), sends it to stdin, waits until it's informed that the compilation is done through a line on stdout, and repeats. This is fast: e.g. installing `jupyter plotly` on Python 3.12, it processes 15876 files in 319ms with 32 threads (vs. 3.8s with a single core). The Python process internally calls `compileall.compile_file`, the same as pip. Like pip, we ignore and silence all compilation errors (#1559). There is a 10s timeout to handle the case where the workers get stuck.

For the reviewers, please check if I missed any spots where we could deadlock; this is the hardest part of this PR.

I've added `uv-dev compile <dir>` and `uv-dev clear-compile <dir>` commands, mainly for my own benchmarking. I don't want to expose them in `uv`; they're almost certainly not the correct workflow and we don't want to support them.

Fixes #1788 Closes #1559 Closes #1928
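The per-worker helper process could be as small as the following sketch: it reads one path per line from stdin, byte-compiles it with `compileall.compile_file` while silencing warnings (as pip does), and acknowledges each file with one line on stdout. This is only an illustration of the protocol described above, not uv's actual helper script.

```python
import compileall
import sys
import warnings


def main() -> None:
    for line in sys.stdin:                       # one file path per line
        path = line.rstrip("\n")
        if not path:
            continue
        with warnings.catch_warnings():
            warnings.filterwarnings("ignore")    # silence SyntaxWarnings etc., like pip
            # quiet=2 suppresses output; compilation failures are ignored (#1559)
            ok = compileall.compile_file(path, quiet=2)
        # exactly one line of output per input path tells the caller this file is done
        print(path if ok else f"failed:{path}", flush=True)


if __name__ == "__main__":
    main()
```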
I've made a repo to demonstrate this, as it tests a few setups. From the README:

Unfortunately, right after writing this up, I can no longer trigger the bug!

I'll continue trying to reproduce it, but I felt I've done enough testing to warrant writing an issue. Maybe the Astral team can already see what's going on.
uv version 0.1.2.
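For anyone landing here later, the following self-contained sketch (hypothetical module name, not taken from the reporter's repo) shows the underlying mechanism: compile-time warnings such as SyntaxWarning are emitted only when the source is byte-compiled, so an installer that skips writing `.pyc` files defers that compilation to the first import, where pytest's `filterwarnings = error` turns it into an error.

```python
"""Illustrative sketch: why a missing .pyc can turn a compile-time warning into a test error."""
import compileall
import importlib
import pathlib
import sys
import warnings

sys.path.insert(0, str(pathlib.Path.cwd()))

# A module whose source emits a warning when it is byte-compiled
# ("\d" is an invalid escape sequence; the exact warning class depends
# on the Python version, so we just report whatever propagates).
pathlib.Path("bad_escape.py").write_text('PATTERN = "\\d+"\n')


def try_import() -> None:
    sys.modules.pop("bad_escape", None)
    importlib.invalidate_caches()
    with warnings.catch_warnings():
        warnings.simplefilter("error")  # roughly what pytest's `filterwarnings = error` does
        try:
            import bad_escape  # noqa: F401
            print("import OK (cached bytecode used, no compile step)")
        except Exception as exc:
            print(f"import failed: {type(exc).__name__}: {exc}")


try_import()  # no .pyc yet: source is compiled at import time -> warning -> error

# Byte-compiling at install time (pip's default, or `uv pip install --compile`)
# writes the .pyc under ordinary warning filters and swallows the warning ...
with warnings.catch_warnings():
    warnings.filterwarnings("ignore")
    compileall.compile_file("bad_escape.py", quiet=2)

try_import()  # ... so the test process loads cached bytecode and never warns
```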