I've used 'em all: pip + virtualenv, conda (and all its variants), Poetry, PDM (my personal favorite before switching to uv). Uv handles everything I need in a way that means I don't have to reach for other tools, or really even think about what uv is doing. It just works, and it works great.
I even use it for small scripts. You can run "uv init --script <script_name.py>" and then "uv add package1 package2 package3 --script <script_name.py>". This adds an oddly formatted comment to the top of the script and instructs uv which packages to install when you run it. The first time you run "uv run <script_name.py>," uv installs everything you need and executes the script. Subsequent executions use the cached dependencies so it starts immediately.
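For example, the whole flow looks like this (script name and packages are just placeholders):

```bash
uv init --script fetch_quotes.py
uv add requests rich --script fetch_quotes.py
uv run fetch_quotes.py   # first run resolves and caches; later runs start instantly
```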
If you're going to ask me to pitch you on why it's better than your current preference, I'm not going to do that. Uv is very easy to install and test; I really recommend giving it a try on your next script or pet project!
And in case it wasn't clear to readers of your comment, uv run script.py creates an ephemeral venv and runs your script in that, so you don't pollute your system env or whatever env you happen to be in.
As I understand it, recent versions of PyTorch have made this process somewhat easier, so maybe it's worth another try.
uv sync --extra gpu
uv add matplotlib # the sync this runs undoes the --extra gpu
uv sync # oops also undoes all the --extra
What you have to do to avoid this is remember to use --no-sync all the time and then meticulously sync by hand, remembering all the extras you actually want right now:
uv add --no-sync matplotlib
uv sync --extra gpu --extra foo --extra bar
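The best I've come up with is aliasing the exact sync I want; a personal workaround, not a uv feature:

```bash
# bake the extras into an alias so I stop forgetting them
alias uvsync='uv sync --extra gpu --extra foo --extra bar'
alias uvadd='uv add --no-sync'
```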
It's just so... tedious and kludgy. It needs an "extras.lock" or "sync.lock" or something. I would love it if someone tells me I'm wrong and missing something obvious in the docs.

1. Create or edit the UV configuration file in one of these locations:
- `~/.config/uv/config.toml` (Linux/macOS)
- `%APPDATA%\uv\config.toml` (Windows)
2. Add a section for default groups to sync:
```toml
[sync]
include-groups = ["dev", "test", "docs"] # Replace with your desired group names
```
Alternatively, you can do something similar in pyproject.toml if you want to apply this to the repo:
```toml
[tool.uv]
sync.include-groups = ["dev", "test", "docs"] # Replace with your desired group names
```
What I am struggling with is what you get after following the Configuring Accelerators With Optional Dependencies example:
https://docs.astral.sh/uv/guides/integration/pytorch/#config...
Part of what that does is set up rules that prevent simultaneously installing cpu and gpu versions (which isn't possible). If you use the optional dependencies example pyproject.toml then this is what happens:
$ uv sync --extra cpu --extra cu124
Using CPython 3.12.7
Creating virtual environment at: .venv
Resolved 32 packages in 1.65s
error: Extras `cpu` and `cu124` are incompatible with the declared conflicts: {`project[cpu]`, `project[cu124]`}
And if you remove the declared conflict, then uv ends up with two incompatible sources to install the same packages from:
uv sync --extra cpu --extra cu124
error: Requirements contain conflicting indexes for package `torch` in all marker environments:
- https://download.pytorch.org/whl/cpu
- https://download.pytorch.org/whl/cu124
After your comment I initially thought that perhaps the extras might be rewritten as dependency groups somehow, to use the ~/.config/uv/config.toml approach, but according to the docs dependency groups are not allowed to conflict with each other and must all be installable simultaneously (makes sense, since there is an --all-groups flag).
https://docs.astral.sh/uv/concepts/projects/dependencies/#pl...
Not sure if it's as granular as you might need
```toml
[tool.uv.sources]
torch = [{ index = "pytorch-cu124", marker = "sys_platform == 'win32'" }]
```
For the curious, the format is codified here: https://peps.python.org/pep-0723/
Tried uv for the first time and it was down to seconds.
I cancelled one at 4 minutes before switching to uv and having it finish in a few seconds
Does uv work with Jupyter notebooks too? When I used it a while ago dependencies were really annoying compared to Livebook with that Mix.install support.
I once investigated whether this feature could be integrated into Mix as well, but it wasn't possible since hex.pm doesn't provide release timestamps for packages.
> Does uv work with Jupyter notebooks too?
Yes![2]
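For example, the guide's one-liner starts a notebook server in an on-the-fly environment:

```bash
uv run --with jupyter jupyter lab
```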
[1] https://docs.astral.sh/uv/guides/scripts/#improving-reproduc... [2] https://docs.astral.sh/uv/guides/integration/jupyter/
Having said that, there are 2 areas where we still need conda:
- uv doesn't handle non-Python packages, so if you need to use something like MKL, no luck
- uv assumes that you want to use one env per project. However with complex projects you may need to use a different env with different branches of your code base. Conda makes this easy - just activate the conda env you want — all of your envs can be stored in some central location outside your projects — and run your code. Uv wants to use the project toml file and stores the packages in .venv by default (which you don’t want to commit but then need different versions of). Yes you can store your project venv elsewhere with an env var but that’s not a practical solution. There needs to be support for multiple .toml files where the location of the env can be specified inside the toml file (not in an env var).
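(For reference, the env var I mean is UV_PROJECT_ENVIRONMENT; it works per invocation, but it's clumsy as a daily workflow. The paths below are invented:)

```bash
# point uv at a centrally stored env instead of ./.venv
export UV_PROJECT_ENVIRONMENT="$HOME/envs/myproj-gpu"
uv sync
uv run python train.py
```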
That's if you are lucky and you don't have to build them yourself, because the exceptionally gifted person who packaged them didn't know how to distribute them, and the bright minds running PyPI.org allowed that garbage to be uploaded and made it so pip would install it by default.
> can replace pipx with "uv tool install,"
That's a stupid idea. Nobody needed pipx in the first place... The band-aid that was applied some years ago is now cast in stone...
The whole idea of Python tools trying to replace virtual environments, but doing it slightly better, is moronic. Virtual environments are the band-aid. They need to go. The Python developers need to be pressured into removing this garbage and instead working on program manifests or something similar. Python has virtual environments due to the incompetence of its authors and their unwillingness to make things right once that incompetence was discovered.
----
NB. As it stands today, if you want to make your project work well, you shouldn't use any tools that install packages by solving dependencies and downloading them from PyPI. The problem isn't the function of the tool doing that; it's the bad design of the index.
The reasonable thing to do is to install the packages (for applications) you need during development, figure out what you actually need, and then store the part you need for your package to work locally. Only repeat this process when you feel the need to upgrade.
If you need packages for libraries, then you need a way to install various permutations within the allowed versions: no package-installation tool today knows how to do that. So you might as well not use any anyway.
But the ironic part is that nobody in the Python community does it right. And that's why there are tons of incompatibilities, and the numbers increase dramatically as projects age even slightly.
I think all the other projects (pyenv, poetry, pip, etc.) should voluntarily retire for the good of Python. If everyone moved to Uv right now, Python would be in a far better place. I'm serious. (It's not going to happen though because the Python community has no taste.)
The only very minor issue I've had is that once or twice the package cache invalidation hasn't worked correctly and `uv pip install` installed an outdated package until I ran `uv cache clean`. Not a big deal though, considering it solves so many Python clusterfucks.
UV is such a smooth UX that it makes you wonder how something like it wasn’t part of Python from the start.
…but we did have to wait for cargo, npm (I include yarn and pnpm here) and maybe golang to blaze the ‘this is how it’s done’ trail. Obvious in hindsight.
No sillier than the other various package managers.
I had to give up on mypy and move to pyright because mypy uses pip to install missing type stubs and they refuse to support uv. In the CI pipeline where I use uv, I don't have pip installed, so mypy complains about the missing pip.
Of course I could do it myself by adding the typing packages to a requirements.txt file, but then what's the point of the dev tooling? And I don't want a requirements.txt when I already have pyproject.toml.
Once you get used to cargo from rust, you just can't tolerate shitty tooling anymore. I used to think pip was great (compared to C++ tooling).
I wouldn't get too used to it.
To maintain a successful fork, not only are you going to need to find people who volunteer for maintaining a fork at that scale (including a large user base due to popularity), you’ll need to find skilled Rust developers, too.
That’s going to be immensely difficult.
I have never used virtual environments well -- the learning curve after dealing with python installation and conda/pip setup and environment variables was exhausting enough. Gave up multiple times or only used them when working through step wise workshops.
If anyone can recommend a good learning resource - would love to take a stab again.
The only thing it does is make bad things happen faster. Who cares...
Basically it handles the virtual environments for you so you don't have to deal with their nonsense.
But you're right it doesn't fix it in the same way that Deno did.
Well yeah maybe if the PSF were able to get their shit together it wouldn't have taken a single third party vendor to do it for them. But they weren't and it did, so here we are.
I have already setup CI/CD pipelines for programs and python libraries. Using uv would probably save some time on dependency updates but it would require changing my workflow and CI/CD. I do not think it is worth the time right now.
But if you use older environments without a proper lock file, I would recommend switching immediately. Poetry v2 supports a pyproject.toml close to the format used by uv, so I can switch anytime it looks more appealing.
Another thing to consider in the long term is how Astral's tooling will change when they need to make money.
uv will defer to any python it finds in PATH as long as it satisfies your version requirements (if any):
https://docs.astral.sh/uv/concepts/python-versions/
It also respects any virtual environment you've already created, so you can also do something like this:
/usr/bin/python3 -m venv .venv
.venv/bin/pip install uv
.venv/bin/uv pip install -r requirements.txt # or
.venv/bin/uv run script ...
It's a very flexible and well thought out tool and somehow it manages to do what I think it ought to do. I rarely need to go to its documentation.

> Using uv would probably save some time on dependency updates but it would require changing my workflow and CI/CD.
I found it very straightforward to switch to uv. It accommodates most existing workflows.
I get that uv does both, but I'm very happy with pyenv+poetry combo.
Old baggage, but I came from the rvm world, which attempted to do exactly what uv does, but rvm was an absolute mess in 2013. rbenv+bundler solved so many problems for me, and the rvm experience was so bad that when I saw uv my gut reaction was to say "never again".
But this thread has so many praises for it, so one day maybe I'll give it a try.
IIRC, uv downloads dynamically linked builds of Python, which may or may not work depending on your distribution and whether linked libraries are locally available or not. Not sure if things have changed in recent times.
E.g.:
#!/usr/bin/env python3
# /// script
# requires-python = ">=3.11"
# dependencies = [
#     "psycopg2-binary",
#     "pyyaml",
# ]
# ///
Then: uv run -s file.py
import os
print(os.environ['VIRTUAL_ENV'] + '/bin/python')
Then, e.g. in VS Code, you bring up the command palette, run Python: Select Interpreter, and enter the result.

#!/usr/bin/env uv run --script
> Using uv as your shebang line
— https://news.ycombinator.com/item?id=42855258
Since `env` doesn’t pass multiple arguments by default, the suggested line uses `-S`:
#!/usr/bin/env -S uv run --script
#!/home/tetha/Tools/uv run
#!/usr/bin/env uv run
In what I’ve done, I’ve never found things like pipenv, let alone uv, to be necessary. Am I missing something? What would uv get?
The selling point of uv is that it does things faster than the tools it aims to replace, but on a conceptual level it doesn't add anything substantially new. The tools it aims to replace were borne of the defects in Python import and packaging systems (something that Anaconda also tried to address, but failed). They are not good tools designed to do things the right way. They are band-aids designed to mitigate some of the more common problems stemming from the bad design choices in the imports and packaging systems.
My personal problem with tools like uv is that, just like Web browsers in the early days of the Web tried to win users by tolerating the mistakes made by Web site authors, it delays the solution of the essential problems in Python's infrastructure by offering some pain relief to those who use the band-aid tools.
Uv makes python go from "batteries included" to "attached to a nuclear reactor"
It replaces a whole stack, and does each feature better, faster, with fewer modes of failure.
It sounds like uv should replace the combination. Of course there is the risk of this being another case of the python community ritually moving the problem every few years without properly solving it. But it sounds like uv is mostly doing the right thing; which is making global package installation the exception rather than the default. Most stuff you install should be for the project only unless you tell it otherwise.
Will give this a try next time I need to do some python stuff.
We use poetry at work, but getting it to play nice with PyTorch is always a bit of an art. I tried to get into Pixi, but have been a little annoyed, as it seems to have inherited conda's issues when mixing conda and PyPI.
uv so far has been relatively smooth sailing, and they even have an entire section on using it with PyTorch: https://docs.astral.sh/uv/guides/integration/pytorch/
I also use mise with it, which is a great combination and gives you automatic venv activation among other things.
See, among other mise docs related to Python, https://mise.jdx.dev/mise-cookbook/python.html
See also a Python project template I maintain built on mise + uv: https://github.com/level12/coppy
I think the current status quo, that of mise utilizing uv for its Python integration support, makes sense, and I don't see that changing.
Also, FWIW, mise has other methods for Python integration support, e.g. pyenv, virtualenv, etc.
Edit:
Ha... Didn't realize who I was replying to. Don't need me to tell you anything about mise. I apparently misinterpreted your comment.
and btw mise's venv support isn't going anywhere probably ever, but I do hope that at some point we could either let uv do the heavy lifting internally or point users to uv as a better solution
In particular, we use flask-vite and it's so nice to be able to have the right version of Node specified in the same management system as we specify the Python version. This solved a not insignificant amount of angst around FE development for me personally since I spend most of my time in the BE.
It's not like it was insurmountable before. But now, with mise, it's in that "just works" category for me.
The PSF would probably dig up some old posts from the uv authors, defame them, take the code and make it worse.
I have been pretty pleased with uv, but I am continually worried about the funding model. What happens when the VC starts demanding a return?
I'd bet that the sort of person who is maintaining packaging for a bunch of Python users would like an opportunity to learn Rust on the job. I would.
https://docs.astral.sh/uv/guides/scripts/#declaring-script-d...
You can use an alternate shebang line so you can run the script directly:
#!/usr/bin/env -S uv run --script
# /// script
# requires-python = ">=3.12"
# dependencies = [
#     "requests",
#     "typer-slim",
# ]
# ///
import requests
import typer
# ...
Uv takes the position that since it’s so fast to install dependencies and create environments, you don’t maintain a global venv.
uvx ruff file.py
Will set up a venv, install ruff into it, and run it over your file.
https://docs.astral.sh/uv/guides/tools/
Otherwise you can:
uv run --with package file.py
If you don't want to declare your dependencies.
https://docs.astral.sh/uv/guides/scripts/#running-a-script-w...
Or better, do the above, then create a virtual env, set the virtual env in your .bashrc and install everything into that
Better still... use uv init --script. See other comments on this post
alias newpy='function _newpy() {
  default_deps=("rich" "requests");
  uv init --script "$1" && uv add "${default_deps[@]}" --script "$1";
}; _newpy'
Then all the python dependencies are managed with uv.
For a non-python project which needs a python-based CLI tool, I'm not sure if I'd use mise or uv (uvx).
Right now, the only thing I really want is dependency pinning in wheels but not pyproject.toml, so I can pip install the source and get the latest and greatest, or I can pip install a wheel and get the frozen dependencies I used to build the wheel. Right now, if I want the second case, I have to publish the requirements.txt file and add the wheel to it, which works but is kind of awkward.
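Concretely, the awkward flow I mean looks something like this (file and package names are examples):

```bash
# at build time: freeze the exact dependency set, then append the wheel to it
pip freeze --exclude-editable > requirements.txt
echo "./dist/mypkg-1.0-py3-none-any.whl" >> requirements.txt
# consumers who want the frozen build then run:
pip install -r requirements.txt
```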
I don't need to be told to RTFM. I was asking for advice. My attention span is my most valuable commodity, and since I'm not really surprised or slowed down by setuptools, etc., it sounds like uv probably isn't worth investigating.
Thanks.
To answer my own question—and to actually help other people with similar use cases—I read about uv's build process and dependency locking. It does not appear to be able to lock dependencies for built distributions (wheels).
https://docs.astral.sh/uv/concepts/projects/build/
https://docs.astral.sh/uv/pip/compile/
However, it does mention that Python supports multiple build systems, which I didn't know, so this hasn't been a complete waste of my time.
Thanks!
I'm very comfortable with pyenv, but am extremely open to new stuff
"Course was worth it just for uv"
If you are comfortable with `pyenv`, the switch to `uv` is basically a walk in the park. The benefit is the speed + predictable dependency resolution.
And I'm currently trying to move current work to uv. The problem seems to be the possibility of unknown breakage for unknown users of the old project, not any known technical issue.
I'd highly recommend uv. It's just easier/more flexible. And it downloads third-party precompiled Python builds instead of adding the extra time and complexity of compiling locally. It's much nicer, especially when maintaining an environment for a team that just works without them having to know about it.
One downside of uv is that unlike pyenv and rye it doesn't shim python. Pyenv's shim did give me some trouble, but rye's simpler shim didn't. The workaround is to run stuff with uv run x.py instead of python x.py.
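If you really miss the shim, a hand-rolled one is tiny; this is my own hack (name and location invented), not a uv feature:

```bash
#!/bin/sh
# save as ~/.local/bin/upy and chmod +x, then run `upy x.py` instead of `python x.py`
exec uv run python "$@"
```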
IMO no really hard problem is ever truly solved, but as can be seen in other comments, this group of people really crushed the pain for me and *many* others, so bravo on that alone - you have truly done humanity a service.
I've tried almost every Python packaging solution under the sun in the past 15 years but they all had their problems. Finally I just stuck with pip/pip-tools and plain venv's but strung together with a complicated Makefile to optimize the entire workflow for iteration speed (rebuilding .txt files when .in requirements changes, rebuilding venv if requirements change, etc). I've been able to reduce it to basically one Make target calling uv to do it all.
At this point, just thinking about updating CIBuildWheel images triggers PTSD—the GitHub CI pipelines become unbearably slow, even for raw CPython bindings that don’t require LibC or PyBind11. It’s especially frustrating because Python is arguably the ultimate glue language for native libraries. If Astral’s tooling could streamline this part of the workflow, I think we’d see a significant boost in the pace of both development & adoption for native and hardware-accelerated tools.
I'd like to encourage you to blog about it, then.
I'm too used to typing virtualenvwrapper's `workon` and `mkvirtualenv` commands, so I've written some lightweight replacement scripts for virtualenvwrapper to use with uv. They support tab completion and implement only the core functionality of virtualenvwrapper:
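They boil down to something like this (a simplified sketch without the tab-completion parts):

```bash
# minimal virtualenvwrapper-style helpers backed by uv
export WORKON_HOME="${WORKON_HOME:-$HOME/.virtualenvs}"

mkvirtualenv() {
    uv venv "$WORKON_HOME/$1" && . "$WORKON_HOME/$1/bin/activate"
}

workon() {
    . "$WORKON_HOME/$1/bin/activate"
}
```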
FWIW, I was able to confirm that the listed primary dependencies account for everything in the `pip freeze` list. (Initially, `userpath` and `pyrsistent` were missing, but they appeared after pinning back the versions of other dependencies. The only project for which I couldn't get a wheel was `python-hglib`, which turned out to be pure Python with a relatively straightforward `setup.py`.)
Has anyone used both hatch and uv, and could comment on that comparison?
EDIT: quick google gives me these opinions[1]
[1]: https://www.reddit.com/r/Python/comments/1gaz3tm/hatch_or_uv...
I can set up some benchmarks comparing to pyenv on a couple common platforms – lately I've just been focused on benchmarking changes to CPython itself.
If performance is important to you, the ancient advice to profile bottlenecks and implement important parts in C where you can, still applies. Or you can try other implementation like PyPy.
In the data science world, conda/mamba was needed because of this kind of thing, but a lot of room for improvement. We basically want lockfile, incremental+fast builds, and multi-arch for these tricky deps.
I think the comparison for data work is more to conda, not poetry. afaict poetry is more about the "easier" case of pure-Python packages, not native areas like prebuilt platform-dependent binaries. Maybe poetry got better, but I typically see it as a nice-to-have for local dev and rounding out the build, not the recommended install flow for natively-aligned builds.
So still curious with folks navigating the 'harder' typical case of the pydata world, getting an improved option here is exciting!
In any case, I believe uv is trying to be _the_ solution, and I'd be pretty surprised if your libs weren't well supported, or on the roadmap at least.
If you want, you can depend on a C++ and Fortran compiler at runtime and (fairly) reliably expect it to work.
I just want to create a monorepo with python that's purely for making libraries (no server / apps).
And is it normal to have a venv for each library package you're building in a uv monorepo?
There is not much to know:
- uv python install <version> if you want a particular version of python to be installed
- uv init --vcs none [--python <version>] in each directory to initialize the python project
- uv add [--dev] to add libraries to your venv
- uv run <cmd> when you want to run a command in the venv
That's it, really. Any bonus can be learned later.
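For instance, spinning up one library package might look like this (names invented; --lib gives you a src/ layout):

```bash
mkdir -p libs/textutils && cd libs/textutils
uv init --vcs none --lib
uv add --dev pytest
uv run pytest
```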
Maybe you mean uv tool install ?
In that case it's something you don't need right now, uv tool is useful, but it's a bonus. It's to install 3rd party utilities outside of the project.
There is no equivalent to npm scripts yet, although they are adding it as we speak.
uv run executes any command in the context of the venv (which is like a node_modules); you don't need to declare commands before calling them.
e.g.: uv run python will start the Python shell.
Edit: I get it now. It's like npm's `npx` command.
You can even use --extra and --group with uv run like with uv sync. But in a monorepo, those are rare to use.
I looked at the group documentation, but it's not clear to me why I would want to use it, or where I would use it:
https://docs.astral.sh/uv/concepts/projects/layout/#default-...
(I'm a JS dev who has to write a set of python packages in a monorepo.)
uv run is the bread and butter of uv: it will run any command you need in the project, and ensure it works by syncing all deps and making sure your command can import stuff and call python.
In fact, if you run a Python script, you should do uv run python the_script.py.
It's so common that uv run the_script.py will work as a shortcut.
I will write a series of articles on uv on bitecode.dev.
I will write them so that they work for non-Python devs as well.
Really looking forward to the articles!
Which would fit in with existing uv commands[1] like `uv add plotly`.
There is an existing `uv lock --upgrade-package requests` but this feels a bit verbose.
[1]: https://docs.astral.sh/uv/guides/projects/#creating-a-new-pr...
I'm still very keen on virtualenvwrapper; I hope that uv's fast dependency resolution and installation can come there and to poetry.
uv python install --preview --default 3.13
and then you get Python 3.13 whenever you run `python` outside of an environment that declares something else.

I'm an end user, too. I don't have anything to do with uv development. I stumbled across it in a GitHub issue or something and passed along the info.