Uv overtakes pip in CI

186 points 156 comments 8 days ago
bognition

This shouldn’t be a surprise to anyone who has been using Python and has tried uv.

Python dependency management and environments have been a pain for 15 years. Poetry was nice but slow and sometimes difficult.

Uv is lightning fast and damn easy to use. It’s so functional and simple.

anitil

For me the most convincing argument was that it took ~3 minutes to go from 'I wonder if I should give this thing a try' to 'oh it .... it worked!?'

tclancy

Yeah, been doing this for over twenty years and finally got a chance to start playing with it a few months back and was confused at how I got that far that fast.

saghm

As someone who also hasn't really used any of the past 8 years or so of Python dependency management, it's nice that it seems to support using arbitrary other tooling as well. At some point recently I wanted to run something that happened to use pdm, which I hadn't even heard of, but I was able to invoke it with `uv tool run pdm` and not have to learn anything about how to set it up manually.

om8

FYI you can run just `uvx pdm`

ziml77

It really is!

I switched to using uv just 2 weeks ago. Previously I had been dealing with maintaining a ton of batch jobs that used: global packages (yes, sudo pip install), manually managed virtualenvs, and docker containers.

uv beats all of them easily. Automatically handling the virtualenv means running a project that uses uv feels as easy as invoking the system Python.

Balinares

I just wish uv made it more straightforward to have arbitrary purpose-specific virtual environments, e.g. for building the package, for running the test suite, for dev tooling (PuDB), etc. That's one thing pixi does better, I think.

hk1337

It’s a little too fast, I’m having trouble believing it’s actually doing anything sometimes.

hyperbovine

uv is so over-the-top fast compared to what we're used to that I would argue it's actually bad for the language. Suddenly it dawns on you that by far the most capable and performant package manager (and linter) (and code formatter) (and type checker) for Python is in fact not written in Python. Leaves an odd taste. Makes you wonder what else ought not be written in Python ... or why anything should be written in Python. Here be dragons ...

tyg13

IMO, Python should only be used for what it was intended for: as a scripting language. I tend to use it as a kind of middle ground between shell scripting and compiled languages like Rust or C. It's a truly phenomenal language for gluing together random libraries and data formats, and whenever I have some one-off task where I need to request some data from some REST API, build a mapping from the response, categorize it, write the results as JSON, then push some result to another API -- I reach for Python.

But as soon as I have any suspicion that the task is going to perform any non-trivial computation, or when I notice the structure of the program starts to grow beyond a couple of files, that's when Python no longer feels suitable to the task.

rcleveng

I'd rather write python than rust personally, and I also don't mind if anything performance critical is written in rust.

Letting each language do what it does best is really ideal. I'm glad that Python has a great FFI; I wish golang did.

Speaking of golang, the TypeScript compiler is now written in it. Another case of using each language for its strengths.

slyall

Well, Python isn't written in Python either.

ipnon

And its killer use case today, artificial intelligence development, really uses it as a glue language for CUDA and similar GPU APIs.

arcanemachiner

Try not to cut yourself while grinding that axe.

Python may not be the fastest language, but it's easy to learn, compilation times aren't an issue, you'll never have to fight the borrow checker, etc.

Every language has its warts.

pjmlp

I never got why.

I've used Python since version 1.6, mainly for OS scripting, because I'd rather use something with JIT/AOT in the box for application software.

Still, having a little setup script to change environment variables for PYTHONPATH, PATH and a few other things, always did the trick.

Never got to spend hours tracking down problems caused by the multiple solutions that are supposed to solve Python's problems.

ThibWeb

For me the surprise is the pace. I’d expect people to be set enough in their tools that it would take longer than a few months for a new tool, no matter how good, to become the majority one. Though perhaps people adopt new tools more easily in CI, where install times matter more.

perrygeo

The pace of uv adoption is insanely fast. It's directly related to how bad the previous Python tools were/are. Even seasoned veterans set in their ways still know a better solution when they see it.

rtpg

uv having a good pip compatibility layer probably helped a lot, because you could try things out that way and see what fit, so to speak.

It's probably worth mentioning that Astral (The team behind uv/etc) has a team filled with people with a history of making very good CLI tooling. They probably have a very good sense for what matters in this stuff, and are thus avoiding a lot of pain.

Motivation is not enough, there's also a skill factor. And being multiple people working on it "full time"-ish means you can get so much done, especially before the backwards-compat constraints really start piling up.

scuff3d

uv was really smart in the way they integrated with existing solutions. My whole team just switched over from pip, and it was painless. We were already using pyproject.toml files which made it even easier, but uv also has documentation for transitioning from requirements.txt files.
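
The transition can be as small as this (a minimal sketch; `uv add -r` imports the pinned requirements into pyproject.toml, and the file names are whatever your project already uses):

    uv init                      # create pyproject.toml if you don't have one
    uv add -r requirements.txt   # import existing pins as [project] dependencies
    uv lock                      # write uv.lock for reproducible installs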

simonw

uv first came out 15th February 2024 so it's a year and a half old now. Still pretty impressive for it to get adoption this fast though.

lukeschlather

I feel like I've tried at least 5 different package management tools for Python: pip, poetry, pip-tools, pipx. I'm not really sure what easy_install, egg, and pkg_info are, but I do know I have always been surprised I need to care.

It sounds like uv is a drop-in replacement for pip, pipx, and poetry with all of their benefits and none of the downsides, so I don't see why I wouldn't migrate to it overnight.

skylurk

It's a (better IMO) replacement for poetry, but not drop-in. Additionally it is a drop-in replacement for venv and pip-tools.

bognition

Honestly, I was skeptical when I learned about uv. I thought, just what Python needs, another dependency manager… this was after fighting with pip, venv, venvwrapper, and poetry for years.

Then I gave it a try and it just worked! It’s so much better that I immediately moved all my Python projects to it.

zahlman

> I thought, just what Python needs, another dependency manager… this was after fighting with pip, venv, venvwrapper, and poetry for years.

Pip, venv and virtualenvwrapper (people still use this?) are not meaningfully "dependency managers". A venv is just a place to put things, and pip does only basic tracking and tries to maintain a consistent environment. It isn't trying to help you figure out what dependencies you need, create new environments from scratch, update pyproject.toml....

Pip's core capability is the actual installation of packages, and uv does a far better job of that part, using smarter caching, hard links to share files, parallelized pre-compilation of .pyc files, etc. Basically it's designed from the ground up with the intention to make lots of environments and expect starting a new one to be cheap. Poetry, as far as I was able to determine, does it basically the same way as pip.

sgarland

I actually did use virtualenvwrapper quite a bit until uv. I had built up various shell aliases and functions, so it was fairly painless to create and manage venvs. uv just means that I don’t have to think about that part now.

WD-42

I think it’s been long enough now. Uv just has so much velocity. Pyproject.toml and PEP support just keep getting better.

Poetry, which I think is the closest analogue, still requires a [tool.poetry.dependencies] section afaik.
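
For contrast, uv reads and writes the standards-based (PEP 621) table, so a minimal pyproject.toml sketch looks like this (names and pins illustrative):

    [project]
    name = "example"
    version = "0.1.0"
    requires-python = ">=3.12"
    dependencies = ["requests>=2.31"]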

greenavocado

You don't even need to edit any files yourself for most simple use cases.

    uv init
    uv add package
    uv run program.py
That's it.

If you inherit a codebase made this way from someone else, merely running `uv run program.py` will automatically create the venv, install the packages, and run your script, seamlessly on first launch.

Uv lets you almost forget virtual environments exist. Almost.

kstrauser

Yep. Poetry was such a delightful upgrade from pipenv, which we’d tested as an upgrade from bare pip, which didn’t have a dependency resolver at the time. If someone’s already fully bought in on poetry, that’d be the one case where I could plausibly imagine them wanting to leave well enough alone.

For everyone else, just try uv and don’t look back.

walkabout

Is this like when everyone on here had already been saying Yarn was a no-brainer replacement for npm, having totally obsoleted it, for like two-plus years, but it was still lacking safety/sanity checks, missing features, and broke in bizarre ways on lots of packages in-the-wild?

Or is the superior replacement actually up to the job this time?

dmd

It really is just that good. That is why it's had such massive uptake. No matter how many times you've been burned before, no matter how skeptical you are, it's so good that, seriously, just try it, and you'll be instantly converted.

walkabout

Ok, sold, I’ll try it.

kstrauser

I’m certain there’s going to be some bizarre edge case where pip is fine and uv isn’t. It’s inevitable. However, in every situation where I’ve used it, uv is better than pip or poetry or any other package manager I’ve ever used.

I just found out they’re still making pipenv. Yes, if you’re using pipenv, I’m confident that uv will be a better experience in every way, except maybe “I like using pipenv so I can take long coffee breaks every time I run it”.

walkabout

Yeah, I’m just skeptical because I was at an agency in the heat of yarn-mania, waaaay after people online were proclaiming npm dead and pointless, and it went poorly enough that we developed a ha-ha-only-serious joke that you knew a project was properly in-development when someone had lost a half-day debugging some really weird error only to find that “npm install” instantly fixed it, and then switched the started-in-yarn codebase over to npm.

kstrauser

I could see that being traumatizing, but this really isn’t like that. Pip and uv and poetry and the rest don’t fundamentally change how a package is installed into a Python virtualenv. If `uv add foo` works, you could use the equivalent in any of those other tools and get basically the same result. You don’t have to know or care which tool is installing your project because that’s all invisible from inside the code you write.

daemonologist

There is! The company I work for uses a weird version of Azure Devops for <governance> reason, and pip can authenticate and install packages from its artifact feeds while uv cannot. We use uv for development speed (installing internal packages from source) and then switch to pip for production builds.

andy99

I’ll bite - I could care less about speed, that feels like a talking point I see often repeated despite other package managers not being particularly slow. Maybe there’s some workload I’m missing that this is more important for?

I’ve tried uv a couple places where it’s been forced on me, and it didn’t work for whatever reason. I know that’s anecdotal and I’m sure it mostly works, but it obviously was off-putting. For better or worse I know how to use conda, and despite having no special attachment to it, slightly faster with a whole different set of rough edges is not at all compelling.

I have a feeling this is some kind of Rust fan thing and that’s where the push comes from, to try and insinuate it into more people’s workflows.

I’d like to hear a real reason I would ever migrate to it, and honestly if there isn’t one, am super annoyed about having it forced on me.

simonw

The place where speed really matters is in virtual environment management.

uv uses some very neat tricks involving hard links such that if you start a new uv-managed virtual environment and install packages into it that you've used previously, the packages are hard-linked in from uv's cache. This means the new environment becomes usable almost instantly and you don't end up wasting filesystem space on a bunch of duplicate files.

This means it's no longer expensive to have dozens, hundreds or even thousands of environments on a machine. This is fantastic for people like myself who work on a lot of different projects at once.

Then you can use "uv run" to run Python code in a brand new temporary environment that gets created on demand within ms of you launching it.

I wrote a Bash script the other day that lets me do this in any Python project directory that includes a setup.py or pyproject.toml file:

  uv-test -p 3.11
That will run pytest with Python 3.11 (or 3.12/3.13/3.14/whatever version you like) against the current project, in a fresh isolated environment, without any risk of conflicting with anything else. And it's fast - the overhead of that environment setup is negligible.

Which means I can test any code I like against different Python versions without any extra steps.

https://til.simonwillison.net/python/uv-tests
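
A minimal sketch of that kind of wrapper as a bash function (the real script is at the link; the `uv-test` name and default version are illustrative, and `--isolated`/`--with` are the uv run flags for a throwaway overlay environment):

    # Sketch of an uv-test helper; simonw's actual script is at the TIL link.
    uv-test() {
        local OPTIND opt python=3.13
        while getopts "p:" opt; do
            case "$opt" in
                p) python="$OPTARG" ;;
            esac
        done
        shift $((OPTIND - 1))
        # uv picks up the project from the current directory, builds a fresh
        # environment for the requested interpreter, and layers pytest on top
        # without touching the project's own .venv.
        uv run --isolated --python "$python" --with pytest pytest "$@"
    }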

Alir3z4

Ooooh that's a neat one. I really like the hard links.

On my machine, there are like 100s if not thousands of venvs.

I simply have all of them under ~/.python_venvs/<project_name>/

Does that mean, no matter how many projects I install pytorch and tensorflow and huggingface and all the heavy machinery in, they'll be counted only once as long as they're unique?

If that's the case, then I can leave my habit of pip and move to uv.

This is something that always bugged my mind about virtual environments in almost all the package managers.

simonw

"Does that mean, no matter how many projects I install pytorch and tensoflow and huggingface and all the heavy machinery, they'll be counted only once as long as they're unique?"

I think so, based on my understanding of how this all works. You may end up with different copies for different Python versions, but it should still save you a ton of space.

eslaught

Conda doesn't do lock files. If you look into it, the best you can do is freeze your entire environment. Aside from this being an entirely manual process, and thus having all the issues that manual processes bring, this comes with a few issues:

1. If you edit any dependency, you have to re-solve the environment from scratch. There is no way to update just one dependency.

2. Conda "lock" files are just the hashes of all the packages you happened to get, and that means they're non-portable. If you move from x86 to ARM, or Mac to Linux, or CPU to GPU, you have to throw everything out and re-solve.

Point (2) has an additional hidden cost: unless you go massively out of your way, all your platforms can end up on different versions. That's because solving every environment is a manual process and it's unlikely you're taking the time to run through 6+ different options all at once. So if different users solve the environments on different days from the same human-readable environment file, there's no reason to expect them to be in sync. They'll slowly diverge over time and you'll start to see breakage because the versions diverge.

P.S. if you do want a "uv for Conda packages", see Pixi [1], which has a lot of the benefits of uv (e.g., lock files) but works out of the box with Conda's package ecosystem.

[1]: https://pixi.sh/latest/

gre

You've never waited 10 minutes for conda to solve your environment and then say it's unsolvable?

andy99

I have, but it takes me back many years to some obscure situations I’ve been in. For my day to day, I can’t think of the last time I’ve encountered it, it’s been years, and I regularly am setting up new environments. That’s why I’m curious about the workflows where it matters.

gre

I dunno, 2020? Since then I switched to mamba, then poetry, and now uv. I have spent way too much time fighting python's package managers and with uv I finally don't have to. ymmv

zbentley

> I have a feeling this is some kind of Rust fan thing and that’s where the push comes from, to try and insinuate it into more people’s workflows.

When I first started using uv, I did not know what language it was written in; it was a good tool which worked far better than its predecessors (and I used pdm/pipenv/pyenv/etc. pretty heavily and in non-basic ways). I still don’t particularly care if it’s written in Rust or Brainfuck, it works well. Rust is just a way to get to “don’t bootstrap Python environments in Python or shell”.

> I’ve tried uv a couple places where it’s been forced on me, and it didn’t work for whatever reason.

I’m curious what issues you encountered. Were these bugs/failures of uv, issues using it in a specific environment, or workflow patterns that it didn’t support? Or something else entirely?

cgearhart

I’ve been trying uv lately to replace my normal workflow of selecting a python with pyenv for the shell, then making a venv, then installing a bunch of default packages (pandas, Jupyter, etc). So far the only benefit is that I can use just the one tool for what used to take 3 (pyenv, venv, pip). I don’t _hate_ it…but it really isn’t much of an improvement.

morshu9001

uv is comparable to npm. All your deps get auto tracked in a file. There are other things that do this, but pip isn't one of them, and I vaguely remember the others being less convenient.

The speed usually doesn't matter, but one time I did have to use it to auto-figure-out compatible deps in a preexisting project because the pip equivalent with backtracking was taking forever with the CPU pegged at 100%.

markkitti

What tooling do you use?

testdelacc1

Everyone downvoting you and disagreeing - don’t listen to them! I’m here to tell you that there is a massive conspiracy and everyone is in on it. Commenters on HN get paid every time someone downloads a Rust tool, that’s why they’re trying to convince you to use uv. It’s definitely not because they used it and found it worked well for them.

> could care less

I think “couldn’t care less” works better.

fragmede

Being forced to use a tool you don't want to use sucks, no matter how awesome that tool may or may not actually be. *conda and uv have roughly the same goals which means they're quite similar. For me, the speed of uv really does set it apart. For python programs with lots of dependencies, it's faster enough that I found it worth it to climb its learning curve. (ChatGPT makes that curve rather flat.) pip install -r requirements.txt went from a coffee break to me watching uv create the venv. But okay, speed gains aren't going to convince you.

Both of them manage venvs, but where the venv goes (by default) makes a difference, imo. Conda defaults to a user-level directory, e.g. ~/.conda/envs/my-venv. uv prefers a .venv dir in the project's folder. It's small, but it means per-project venvs are slightly more ergonomic with uv. Whereas with conda, because venvs are shared under the homedir, it's easy to get lazy once you have a working venv and reuse that good working venv across multiple programs, and then it breaks when one program needs its dependencies updated and now it's broken for all of them. Naturally that would never happen to a skilled conda operator, so I'll just say per-project uv venv creation and recreation flows just that tiny bit smoother, because I can just run "rm -rf .venv" and not worry about breaking other things.

One annoyance I have with uv is that it really wants to use the latest version of Python it knows about, and sometimes that version is too new for a program or one of its dependencies, and the program won't run. Running "uv venv --python 3.12" instead of "uv venv" isn't onerous, but it's annoying enough to mention. (pyproject.toml lets projects specify version requirements, but they're not always right.) Arguably that's a Python issue and not uv's, but as users, we just want things to work, dammit. That's always the first thing I look for when things don't work.

As mentioned, with uv the project venv lives in .venv inside the project's directory, which lets "uv run program.py" cheat. Who amongst us hasn't forgotten to "source .venv/bin/activate" and been confused when things "suddenly" stopped working? So if you're in the project directory, "uv run" will automatically use the project's .venv dir.

As for it being pushed to promote Rust: I'm sure there's a non-zero number of people for whom that's true, but personally, since Rust makes it harder for me to contribute to uv, it's actually a point against it. Sometimes I wonder how fast it would be if it were written in Python using the same algorithms, but run under PyPy.

Anyway, I wouldn't say any of that's revolutionary. Programs exist to translate between the different project file types (requirements.txt/environment.yml/pyproject.toml) so if you're already comfortable with conda and don't want to use uv, and you're not administering any shared system(s), I'd just stick the command to generate environment.yml from pyproject.toml on a cheat sheet somewhere.

---

One bug I ran into with one of the condas (I forget which) is that it called out to pip under the hood in interactive mode, pip got stuck waiting for user input, and that conda just sat there waiting for input that would never come. Forums were filled with reports by users talking about letting it run for hours or even days. I fixed that, but it soured me on *conda, unfortunately.

Balinares

Tangential: if you're stuck in the condaverse I would *loudly* recommend checking out pixi. Pixi is to conda as uv is to setuptools.

skylurk

Does uv actually have a replacement for setuptools yet?

WhyNotHugo

uv is weird. It's like 5 entirely different tools mashed and entangled into one program.

Last I tried it, it insisted on downloading a dynamically linked Python and installing that. This obviously doesn't work, you can't distribute dynamically linked binaries for Linux and expect them to work on any distribution (I keep seeing this pattern and I guess it's because this typically works on macOS?).

Moreover my distribution already has a package manager which can install Python. I get that some absolute niche cases might need this functionality, but that should most definitely be a separate tool. The problem isn't just that the functionality is in the same binary, but also that it can get triggered when you're using another of its functionalities.

I wish this had been made into actual separate tools, where the useful ones can be adopted and the others ignored. And, most important, where the ecosystem can iterate on a single tool. Having "one tool that does 5 things" makes it really hard to iterate on a new tools that does only one of those things in a better way.

It's pretty disappointing to see the Python ecosystem move in this direction.

Balinares

Your distro's package manager cannot install arbitrary versions of Python such as might be required by a specific Python project and it cannot install anything at all for individual users without root access. These are two different tools that serve two different purposes.

atoav

Started converting every repo over to uv. I had some weird and hard to deal with dependencies before. Every single one was easier to solve than before. It just works and is blazingly fast.

Absolute no-brainer.

kace91

As an outsider to Python, I never got how a language that got popular for being simple, elegant and readable could end up with perhaps the most complex tooling situation (dependencies, envs, etc). Any time I glance at the community there seems to be a new way of doing things.

What caused python to go through these issues? Is there any fundamental design flaw ?

simonw

It's mostly about age. Python has been around for 35 years now. The first version of a Python package directory was the Cheese Shop (Monty Python reference) in 2003. The earliest version of a pip-like tool was "easy_install" which - I kid you not - worked by scraping the HTML listing page of the Cheese Shop and downloading zip files linked from that!

More recent languages like Node.js and Rust and Go all got to create their packaging ecosystems learning from the experiences of Perl and Python before them.

There is one part of Python that I consider a design flaw when it comes to packaging: the sys.modules global dictionary means it's not at all easy in Python to install two versions of the same package at the same time. This makes it really tricky if you have dependency A and dependency B both of which themselves require different versions of dependency C.
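
A tiny illustration of that constraint, in plain Python (nothing uv-specific):

    import sys
    import json

    # Imported modules live in one process-wide dict keyed by name, so the
    # interpreter can only ever hold a single "json" (or a single "requests").
    # Two versions of the same package can't coexist under one module name.
    assert sys.modules["json"] is json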

falcor84

On a tangent, the somewhat related issue of Python 3 not being able to import Python 2 packages famously led Zed Shaw of "Learn Python the Hard Way" to write a rant about how Python is not Turing Complete. I checked again and apparently he removed that rant and only has a disclaimer in its place mentioning that he was obviously being hyperbolic [0].

[0] https://learnpythonthehardway.org/book/nopython3.html#the-py...

testdelacc1

I don’t think anyone takes Zed Shaw seriously.

He’s like that uncle you see at family gatherings whom you nod along politely to.

zahlman

Zed Shaw seems to have some very interesting beliefs about the 2->3 migration in general. I think it's fair to call some of it conspiratorial.

falcor84

Indeed that was a weird time, but he did eventually relent and release a version for Python 3 - https://learncodethehardway.com/client/#/product/learn-pytho...

tclancy

Man, there was a window there where I still fell back to easy_install on Windows because it would handle C based stuff more reliably until wheels got invented. It’s been a journey.

lelandbatey

I think it's also from trying to keep with the old paradigm of "libraries are installed and managed globally, potentially as linkable object files."

All the languages of today gain all their improvements from:

1. Nothing should be global, but if it is it's only a cache (and caches are safe to delete since they're only used as a performance optimization)

2. You have to have extremely explicit artifact versioning, which means everything needs checksums, which means mostly reproducible builds

3. The "blessed way" is to distribute the source (or a mostly-source dist) and compile things in; the happy path is not distributing pre-computed binaries

Now, everything I just said above is also wrong in many aspects or there's support for breaking any and all of the rules I just outlined, but in general, everything's built to adhere to those 3 rules nowadays. And what's crazy is that for many decades, those three rules above were considered absolutely impossible, or anti-patterns, or annoying, or a waste, etc (not without reason, but still we couldn't do it). That's what made package managers and package management so awful. That's why it was even possible to break things with `sudo pip install` vs `apt install`.

Now that we've abandoned the old ways in e.g. JS/Rust/Go and adopted the three rules, all kinds of delightful side effects fall out. Tools now which re-build a full dependency tree on-disk in the project directory are the norm (it's done automatically! No annoying bits! No special flags! No manual venv!). Getting serious about checksums for artifacts means we can do proper versioning, which means we can do aggressive caching of dependencies across different projects safely, which means we don't have to _actually_ have 20 copies of every dependency, one for each repo. It all comes from the slow distributed Gentoo/FreeBSD-ification of everything and it's great!

mikepurvis

Python 2.0 was released in October 2000. The Python ecosystem has witnessed several significant shifts in expectation as far as how software is built and delivered, from Slackware-style source builds to vendor packages to containers to uv just downloading a standalone binary archive. And the deadsnakes ppa and venvs, plus the ongoing awkwardness about whether pip should be writing stuff into /usr/local or ~/.local or something else.

All of this alongside the rise of GitHub and free CI builders, it being trivial to depend on lots of other packages of unknown provenance, stdlib packages being completely sidelined by stuff like requests.

It’s really only in the last ten years or so that there’s been the clarity of what is a build backend vs frontend, what a lock file is and how workspace management fits into the whole picture. Distutils and setuptools are in there too.

Basically, Python’s packaging has been a mess for a long time, but uv getting almost everything right all of a sudden isn’t an accident; it’s an abrupt gelling of ideas that have been in progress for two decades.

zahlman

> the deadsnakes ppa

Please don't use this. You need to be careful about how you place any secondary installation of Python on Ubuntu. Meanwhile, it's easy to build from source on Ubuntu and you can easily control its destination this way (by setting a prefix when you ./configure, and using make altinstall) and keep it out of Apt's way.
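
A minimal sketch of that approach (assuming CPython's build dependencies are already installed; the prefix is arbitrary):

    ./configure --prefix="$HOME/pythons/3.12"
    make -j"$(nproc)"
    # altinstall installs python3.12 without the unversioned python3 link,
    # so the interpreter Apt manages is left alone
    make altinstall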

> and venvs, plus the ongoing awkwardness about whether pip should be writing stuff into usr/local or ~/.local or something else.

There is not really anything like this. You just use venvs now, which should have already been the rule since 3.3. If you need to put the package in the system environment, use an Apt package for that. If there isn't an Apt package for what you want, it shouldn't live in the system environment and also shouldn't live in your "user" site-packages — because that can still cause problems for system tools written in Python, including Apt.

You only need to think about venvs as the destination, and venvs are easy to understand (and are also fundamental to how uv works). Start with https://chriswarrick.com/blog/2018/09/04/python-virtual-envi... .

> It’s really only in the last ten years or so that there’s been the clarity of what is a build backend vs frontend

Well no; it's in that time that the idea of separating a backend and frontend emerged. Before that, it was assumed that Setuptools could just do everything. But it really couldn't, and it also led to people distributing source packages for pure-Python projects, resulting in installation doing a ton of ultimately useless work. And now that Setuptools is supposed to be focused on providing a build backend, it's mostly dead code in that workflow, but they still can't get rid of it for backwards compatibility reasons.

(Incidentally, uv's provided backend only supports pure Python — they're currently recommending heavyweight tools like maturin and scikit-build-core if you need to compile something. Although in principle you can use Setuptools if you want.)

rtpg

> Meanwhile, it's easy to build from source on Ubuntu and you can easily control its destination this way

word of warning: I spent a lot of years working off of "built from source" Python on Ubuntu, and every once in a while I'd have really awkward issues downstream of not realizing I was missing some lib when I built Python, and then some random standard library module was just missing for me.

I think it's all generally good, but real easy to miss optional package stuff.

mikepurvis

> You just … now

Yes, the point of my post wasn’t to give current best practice counsel but rather to illustrate how much that counsel has changed over the years as the needs and desires of the maintainers, distro people, developers, and broader community have evolved.

WD-42

If you read the initial bbs post by Guido introducing Python, he describes it mostly as an alternative to bash. Basically a really nice scripting language with a decent standard library. I don’t think it was designed from the start to end up where it has. He created a genius syntax that people love.

davesque

Dependency management has always felt complicated. However, environment management I think is actually way simpler than people realize. Python basically just walks up directories trying to find its packages dir. A python "env" is just a copy of the python binary in its own directory. That's pretty much it. Basically all difficulties I've ever had with Python environments have been straightened out by going back to that basic understanding. I feel like the narrative about virtualenvs has always seemed scary but the reality really isn't.

zahlman

1. Age; there are absurd amounts of legacy cruft. Every time you have a better idea about how to do things, you have to agonize over whether you'll be allowed to remove the old way. And then using the old ways ends up indirectly causing problems for people using the new ways.

2. There is tons of code in the Python ecosystem not written in Python. One of the most popular packages, NumPy, depends on dozens of megabytes of statically compiled C and Fortran code.

3. Age again; things were designed in an era before the modern conception of a "software ecosystem", so there was nobody imagining that one day you'd be automatically fetching all the transitive dependencies and trying to build them locally, perhaps using build systems that you'd also fetch automatically.

4. GvR didn't seem to appreciate the problem fully in the early 2010s, which is where Conda came from.

5. Age again. Old designs overlooked some security issues and bootstrapping issues (this ties into all the previous points); in particular, it was (and still is) accepted that because you can include code in any language and all sorts of weird build processes, the "build the package locally" machinery needs to run arbitrary code. But that same system was then considered acceptable for pure-Python packages for many years, and the arbitrary code was even used to define metadata. And in that code, you were expected to be able to use some functionality provided by a build system written in Python, e.g. in order to locate and operate a compiler. Which then caused bootstrapping problems, because you couldn't assume that your users had a compatible version of the main build system (Setuptools) installed, and it had to be installed in the same environment as the target for package installation. So you also didn't get build isolation, etc. It was a giant mess.

5a. So they invented a system (using pyproject.toml) that would address all those problems, and also allow for competition from other build back-ends. But the other build back-end authors mostly wanted to make all-in-one tools (like Poetry, and now, er, uv); and meanwhile it was important to keep compatibility, so a bunch of defaults were chosen that enabled legacy behaviour — and ended up giving old packages little to no reason to fix anything. Oh, and also they released the specification for the "choose the build back-end system" and "here's how installers and build back-ends communicate" years before the specification for "human-friendly input for the package metadata system".

cgearhart

BDFL left a long time ago. It’s not opinionated anymore. The language went from being small enough to fit in that guy’s head to a language controlled by committee that’s trying to please everyone.

swyx

poor answer. guido had very little impact on the packaging mess.

zahlman

Many would say that's the problem; i.e. that he should have had more impact. Check out the history of Conda.

swyx

right, but "BDFL left" is clearly the wrong thing to blame when the real issue is "BDFL never cared enough", so it doesn't matter if he left

kstrauser

That’s right. And we switched from eggs to wheels on Guido’s watch, but that was from him being a good leader and letting other smart people do clever things on their own.

morshu9001

It was an intentional design decision to separate package installation and management. I think that created the mess we have now.

Funny thing is that decision was for modularity, but uv didn't even reuse pip.

zahlman

> Funny thing is that decision was for modularity, but uv didn't even reuse pip.

To be fair, that's justified by pip's overall lack of good design. Which in turn is justified by its long, organic development (I'm not trying to slight the maintainers here).

But I'm making modular pieces that I hope will showcase the original idea properly. Starting with an installer, PAPER, and build backend, bbbb. These work together with `build` and `twine` (already provided by PyPA) to do the important core tasks of packaging and distribution. I'm not trying to make a "project manager", but I do plan to support PEP 751 lockfiles.

pansa2

Python is not simple, it's a very complex language hiding behind friendly syntax.

Given that, plus the breadth and complexity of its ecosystem, it makes sense that its tooling is also complex.

nomel

Seems like the flaw is that it was never a first class citizen of the language.

easy_install never even made it to 1.0

Still, not bad for a bunch of mostly unpaid volunteers.

lvl155

I call it the JS-syndrome.

javchz

As many flaws as the npm/yarn/pnpm ecosystem has, its interoperability is waaaay better than the whole juggling act between pip, venv, poetry, Anaconda, Miniforge, and uv across projects.

UV is a step in the right direction, but legacy projects without a Dockerfile can be tricky to start.

morshu9001

JS did this right, in fact uv is kinda replicating the npm way. And there are other JS things I'd like Py to follow suit on.

dgfitz

People. People happened. Ideologies and strong opinions.

112233

I'm at the point where I don't touch Python without uv at all, if possible. The only bad part is, now I want to use uv to install go and java and debian packages too ... :(

The ability to get a random GitHub project working without messing with the system is finally making Python not scary to use.

icar

You might be interested in mise [0]:

    mise use -g go@1.24
    mise use -g java@latest
    mise use -g github:BurntSushi/ripgrep

[0]: https://mise.jdx.dev/

Alir3z4

I seriously still don't know why I should use "uv". I just create my .venv and pip install.

Rarely I'd need a different version of Python, and in case I do, either I let the IDE take care of it or just use pyenv.

I know there's the argument of being fast with uv, but most of the time, the actual downloading is the slowest part.

I'm not sure how big a project should be, before I feel pip is slow for me.

Currently, I have a project with around 50 direct dependencies and everything is installed in less than a min with a fresh venv and without pip cache.

Also, if I ever, ever needed lock files stuff, I use pipx. Never needed the hash of the packages the way it's done in package-lock.json.

Maybe, I'm just not the target audience of uv.

gooodvibes

> I just create my .venv and pip install.

Even if you only change your commands to 'uv venv ...' and 'uv pip install ...' and keep the rest of your workflow, you'll get

1. Much faster installs.

2. The option to specify the python version in the venv creation instead of having to manage multiple Python versions in some other way.

No pyproject.toml, no new commands to learn. It still seems like a win to me.
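
Concretely, that drop-in workflow is just this (a sketch; substitute whatever Python version and requirements file you already use):

    uv venv --python 3.12               # create .venv with the requested interpreter
    source .venv/bin/activate
    uv pip install -r requirements.txt  # same semantics as pip, much faster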

CaliforniaKarl

TBH I feel the same. And for development on my laptop, that seems fine. For the Python package I'm working on now, a single run of pytest takes less than five seconds.

Where things get annoying is when I push to GitHub and Tox runs through GitHub Actions. I've set up parallel runs for each Python version, but the "Prepare Tox" step (which is where Python packages are downloaded & installed) can take up to 3 minutes, whereas the "Run Tox" step (which is where pytest runs) takes 1½ minutes.

GitHub Actions has a much better network connection than me, but the free worker VMs are much slower. That is where I would look at making a change, continuing to use pip locally but using uv in GitHub Actions.

morshu9001

So you always have your deps' versions tracked, like in npm

atoav

A sure way to learn why it is needed would be to:

1. Write code that crosses a certain complexity threshold. Let's say you also need compiled wheels for a performance-critical section of a library that was written in Rust, and have some non-public dependencies on a company-internal git server

2. Try deploying said code on a fleet of servers whose version and exact operating system versions (and python versions!) are totally out of your control. Bonus points for when your users need to install it themselves

3. Wait for the people to contact you

4. Now do monthly updates on their servers while updating dependencies for your python program

If that was never your situation, congrats on your luck, but that just means you really weren't in a situation where the strengths of uv play out. I had to wrestle with this for years.

This is where uv shines. Install uv, run with uv. Everything else just works, including getting the correct python binary, downloading the correct wheel, downloading dependencies from the non-public git repo (provided the access has been given), ensuring the updates go fine, etc.

wzdd

This explains a lot for me. On the server side, all my for-pay stuff is deployed using Docker. We have a single Python environment and complete control over it. We do multistage for compilation.

Client side, we don't get the privilege of deploying code: we need to build installers, which means again we have complete control over the environment because we package Python and all associated dependencies.

I'm sure there are marginal benefits to uv even with the above scenarios (better dependency management for example), but it seems that there's a middle ground here which I have largely avoided which is where uv really shines.

atoav

Yeah makes sense, with docker in the mix the things uv brings are less interesting, although using uv for small one-off scripts is also an interesting application (there is a way of making uv your shebang, declaring dependencies within the Python file and essentially getting a uv-run Python script that will auto-download the needed dependencies).
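
That pattern is a one-file sketch like this (PEP 723 inline metadata; `requests` stands in for whatever the script actually needs):

    #!/usr/bin/env -S uv run --script
    # /// script
    # requires-python = ">=3.12"
    # dependencies = ["requests"]
    # ///
    import requests

    # uv reads the comment block above, provisions a matching interpreter and
    # environment on first run, then executes the script inside it.
    print(requests.get("https://example.com").status_code)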

Over the years I encountered many situations where other solutions (pip+pyenv, poetry, easy_install) led to hours-long stops in dependency hell. Meanwhile uv just works. Even the most complicated projects I transferred over since I decided to make the switch worked first try.

I am not the person who has to go for the newest shiniest thing just because, but if that new shiny thing does the job instead of wasting my time, sign me up.

andy99

If it sold itself on its merits I don’t think we’d see all these fawning posts about it. It’s a Rust fan thing. You can see how any criticism gets treated. I’m sure it works for some people and obviously if it does, then great. But it’s got this same weird cult following and pretend talk of speed that lots of Rust stuff has. It’s getting a little tiring. If you like it, use it, evangelizing is obnoxious.

CaliforniaKarl

I'm not a Rustacean, but I'll tell you what merited me installing uv (via its MacPorts package) last week.

I had decided to do something via a one-off Python script. I wanted to use some Python packages for the script (like `progressbar2`). I decided to use Inline Script Metadata[0], so I could include the package dependencies at the top of the script.

I'm not using pipenv or poetry right now, and decided to give uv a try for this. So I did a `sudo port install uv`, followed by a `uv run myscript.py --arguments`. It worked fine, making & managing a venv somewhere. As I developed the script, adding & changing dependencies, the `uv run …` installed things as needed.

After everything was done, cleanup was via `uv cache clean`.

Will I immediately go and replace everything with uv? No. As I mentioned in another post, I'll probably next look at using uv in my CI runs. But I don't feel any need to rush.

[0]: https://packaging.python.org/en/latest/specifications/inline...

saagarjha

What criticism do you have of it?

atoav

Yeah, but it sold itself on its merits. That is the point. Maybe venv and pip work fine for some toy projects that are deployed on a developer-controlled OS without regular dependency updates, but let me assure you I had hours of fights with updating Python services with complex needs on Debian boxes from various ages, while ensuring what I ran as a dev is the stuff that is guaranteed to run in production.

With uv it just works, and in a fraction of the time. Where before an update meant mentally preparing for the possibility that a thing that should take 5 seconds in the best case and 15 minutes in the worst could occupy my whole day, it has now become very predictable.

I don't care what it is written in. It works. If you think people love it because it was written in some language it just means you never had a job where what uv brings was really needed and thus you can't really judge its usefulness.

mbac32768

Also one has to chuckle at the notion that re-writing package management in Rust is some kind of fanbois with hammers looking for nails activism. Rust is almost certainly the best option for this in the 2020s, especially for a package ecosystem as deranged as Python's.

morshu9001

Could've been written in any other language and been just as good. Python needed something like this one way or another.

atoav

By this point I feel reminded of a former colleague I ate lunch with, who would repeatedly make jokes about how "vegans constantly need to talk about their veganism". During our shared time he brought that topic up probably a hundred times, while the single time veganism was actually brought up by anyone else was by a female intern, after he asked her why she doesn't want to try the meat.

This is what reflexive criticism of Rust starts to feel like. I get that this somehow grinds some peoples gears, but come on. Who cares what it is written in if it is good software. And as someone who tried all major ways of dependency management in Python I have to say it is the best. Don't like that it is written in Rust for ideological reasons? Go ahead and write it better in C¹ or whatever.

¹: Nothing against C, I regularly use it for embedded programming, but it appears many of the loudest Rust allergics come from there

kstrauser

What’s more obnoxious is dismissing it as a rust fanboy conspiracy. Know what I like about it? `uv pip install .` with a few dozen top-level dependencies takes under a second on my machine. All the tools work as documented all the time. `uv run …` is nearly instant. Those are the reasons I like it.

I couldn’t care less that it’s written in rust. It could be conjured from malbolge for all I care. It works as advertised, whatever it’s written in.

Alir3z4

I use golang, rust and c++ here and there, but majority of my time is spent working in Python projects. I'm not alien to the concept of speed and performance, especially the tooling around them.

While I like the idea of pip or uv to be insanely fast, I still don't see it revolutionize my development experience.

Installing and uninstalling packages is not something I do every 1 to 10 minutes, so it doesn't save me much time. Also, activating a venv is once per terminal session, and sometimes a week goes by without me ever activating a venv, because the IDE does that automatically on whatever I do.

That's why, personally for me it really doesn't change much.

Where I like things being fast in my development time is pre-commit and linting, where ruff shines. Even that I don't use, though: even though I work on a small-medium 600k LoC project, I only pass the changed files to isort, flake8 and black and it's all done in less than 5 seconds.

To me, the only advantage of uv is being fast, which is something I haven't been bothered with so far, where 99% of things happen in less than 1 or max couple of seconds.

kstrauser

The speed is nice. It’s not the only advantage, though. It’s so pleasant being able to `uv run [git-repo]` and having it work. The same design that makes it so fast makes it delightfully good at doing other complicated things.

atoav

Ever had customers deploy your project on 4 different debian versions without docker? Probably not, because there are problems lurking you didn't even know could exist. And 99% of them are gone with uv.

noosphr

The most unbelievable part of this story is that anyone using four versions of Debian has enough money to be customers to someone.

wtallis

You don't have to believe the parts you made up. The comment you're replying to didn't actually state that a single customer was deploying to four different Debian versions. As written, the comment only requires you to believe that four Debian versions were in use collectively across the customer base.

atoav

So you say I am lying?

noosphr

I'm saying they have much bigger problems than what python package manager they are using.

atoav

Granted, but that is relevant to the point I made in which way?

In reality you will have people running different OS versions. Maybe not within one org, but across users? For sure. If you are not using containers for one reason or another uv has shown to be a very good, reliable and easy to use way of dealing with the issues like these.

Additionally it has some other benefits when it comes to dev dependencies etc. Not that you couldn't somehow manage without it, it just makes certain things so much less painful than they were.

sunshowers

As an occasional Python user that loves uv -- I do care that it's in Rust, because Rust enforces a separation between mutable and immutable state that consistently leads to higher-quality outcomes.

kstrauser

I don’t totally not care that it’s written in Rust. That means there are whole classes of bugs it won’t have, and it’s probably rigorous about data structure and state management.

babl-yc

There's a swarm on HN that upvotes anything uv-related and downvotes anything questioning its added value as compared to pip

Alir3z4

Yes, that's a shame.

I noticed the comment from andy99 got several downvotes (became grey) and mine here also immediately got some.

zbentley

I didn’t downvote you, but the “this tool is bad and if you take the time to argue with me you’re a Rust cultist” line is a bit tiresome. Damned if you do, damned if you don’t.

It’s a bit like if anyone who said you should switch to desktop Linux got yelled at for being in the pocket of Big Systemd.

“if you like it, use it” is well and good, but haranguing people who explain why they like/use what they use is just as lame as the purported cult defense of uv or whatever tool is popular.

I dunno man, fads and stupid fixations happen in software sometimes, but most of the time hyped tools are hyped because they’re better.

forrestthewoods

Running a program should never ever require more than a single simple run command.

If your project requires creating an env and switching to it and then running, it’s a bad program and you should feel bad.

Quite frankly the fact that Python requires explaining and understanding a virtual environment is an embarrassing failure.

uv run foo.py

I never ever want running any python program to ever require more than that. And it better work first time 100%. No missing dependencies errors are ever permitted.

Also, Conda can fucking die in a fire. I wil never ever ever install conda or mini-conda onto my system ever again. Keep those abominations away.

markkitti

It sounds like there are many Python users who have acclimated to the situation of needing three or more tools to work with Python and do not see the benefit or value of being able to do this all with one potentially faster tool.

While I understand that some have acclimated well to the prior situation and see no need to change their methods, is there really no objective self-awareness that perhaps having one fast tool over many tools may be objectively better?

JoBrad

The Astral team did a great job with uv (and ruff!). I just wish they had used `install` instead of `add` and `sync`.

`uv install` = `uv sync`

`uv install rich` = `uv add rich`

saagarjha

I feel like the new terminology matches what it's doing better, though. You don't install things anymore, uv just makes the state of the world match what you asked for.

drcongo

I prefer shorter commands, so I just wish they'd gone with `uv rm` instead of `uv remove`.

adfm

Interesting to see Wagtail mentioned on HN. Anyone using it in production care to chime in on how uv improves your experience?

drcongo

We use Wagtail with uv all the time - uv is just better at every single thing it does than any other way of doing any of those things.

ai-christianson

Simple: uv run script.py just works on a clean box/CI, the lockfile keeps runs reproducible, and my CI “install deps” step is way faster now.
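
In GitHub Actions terms that step can be as small as this (a sketch using Astral's setup-uv action; the version pin and test command are placeholders):

    - uses: astral-sh/setup-uv@v5   # installs uv on the runner
    - run: uv sync --frozen         # install exactly what uv.lock pins
    - run: uv run pytest            # run tests inside the synced environment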

rednafi

I literally stopped writing Python for scripting a year ago - the distribution story was too painful. With LLMs, there's not much a dynamic language offers over something like Go even for quick scripting.

Also, on a new machine, I could never remember how to install the latest version of Python without fiddling for a while. uv solves the problem of both installation and distribution. So executing `uv run script.py` is kind of delightful now.

ThibWeb

it’s my 1st attempt at reporting on CI downloads specifically. Interpreting this is more of an art than a science, I’d love to hear if others have ideas on what to do with this data!

0xpgm

Still waiting to see how the VC-funded company behind Uv will make money.

Before that, I wouldn't want to be too dependent on it.

drcongo

This is all publicly available if you look.

make3

huggingface scares me in that regard

NSPG911

uv still has some issues: it cannot pull from global installations like pip can, so on Termux something like tree-sitter cannot be installed, because tree-sitter is provided by apt/pkg

wishitwerentso

It makes using python tolerable. Thank you Rust and the Astral team!

gatvol

UV is super fast and great for environment management; however, it's not at all well suited to a containerised environment, unless I'm missing something fundamental (unless you like using an env in your container, that is).

nickjj

uv works great in a container, you can tell it to skip creating a venv and use the system's version of Python (in this case, let's say Python 3.14 from the official Python image on the Docker Hub).
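
A rough sketch of that pattern, following uv's Docker integration docs (the image tags and app layout here are illustrative):

    FROM python:3.14-slim
    # copy the uv binary from Astral's distroless image
    COPY --from=ghcr.io/astral-sh/uv:latest /uv /uvx /bin/
    # point uv's "project environment" at the system interpreter, skipping .venv
    ENV UV_PROJECT_ENVIRONMENT=/usr/local
    WORKDIR /app
    COPY pyproject.toml uv.lock ./
    RUN uv sync --frozen --no-dev
    COPY . .
    CMD ["python", "main.py"]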

The biggest wins are speed and a dependable lock file. Dependencies get installed ~10x faster than with pip, at least on my machine.

Both of my Docker Compose starter app examples for https://github.com/nickjj/docker-flask-example and https://github.com/nickjj/docker-django-example use uv.

I also wrote about making the switch here: https://nickjanetakis.com/blog/switching-pip-to-uv-in-a-dock...

scuff3d

Works fine in containers as far as I can tell. I don't even bother configuring it to not use a venv. Doesn't hurt anything

bognition

Why not?

In my docker files I use `uv sync` to install deps vs `pip install -r requirements.txt`

And then set my command to `uv run my_command.py` vs calling Python directly.

amingilani

> it's not at all well suited to a containerised environment

Could you elaborate?

zahlman

> unless you like using an env in your container that is

A virtual environment, minimally, is a folder hierarchy and a pyvenv.cfg file with a few lines of plain text. (Generally they also contain a few dozen kilobytes of activation scripts that aren't really necessary here.) If you're willing to incur the overhead of using a container image in the first place, plus the ~35 megabyte compiled uv executable, what does a venv matter?
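
For scale, the entire pyvenv.cfg of a stdlib-created venv is about this (a sketch; uv-created ones add a couple of extra keys, such as the uv version):

    home = /usr/local/bin
    include-system-site-packages = false
    version = 3.12.3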

__float

Why not use a virtualenv in your container?

emeraldd

This is still a complete pain to work with. Virtualenv in general is a "worst of all worlds" solution. It has a lot of the same problems as just globally pip installing packages, requires a bit of path mangling to work right, or special Python configs, etc. In the past, it's also had a bad habit of leaking dependencies, though that was in some weird setups. It's one of the reasons I would recommend against Python for much of anything that needs to be "deployed" vs throwaway scripts. UV seems to handle all of this much better.

ghshephard

I'm intrigued. I've been using virtualenv in numerous companies for about 8 years, traditionally wrapped in virtualenvwrappers, and now in uv.

UV doesn't change any of that for me - it just wraps virtualenv and pip, and downloads dependencies (much, much) more quickly - the conversion was immediate and required zero changes.

UV is a pip / virtualenv wrapper. And It's a phenomenal wrapper - absolutely changed everything about how I do development - but under the hood it's still just virtualenv + pip - nothing changed there.

Can you expand on the pain you've experienced?

Regarding "things that need to be deployed" - internally all our repos have standardized on direnv (and in some really advanced environments, nix + direnv, but direnv alone does the trick 90% of the time) - so you just "cd <somedir>", direnv executes your virtualenv and you are good to go. UV takes care of the pip work.

Has eliminated 100% use of virtualenvwrappers and direct-calls to pip. I'd love to hear a use case where that doesn't work for you - we haven't tripped across it recently.

zahlman

> UV is a pip / virtualenv wrapper.

Not quite; it reimplements the pip functionality (in a much smarter way). I'm pretty sure it reimplements the venv functionality, too, although I'm not entirely sure why (there's not a lot of room for improvement).

("venv" is short for "virtual environment", but "virtualenv" is specifically a heavyweight Python package for creating them with much more flexibility than the standard library option. Although the main thing making it "heavyweight" is that it vendors wheels for pip and Setuptools — possibly multiple of each.)

zahlman

> It has a lot of the same problems as just globally pip installing packages

No, it doesn't. It specifically avoids the problem of environment pollution by letting you just make another environment. And it avoids the problem of interfering with the system by not getting added to sys.path by default, and not being in a place that system packages care about. PEP 668 was specifically created in cooperation between the Python team and Linux distro maintainers so that people would use the venvs instead of "globally pip installing packages".

> requires a bit of path mangling to work right, or special python configs, etc. In the past, it's also had a bad habit of leaking dependencies, though that was in some weird setups.

Genuinely no idea what you're talking about and I've been specifically studying this stuff for years.

> It's one of the reasons I would recommend against python for much of anything that needs to be "deployed" vs throw away scripts. UV seems to handle all of this much better.

If you're using uv, you're using Python.

emeraldd

> by letting you just make another environment

This is actually what I'm talking about... Why do I need a whole new Python environment rather than just scoping the dependencies of an application to that application? That model makes it significantly harder to manage multiple applications/utilities on a machine, particularly if they have conflicting package versions etc. Being able to scope the dependencies to a specific code base without having to duplicate the rest of the Python environment would be much better than a new virtualenv.

kstrauser

> Why do I need a whole new python environment rather than just scoping the dependencies of an application to that application?

But… that’s what a virtualenv is. That’s the whole reason it exists. It lets you run 100 different programs, each with its own different and possibly conflicting dependencies. And yeah, those dependencies are completely isolated from each other.

coeneedell

I haven’t really had this issue. UV’s recommendation is to mount the uv.lock and install those managed package versions into the container’s global pip environment. We haven’t had much issue at my work, where we use this to auto-manage Python developers’ execution environments at scale.

m000

> UV’s recommendation is to mount the uv.lock and install those managed package versions into the container’s global pip environment.

Source? That's an option, but it's not even explicitly mentioned in the related documentation [1].

[1] https://docs.astral.sh/uv/guides/integration/docker/

rtpg

https://docs.astral.sh/uv/guides/integration/docker/#interme...

You kind of have to read between the lines and "know" this is a good solution and then when you see it it's like "right of course".

cyberax

Mounting uv.lock doesn't actually work if you have intra-repository dependencies. UV can't deal with packages that lack metadata (because it's not mounted): https://github.com/astral-sh/uv/issues/15715

nomel

The only reason I haven't switched is its still barely-there support for air-gapped systems [1].

And lack of non-local venv support [2].

[1] https://github.com/astral-sh/uv/issues/10203

[2] https://github.com/astral-sh/uv/issues/1495

hodanli

Same here. I still use conda from time to time for this.

diath

> (unless you like using an env in your container that is).

What's the problem with that?

You just make your script's entry point be something like this:

    uv venv --clear
    uv sync
    uv run main.py
nodesocket

Doesn’t this mean pid 1 in the container is uv instead of python? Does uv run just spawn a child python process?

diath

> Does uv run just spawn a child python process?

Yes, I suppose you could use it in conjunction with something like https://github.com/krallin/tini.

nicwolff

    ENV UV_SYSTEM_PYTHON=1
aaronbrethorst

I'm stuck on poetry until Snyk adds support for uv. Ugh. If anyone from Snyk is reading this, please go yell at whoever Jacob is: https://support.snyk.io/s/question/0D5PU00000u1G4n0AE/suppor...

bkettle

Semgrep has supported uv for months now (I added it).

whalesalad

uv kinda drives me nuts with the hoops you have to go through to use a pre-existing virtual environment. it seems really keen on doing that itself.

scuff3d

Why not just delete the virtual environment and let uv reinstall it?

hansonkd

What about activating the old virtualenv and using --active flag?
