uv downloads overtake Poetry for Wagtail users

wagtail.org

228 points by ThibWeb 10 months ago · 205 comments

heisig 10 months ago

I recently switched to uv, and I cannot praise it enough. With uv, the Python ecosystem finally feels mature and polished rather than like a collection of brittle hacks.

Kudos to the uv developers for creating such an amazing piece of software!

  • matsemann 10 months ago

    Yeah, I switched to writing Python professionally ~4 years ago, and I've been low-key hating the ecosystem. Coming from a Java and JavaScript background, it's mostly been npm/mvn install and it "just works". With Python, there's always someone being onboarded who can't get it to work. So many small issues: you have to have the correct version per project, then get the venv running. And then installing needs to build stuff because there's no wheel, so you need to set up a complete C++ and Rust toolchain etc., just to pull a small project and run it.

    uv doesn't solve all this, but it's reduced the amount of ways things can go wrong by a lot. And it being fast means that the feedback-loop is much quicker.

    • gunalx 10 months ago

      I can't say I share that experience. mvn is a buggy mess, randomly forgetting dependencies and constantly needing a full clean to not die on itself. npm and the entire JS ecosystem feel immature, with constant breaking changes and circular dependency hell when trying to upgrade stuff.

      • PaulHoule 10 months ago

        I've seen mvn projects that spin like a top and others that were a disaster.

        I think it's little recognized that there is a scaling limit for snapshots. If you have 20 people developing 20 projects and they are co-located in the same room as the server, builds work 50-80% of the time and people think it's fine. If you're the one guy who is remote on a slow connection, builds work 0% of the time. The problem is that at slightly different times you get slightly different snapshots that aren't compatible with each other -- it's a scaling problem, because if you add enough developers and enough projects it will eventually get you.

        I've worked at other places where an mvn clean was necessary every time; other developers thought this shouldn't be necessary and that I was a doofus, except I was able to make consistent progress like a ratchet and get the project done, and they weren't.

        Where I am now, mvn is just fine; whenever it screws up there's a rational explanation, and it's because we're doing it wrong.

      • Etheryte 10 months ago

        That's an issue with the packages themselves though, not with package management as a whole. You and the comment above you are talking about different things. While there's plenty of pain to be had with npm, if you have a project that used to work years ago, you can generally just clone, install and be done, even if on older versions. On Python this used to mean a lot of hurt, often even if it was a fresh project that you just wanted to share with a colleague.

        • TeMPOraL 10 months ago

          For values of "years" greater than 1?

          Node/NPM was a poster child of an ecosystem where projects break three times a week, due to having too many transitive dependencies that are being updated too often.

          • Etheryte 10 months ago

            This argument makes no sense. Your dependencies don't change unless you change them; npm doesn't magically update things underneath you. Things can break when you try to update one thing or another, yes, but if you just take an old project and try to run it, it will work.

            • TeMPOraL 10 months ago

              Assuming the downloads still exist? Does NPM cache all versions it ever distributed?

              That's always one major thing I saw breaking old builds: old binaries stop being hosted, forcing you to rebuild them from old source, which no longer builds under current toolchains - making you either downgrade the toolchain that itself may be tricky to set up, or upgrade the library, which starts a cascade of dependency upgrades.

              It's not like Node projects are distributed with their deps vendored; there's too much stuff in node_modules.

              • Etheryte 10 months ago

                > Does NPM cache all versions it ever distributed?

                Yes it does, that's the whole point. You can still go and install the first version of express ever put on npm from 12 years ago. You can also install any of the 282 releases of it that have ever been put on npm since then. That's the whole point of a registry, it wouldn't be useful if things just disappeared at some random point in time.

                The only packages that get removed are malware and such, and packages which the vendor themselves manually unpublish [0]. The latter has a bunch of rules to ensure packages that are actually used don't get removed, please see the link below.

                [0] https://docs.npmjs.com/policies/unpublish

                • aitchnyu 10 months ago

                  IIRC there is a package whose whole point is to include everything else in its package.json and make them ineligible for unpublish.

              • lucianbr 10 months ago

                You're using a different, not hosted anymore package, three times a week? That's somewhere between very unusual and downright absurd.

                Yes you can find edge cases with problems. Using this as an argument for "breaks 3 times per week" does not hold.

                • TeMPOraL 10 months ago

                  No, I was using this as an argument for why I don't expect Node projects older than a year or two to be buildable without significant hassle.

                  (Also note that outside the web/mobile space, projects that weren't updated in a year are still young, not old. "Old" is more like 5+ years.)

                  The two things are related. If your typical project has a dependency DAG of 1000+ projects, a bug or CVE fix somewhere will typically cause a cascade of potentially breaking updates to play out over multiple days, before everything stabilizes. This creates pressure for everyone to always stay on the bleeding edge; with a version churn like this, there's only so many old (in the calendar sense) package dists that people are willing to cache.

                  This used to be a common experience some years back. Like many others, I gave up on the ecosystem because of the extreme fragility of it. If it's not like that anymore, I'd love to be corrected.

                  • afiori 10 months ago

                    I don't know if it is still as fragile as you remember, but if you just never update your package-lock then it is super stable, as your (transitive) dependencies never change.

                    The non-trivial exception being if some dependency was downloading resources on the fly (maybe like a browser compat list) or calling system libraries (e.g. running shell commands).

            • nicoburns 10 months ago

              > npm doesn't magically update things underneath you

              It used to prior to npm 5 when lockfiles were introduced (yarn introduced lockfiles earlier).

          • wfme 10 months ago

            Projects breaking so frequently on npm and node is simply not the case, unless you are trying to upgrade an old project one dependency per day…

      • matsemann 10 months ago

        I'm not saying mvn or npm is perfect. But the issues they have are consistent. My coworker and I would either have the same issues or not any issues. But with python it's probably more ways of running the project in the team than there are people, all with small tweaks to get it working on their system.

    • notpushkin 10 months ago

      Python has been mostly working okay for me since I switched to Poetry. (“Mostly” because I think I’ve run into some weird issue once but I’ve tried to recall what it was and I just can’t.)

      uv felt a bit immature at the time, but it sounds like it's way better now. I really want to try it out... but Poetry just works, so I don't really have an incentive to switch just yet. (Though I've switched from FlakeHeaven or something to Ruff and the difference was heaven and hell! Pun intended.)

      • ThibWebOP 10 months ago

        A lot of Wagtail usage is with Poetry. Tends to be projects with 30-50 dependencies. It "just works" but we see a lot of people struggle with common tasks ("how do I upgrade this package"), and complain about how slow it is. I don’t have big insights outside of Wagtail users but I don’t think it’s too different.

        • raihansaputra 10 months ago

          n=1 but I've tried "manual" .venv, conda/miniconda, pipenv, poetry, and finally now uv. uv is great. Poetry feels like it's focused on people who are publishing packages. uv is great for personal dev: spinning up/down lots of venvs, speedy, and uvx/uv scripts are very convenient compared to having all my sandbox projects in one poetry env.

  • Wulfheart 10 months ago

    Ok, you convinced me to give it a try. Tbh, I am a casual user of python and I don't want to touch it unless I have a damn good reason to use it.

    • mlnj 10 months ago

      You don't need a damn good reason for this. Just try it out on a simple hello world, then try it on a project already using Poetry, for example.

          uv init
          uv sync

      and you're done.

      I'd say that if you don't run into the pitfalls of a large Python codebase with hundreds of dependencies, you won't get the bigger argument people are making.

      • stavros 10 months ago

        I don't think you need to sync, do you? It always just does it when running.

        That said, I do wish uv had `uv activate`. I like just working in the virtualenv without having to `uv run` everything.

        • hobofan 10 months ago

          I do usually include instructions in our READMEs to run `uv sync` as the install command, in order to separate error causes, and also to bootstrap the venv so that it's available for IDEs.

        • drcongo 10 months ago

          You can still `source .venv/bin/activate(.fish)` and skip the uv run bit. I have Fish shell configured to automatically activate a .venv if it finds one in a directory I switch to.

          • stavros 10 months ago

            I do do that, can you please share your fish script to autoload it? I have something for Poetry envs, but not venv dirs.

            • drcongo 10 months ago

              Sure thing - I mostly ended up using this for activating a .venv in a fabfile directory...

                  function __auto_fab --on-variable PWD
                      iterm2_print_user_vars
                      if [ -d "fabfile" ]
                          if [ -d "fabfile/.venv" ]
                              if not set -q done_fab
                                  and not set -q VIRTUAL_ENV
                                  echo -n "Starting fabfile venv... "
                                  pushd fabfile > /dev/null
                                  source .venv/bin/activate.fish  --prompt="[fab]"
                                  popd > /dev/null
                                  set -g done_fab 1
                                  echo -e "\r Fabfile venv activated         "
                              end
                          else
                              echo "Run gofab to create the .venv"
                          end
                      end
                  end
              
              
              I've since deleted the one to do a .venv in this directory, but I think it was roughly this...

                  function __auto_venv --on-variable PWD
                      if [ -d ".venv" ]
                          if not set -q done_venv
                              echo -n "Starting venv... "
                              source .venv/bin/activate.fish  --prompt="[venv]"
                              set -g done_venv 1
                              echo -e "\r Venv activated         "
                          end
                      end
                  end
              
              
              (just tested that and it seems to work - the --prompt actually gets overridden by the project name from uv's pyproject.toml now though so that's not really necessary, was useful at some point in the past)

              These live in ~/.config/fish/conf.d/events.fish

            • SAI_Peregrinus 10 months ago

              I'm not them, but I use `direnv` for this. Their wiki includes two layout_uv[1] scripts, one that uses `uv` just to activate a regular venv and a second that uses it to manage the whole project. I use the latter.

              [1] https://github.com/direnv/direnv/wiki/Python
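
              For reference, a minimal `.envrc` using the second of those layouts could look like this (a sketch, assuming the `layout_uv` function from the wiki has been copied into `~/.config/direnv/direnvrc`):

                  # .envrc — direnv evaluates this on cd; layout_uv (from the
                  # direnv wiki) creates and activates the project venv via uv
                  layout uv

              With that in place, run `direnv allow` once and the venv activates automatically whenever you enter the directory.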

              • stavros 10 months ago

                That's great, thanks! I use direnv but didn't know they had this.

                • SAI_Peregrinus 10 months ago

                  Custom layouts are awesome. You can set up any script to run when direnv runs, so you can support just about anything you want even before direnv adds a builtin.

        • 0cf8612b2e1e 10 months ago

          I keep going back and forth on ‘uv run’. I like being explicit with the tooling, but it feels like extra unneeded verbosity when you could just interact with the venv directly. Especially since I ported a bunch of scripts from ‘poetry run’.

    • jorvi 10 months ago

      > I am a casual user of python and I don't want to touch it unless I have a damn good reason to use it.

      I... what? Python is a beautiful way to evolve beyond the troglodyte world of sh for system scripts. You're seriously missing out by being so adamantly against it.

      • OutOfHere 10 months ago

        Just you wait till someone shows you how Rust is to Python what Python is to shell scripts. For one, null safety is a major issue in most corporate Python code, and much less of an issue in Rust code.

        • jorvi 10 months ago

          Rust is decidedly not a scripting language.

          Don't get me wrong, Rust is great and I use it too, but for very different purposes than (system) scripts.

  • ffsm8 10 months ago

    Now, if only I hadn't read literally the same message about Pipenv/Pipfile and Poetry before...

    Python is going through package managers like JS goes through trends: classes everywhere, hooks, signals, etc.

    • OutOfHere 10 months ago

      There have been incremental evolutionary improvements that were brought forth by each of the packages you named. uv just goes a lot further than the previous one. There have been others that deserve an honorary mention, e.g. pip-tools, pdm, hatch, etc. It's going to be very hard for anything to top uv.

  • amelius 10 months ago

    But how does it work with components that require libraries written in C?

    And what if there are no binaries yet for my architecture, will it compile them, including all the dependencies written in C?

    • matrss 10 months ago

      IMO if you require libraries in other languages, then a pure Python package manager like uv, pip, poetry, whatever, is simply the wrong tool for the job. There is _some_ support for this through wheels, and I'd expect uv to support them just as much as pip does, but they feel like a hack to me.

      Instead there is pixi, which is similar in concept to uv but for the conda-forge packaging ecosystem. Nix and guix are also language-agnostic package managers that can do the job.

      • amelius 10 months ago

        But for example, if I install the Python package "shapely", it will need a C package named GEOS as a shared library. How do I ensure that the version of GEOS on my system is the one shapely wants? By trial and error? And how does that work with environments, where I have different versions of packages in different places? It sounds a bit messy to me, compared to a solution where everything is managed by a single package manager.

        • dagw 10 months ago

          You are describing two different problems. Do you want a shapely package that runs on your system, or do you want to compile shapely against the GEOS on your system? In case 1 it is up to the package maintainer to package and ship a version of GEOS that works with your OS, Python version, and library version. If you look at the shapely page on PyPI you'll see something like 40 packages for each version, covering the most popular permutations of OS, Python version and architecture. If a pre-built package exists that works on your system, then uv will find it and install it into your virtualenv, and everything should just work. This does mean you get a copy of the compiled libraries in each venv.

          If you want to build shapely against your own version of GEOS, then you fall outside of what uv does. What it does in that case is download all the build tool(s) specified by shapely (setuptools and cython in this case) and then hand over control to that tool to handle the actual compiling and building of the library. In that case it is up to the creator of the library to make sure the build is correctly defined, and up to you to make sure all the necessary compilers, headers etc. are set up correctly.
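
          As a rough illustration of how those prebuilt packages get matched: a wheel's filename encodes the interpreter, ABI, and platform it targets (PEP 427). The filename below is illustrative rather than an exact shapely release:

              # Wheel filenames follow name-version-pythontag-abitag-platformtag.whl;
              # installers compare these tags against what the running interpreter supports.
              fname = "shapely-2.0.6-cp312-cp312-manylinux_2_17_x86_64.whl"
              name, version, py_tag, abi_tag, plat_tag = fname[:-len(".whl")].split("-")
              print(py_tag, abi_tag, plat_tag)
              # cp312 cp312 manylinux_2_17_x86_64

          Both pip and uv pick the best-matching tag combination, which is why one release has dozens of wheels on PyPI.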

          • amelius 10 months ago

            In the first case, how does the package maintainer know which version of libc to use? It should use the one that my system uses (because I might also use other libraries that are provided by my system).

            • dagw 10 months ago

              The libc version(s) to use when creating python packages is standardised and documented in a PEP, including how to name the resulting package to describe the libc version. Your local python version knows which libc version it was compiled against and reports that when trying to install a binary package. If no compatible version is found, it tries to build from source. If you are doing something 'weird' that breaks this, you can always use the --no-binary flag to force a local build from source.
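
              As a concrete stdlib-level illustration, the interpreter itself can report which libc it is linked against, which is what gets compared to a wheel's manylinux tag (PEP 600):

                  import platform

                  # Returns the libc the interpreter was built against,
                  # e.g. ('glibc', '2.35') on a typical Linux distro
                  # (empty strings on macOS/Windows).
                  name, version = platform.libc_ver()
                  print(name, version)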

        • stabbles 10 months ago

          You could use a package manager that packages C, C++, Fortran and Python packages, such as Spack: here's the py-shapely recipe [1] and here is geos [2]. Probably nix does similar.

          [1]: https://github.com/spack/spack/blob/develop/var/spack/repos/... [2]: https://github.com/spack/spack/blob/develop/var/spack/repos/...

        • matrss 10 months ago

          That's what I mean, in this case pip, uv, etc. are the wrong tool to use. You could e.g. use pixi and install all python and non-python dependencies through that, the conda-forge package of shapely will pull in geos as a dependency. Pixi also interoperates with uv as a library to be able to combine PyPI and conda-forge packages using one tool.

          But conda-forge packages (just like PyPI packages, or anything that does install-time dependency resolution really) are untestable by design, so if you care about reliably tested packages you can take a look at nix or guix and install everything through that. The tradeoff with those is that they usually have fewer libraries available, and often only in one version (since every version has to be tested with every possible version of its dependencies, including transitive ones and the interpreter).

          All of these tools have a concept similar to environments, so you can get the right version of GEOS for each of your projects.

          • amelius 10 months ago

            Indeed, I'd want something where I have more control over how the binaries are built. I had some segfaults with conda in the past, and couldn't find where the problem was until I rebuilt everything from scratch manually and the problems went away.

            Nix/guix sound interesting. But one of my systems is an NVIDIA Jetson, where I'm tied to the system's libc version (because of CUDA libraries etc.), so building things is a bit trickier.

            • dagw 10 months ago

              With uv (and pip) you can pass the --no-binary flag and it will download the source code and build all your dependencies, rather than downloading prebuilt binaries.

              It should also respect any CFLAGS and LDFLAGS you set, but I haven't actually tested that with uv.

              • amelius 10 months ago

                I just tried --no-binary with the torchvision package (on a Jetson system). It failed. Then I downloaded the source and it compiled without problems.

        • macNchz 10 months ago

          This type of situation is why I use Docker for pretty much all of my projects—single package managers are frequently not enough to bootstrap an entire project, and it’s really nice to have a central record of how everything needed was actually installed. It’s so much easier to deal with getting things running on different machines, or things on a single machine that have conflicting dependencies.

          • OutOfHere 10 months ago

            Docker is good for deployment, but devcontainer is nice for development. Devcontainer uses Docker under the hood. Both are also critically important for security isolation unless one is explicitly using jails.

        • imtringued 10 months ago

          What exactly prevents you from creating your own packages if you want to use your system package manager?

          On Alpine and Arch Linux? Exactly nothing.

          On Debian/Ubuntu? maybe the convoluted packaging process, but that's on you for choosing those distributions.

    • dagw 10 months ago

      uv is not (yet) a build system and does not get involved with compiling code, but it easily lets you plug in any build system you want. So it will let you keep using whatever system you are currently using for building your C libraries. For example, I use scikit-build-core for building all of my libraries' C and C++ components with cmake, and it works fine with uv.

      • datadeft 10 months ago

            uv build
            Building source distribution...
            running egg_info
            writing venv.egg-info/PKG-INFO
            Successfully built dist/venv-0.1.0.tar.gz
            Successfully built dist/venv-0.1.0-py3-none-any.whl

        • dagw 10 months ago

          I guess it depends on what you mean by a build system. From my understanding uv build basically just bundles up all the source code it finds, and packages it into a .whl with the correct metadata. It cannot actually do any build steps like running commands to compile or transform code or data in any way. For that you need something like setuptools or scikit-build or similar. All of which integrate seamlessly with uv.

          • sirfz 10 months ago

            It actually does exactly what pip does, depending on your configured build backend: if you have your pyproject.toml/setup.py configured to build external modules, `uv build` will run that and build a binary wheel.
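
            For what it's worth, that delegation happens through the standard `[build-system]` table in pyproject.toml; a sketch of what `uv build` would read (setuptools chosen here only as an example backend):

                # pyproject.toml — `uv build` invokes whatever backend is
                # declared here to produce the sdist and wheel
                [build-system]
                requires = ["setuptools>=64"]
                build-backend = "setuptools.build_meta"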

            • dagw 10 months ago

              Yes, that's my point. You need to bring your own 'real' build system for uv to do anything non-trivial. And the fact that this works transparently with uv is a very good thing.

          • datadeft 10 months ago

            I see what you mean. You can use it with mise, which has build support.

    • sirfz 10 months ago

      Yes it'll build any dependency that has no binary wheels (or you explicitly pass --no-binary) as long as said package supports it (i.e. via setup.py/pyproject.toml build-backend). Basically, just like pip would

  • freeamz 10 months ago

    • the_mitsuhiko 10 months ago

      Unlike uv this tool is unlikely to solve problems for the average Python user and most likely will create new ones.

      • freeamz 10 months ago

        Agreed, though for users who want to get faster speed out of Python, wouldn't that just work with RustPython? It can also run in the browser then.

        • chippiewill 10 months ago

          RustPython is just an interpreter written in Rust. There's no reason why it would be meaningfully faster than CPython just because it's written in Rust rather than C. Rust adds memory safety, not necessarily speed.

          A new and immature interpreter is going to have other problems:

          - Lack of compatibility with CPython
          - Not being up to date with the latest version's features
          - Incompatibility with CPython extensions

          RustPython is a cool project, but it's not reached the big time yet.

IshKebab 10 months ago

Not a surprise. I said it before and I'll say it again, all the competing projects should just shut up shop for the good of Python. uv is so much better it's like pushing penny farthings after the safety bike has been invented.

  • nikisweeting 10 months ago

    That's rough for all the creators of poetry, pdm, pipenv, etc. to hear. They put in a ton of great work over the last decade, but I fear you may be right.

    • alwyn 10 months ago

      I really quite like pdm! I can see why maybe poetry, and especially pipenv, might be replaced with uv, but what's the value of uv over pdm beyond performance? It ticks all my boxes otherwise.

      • fnord123 10 months ago

        >but what's the value of uv over pdm beyond performance

        uv is not written in Python, so it doesn't suffer from the bootstrap problem of needing a Python version installed to begin using it. Users (new and even experienced) get confused and annoyed when they try to use Python tooling in the same venv as their application instead of using pipx.

        People also get confused and annoyed when, on a Mac, they run `brew upgrade` and find themselves with Python 3.13 or whatever version is newest (yes, we can pin to python@3.11 or whatever), so pyenv is a good option.

        So now you have pdm, pipx, and pyenv to manage all this stuff. With uv all this hassle goes away.

      • quickslowdown 10 months ago

        I came to uv from pdm, and the only reason I switched is the sheer speed and simplicity of uv. Pdm is such a great utility, and it can use uv as the package solver, but uv still has it beat on raw speed, and it feels simpler to use (whether or not it actually is).

      • ZuLuuuuuu 10 months ago

        I am feeling the same way about PDM, it works very well, easy to configure and checks all the boxes feature-wise.

      • francasso 10 months ago

        Beyond performance? Performance!

      • nikisweeting 10 months ago

        pdm is actually my favorite too, I used it on ArchiveBox for years and loved it. I still use it as the build backend instead of hatch in some places

    • slightwinder 10 months ago

      They served their purpose for the decade, so they can be happy that they did their part to pave the road for a good successor. uv will some day find its successor too; this is how software lives. Celebrate the life, don't cry about how it ends.

    • rendaw 10 months ago

      Oh yeah, pipenv which was a shoddy mess that used personal connections and reputation to get promoted on the python website and poetry where the developer did a good job dismissing requests to support common use cases (like overriding dependencies).

qwertox 10 months ago

I've read so much positive feedback about uv, that I'd really like to use it, but I'm unsure if it fits my needs.

I was heavily invested into virtualenv until I had to upgrade OS versions, which upgraded the Python versions and therefore broke the venvs.

I tried to solve this by using pyenv, but the need to recompile Python on every patch wasn't something I would accept, especially on boards like Raspberry Pis.

Then I tried miniconda which I initially only liked because of the precompiled Python binaries, and ultimately ended up using pyenv-managed miniforge so that I could run multiple "instances" of miniforge and therefore upgrade miniforge gradually.

Pyenv also has a plugin which allows setting suffixes on environments, which lets me have multiple miniforges of the same version in different locations, like miniforge-home and miniforge-media. -home has all its files in the home dir, which is contained in a VM image; -media has all its files on a mounted NVMe, which is where I put projects with huge dependencies like CUDA, so they don't clutter home.

It works really great, Jupyter and vscode can use them as kernels/interpreters, and it is fully independent of the OS's Python, so that OS upgrades (22.04 -> 24.04) are no longer an issue.

But I'm reading about all these benefits of uv and wish I could use it, but somehow my setup seems to have tied my hands. I think I can't use uv in my projects.

Any recommendations?

Edit: Many of my projects share the same environment, this is absolutely normal for me. I only create a new environment if I know that it will be so complex that it might break things in existing environments.

  • the_mitsuhiko 10 months ago

    I’m a bit confused why uv is not an option for you. You don’t need to compile Python, it manages virtualenvs for you, you can use them with Jupyter and vscode. What are you missing?

    • qwertox 10 months ago

      So the only difference is that Conda also isolates "system" libraries (like libcublasLt.so), or does uv also do this?

      It's not that uv is not an option for me, I made this move to miniforge before uv was on my radar because it wasn't popular, but I'm still at a point where I'm not sure if uv can do what I need.

      • dharmab 10 months ago

        Try Pixi (https://prefix.dev/). It uses uv for Python while also managing your other libraries from Conda. It has a migration path from Conda.

      • rmholt 10 months ago

        According to these docs

        https://docs.astral.sh/uv/pip/environments/

        I think uv supports conda envs

      • the_mitsuhiko 10 months ago

        uv does not ship system libraries because pypi does not have them. There is a philosophical difference between pypi and conda today. I believe over time pypi will likely ship some system libraries but we will see.

        • qwertox 10 months ago

          So uv is restricted to PyPI but does offer isolated Python installations, with precompiled Python binaries?

          • chippiewill 10 months ago

            Yeah it has precompiled Python binaries. They're custom standalone builds of CPython: https://github.com/astral-sh/python-build-standalone

            You can also import existing Python versions into uv, for example on my Mac it has imported the Homebrew versions.

          • the_mitsuhiko 10 months ago

            It doesn’t “restrict” to pypi but it wants to be rooted in pypi. That means if you install “tensorflow” you get it from there. The state of pypi is the state of pypi and I have some hope that this also will improve. See for instance the efforts that go into “wheel next”.

  • be7a 10 months ago

    Have you checked out https://github.com/prefix-dev/pixi? It's built by the folks who developed Mamba (a faster Conda implementation). It supports PyPI dependencies using UV, offers first-class support for multi-envs and lockfiles, and can be used to manage other system dependencies like CUDA. Their CLI also embraces much of the UX of UV and other modern dependency management tools in general.

  • datadeft 10 months ago

    I moved to uv a few months back and never looked back. I use it with venv and it works very well. There is a new way of handling environments with uv:

        uv init new-py-env
        cd new-py-env
        uv add jupyter
        uv build

    These commands run super fast. Not sure if this helps your situation, but it's worth being aware of them.

  • secondcoming 10 months ago

    The python ecosystem has become a disaster. Even reading your post gave me a headache.

mihaic 10 months ago

I keep reading praise about uv, and every single time I never really understand what problems it addresses.

I've got a couple of quite big Django projects for which I've used venv for years, and not once have I had any significant issues with it. Speed could have been better at times, and I would have liked a full lock file of the dependency tree, but neither ever caused me real problems.

The only thing that comes to mind is those random fails to build of C/C++ dependencies. Does uv address this? I've always seen people rave about other benefits.

  • chippiewill 10 months ago

    The benefit that uv adds is it's a one-stop-shop that's also wicked fast.

    If you use venv then you have extra steps because you have to explicitly create the venv, then explicitly install the deps there with pip. If your project is designed for a specific python version then developers have to manage that separately (usually pyenv these days).

    For people building apps uv replaces venv, pip and pyenv, while being way faster at doing all three of those (you can completely rebuild the virtualenv and install the dependencies from scratch in under a second usually because uv is faster at creating a virtualenv than venv and is very quick at relinking the dependencies from a package cache).

  • hansihe 10 months ago

    What makes it so great for me is the effortlessness.

    I often use Python for quick one off scripts. With UV I can just do `uv init`, `uv add` to add dependencies, and `uv run` whatever script I am working on. I am up and running in under a minute. I also feel confident that the setup isn't going to randomly break in a few weeks.
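
    For throwaway scripts you don't even need a project: `uv run` also understands PEP 723 inline script metadata, so the dependency list can live in the script itself. A minimal, stdlib-only sketch (the dependencies list is empty here; add e.g. "requests" as needed):

```python
# Save as hello.py, then run with: uv run hello.py
# The comment block below is PEP 723 inline script metadata; uv reads
# it and builds a throwaway environment automatically. With an empty
# dependencies list this also runs as plain Python.
# /// script
# requires-python = ">=3.9"
# dependencies = []
# ///
import json
import sys

payload = {"tool": "uv", "ok": True}
json.dump(payload, sys.stdout)  # emits {"tool": "uv", "ok": true}
```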

    With most other solutions I have tried in the Python ecosystem, it always seemed significantly more brittle. It felt more like a collection of hacks than anything else.

  • ashikns 10 months ago

    I'm in the same boat. Sure it's nice and better, but I haven't felt so much annoyance with the python ecosystem that I desperately need something better. I use VS Code and it takes care of venv automatically, so I am biased by that.

brylie 10 months ago

As an aside, I can't praise the Wagtail CMS highly enough. It sets a high bar for usability and accessibility of the auto-generated content management UI.

The developer experience is top notch with excellent documentation and many common concerns already handled by Wagtail or Django. A significant amount of Wagtail-specific code is declarative, essentially describing data model, relationships, and UI fields. The parts that you don't need stay out of the way. It's also agnostic of the type of front-end you want, with full and automatic support for headless mode with JavaScript client, using traditional Django templates SSR, or using a dynamic approach like HTMX.

Kudos to the Wagtail team!

  • ThibWebOP 10 months ago

    ty! We have no plans to rewrite Wagtail in Rust but I hope there are ways in which we can make the developer experience better, particularly around dependency management

ZuLuuuuuu 10 months ago

PyCharm also added uv support in their latest versions.

We recently switched to PDM in our company because it worked very well in our tests with different package/dependency managers. Now I am rethinking if we should switch to uv while PDM usage is still not very wide-spread in our company. But PDM works very well, so I am not sure whether to keep using it.

  • ThibWebOP 10 months ago

    With the caveat I only have the package installers usage data for Wagtail downloads – pdm usage has fallen off a cliff, from 0.2% of downloads in January 2024, to 0.01% in January 2025. Roughly matches the uptake of uv.

    Doesn’t make pdm bad in itself but that means there’ll be fewer pdm users around to report bugs, potentially fewer contributors to it too, fewer resources, etc.

    • ZuLuuuuuu 10 months ago

      Indeed, on one hand PDM works great, but on the other hand we wouldn't want to choose a package manager which might not be maintained anymore after a few years because there are just not many users of it.

  • chippiewill 10 months ago

    Back when PDM was still pushing __pypackages__ for standardisation I think PDM made sense, but honestly I don't think it adds anything over uv and is just going to be slower for the most part.

BerislavLopac 10 months ago

As much as I am glad that it looks like one solution is being more and more accepted as the gold standard, I'm a little disappointed that PDM [0] -- which has been offering pretty much everything uv does for quite some time now -- has been completely overlooked. :(

[0] https://pdm-project.org

TOMDM 10 months ago

For the uninitiated what is the benefit of UV over pip?

I've been working with pip for so long now that I barely notice it unless something goes very wrong.

  • NeutralForest 10 months ago

    - uv is aware of your dependencies: you can add/remove development dependencies, create groups of development dependencies (test, lint, dev, etc.) and add or remove those and only those at will. You can add dependencies and optional dependencies for a project as well, think my_app[cli,standard]. You don't need different requirements.txt files for each case, nor do you need to remove things by hand as you would with pip, which doesn't remove sub-dependencies when you uninstall a package. As a result, you can remove {conda,poetry,...} from your workflows.

    - uv can install python and a virtualenv for you. Any command you run with `uv run` from the root of a repo will be aware of its environment, you don't even need to activate a virtualenv anymore. This replaces {pyenv, pyenv-virtualenv, virtualenvwrapper,...}.

    - uv follows the PEPs for project config (dependencies, optional dependencies, tool configs) in the pyproject.toml, so in case uv dies, it's possible to migrate away, since the features are defined in the PEPs. Which is not the case for, say, poetry.

    - uv has a lock file and it's possible to make deps platform specific (Windows, Linux, MacOS, etc). This is in compliance with a PEP but not supported by all tools.

    - uv supports custom indexes for packages so you can prefer a certain index, for example your company package index or pytorch's own index (for ML work).

    - very fast, makes local dev very seamless and is really helpful in CI/CD where you might just setup and tear down python envs a lot.

    Also, the team is responsive on Github so it's easy to get help.
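
    The dependency groups and extras described above are plain pyproject.toml tables; a hypothetical sketch (project and package names are illustrative):

```toml
[project]
name = "my-app"
requires-python = ">=3.11"
dependencies = ["django>=5.0"]

[project.optional-dependencies]   # installable extras: my-app[cli]
cli = ["click>=8"]

[dependency-groups]               # PEP 735 groups, e.g. uv sync --group lint
test = ["pytest>=8"]
lint = ["ruff"]
```

    Commands like `uv add --group lint ruff` edit these tables for you, so the file rarely needs hand-editing.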

    • adrian17 10 months ago

      Does this also replace, or work well with tox? We currently use it to run basic CI/local workflows (`tox -e lint` for all linters, `tox -e py310`, `tox -e py312` to run tests suites on chosen interpreters' environments), and to set up a local environment with package installed in-place (so that we can run `myprogram -arg1 -arg2` as if it was installed via `pip`, but still have it be editable by directly editing the repo).

      With how much the ecosystem is moving, I don't know whether the way we're doing it is unusual (Django and some other big projects still have a tox.ini), obsolete (I can't find how uv obsoletes this), or perfectly fine and I just can't find how to replace pip with uv for this use case.

      • NeutralForest 10 months ago

        I'm not personally releasing a ton of internal packages where I work, but I know of https://github.com/tox-dev/tox-uv. I haven't tried it yet, but it seems to do what you want. I also saw that nox (like tox, but configured in Python instead of a tox.ini file: https://nox.thea.codes/en/stable/config.html) supports uv, from what I understand.

        I don't think there's a definite answer yet.

        • sco1 10 months ago

          tox-uv has been a great selling point for my personal use of uv. I'm typically testing across 4-5 different versions of Python and the build speedup has been significant.

      • quickslowdown 10 months ago

        Uv works fine with tox, but have you tried nox? I only dipped my toes in tox, but I found nox around the same time and gravitated to it. I replaced PDM's "scripts" concept with nox sessions. I have a project where most of the functionality is nox sessions I call in CI pipelines. Writing sessions in pure python opens so many doors.

      • chippiewill 10 months ago

        I think the uv team intend to have a solution around tox (almost certainly replacing it), but haven't done so yet.

    • TOMDM 10 months ago

      Honestly this sounds more likely to replace some workflows I historically would have done with Docker.

      The pain of creating a python environment that is durable across different deployments had me going for the nuclear option with full containerisation.

    • mbeex 10 months ago

      ...

      - uv tool replaces pipx etc.

      - uv pip tree replaces pipdeptree (including 'inverse' mode)

      - ...

  • rschiavone 10 months ago

    Not only is it faster, it also provides a lock file, `uvx tool_name` just like `npx`, and a comprehensive set of tools to manage your Python version, your venv and your project.

    You don't need `pyenv`, `poetry` and `pipx` anymore, `uv` does all of that for you.

  • shellac 10 months ago

    > over pip

    It's a much more complete tool than pip. If you've used poetry, or (in other languages) cargo, bundler, maven, then it's like that (and faster than poetry).

    If you haven't, in addition to installing dependencies it will manage and lock their versions (no requirements.txt, and much more robust), look after the environment (no venv step), hold your hand creating projects, and probably other things.

    Edit to add: the one thing it won't do is replace conda et al, nor is it intended to.

  • atoav 10 months ago

    The problems start as soon as your scripts should run on more than your own computer.

    If you pip install something, you install it on the system python (the python binary located at sys.executable). This can break systems if the wrong combination of dependencies comes together. This is why you should never install things via pip for other people, unless you asked them first.

    Now how else would you install them? There is a thing called virtual environments, which basically allows you to install pip dependencies in such a way that they are only there within the context of the virtual environment. This is what you should do when you distribute python programs.

    Now the problem is how do you ensure that this install to the virtual environment uses specific versions? What happens when one library depends on package A with version 1.0 and another library depends on package A with version 2.0? And what happens if you deploy that to an old Debian with an older Python version? Before uv I had to spend literal days resolving such conflicts.

    uv solves most of these problems in one unified place, is extremely performant, just works and when it does not, it tells you precisely why.
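
    The conflict scenario above can be sketched with a toy check (hypothetical package names; real resolvers such as uv's are vastly more sophisticated, this just illustrates the failure mode):

```python
# Toy illustration of the conflict described above: two libraries
# pinning incompatible exact versions of the same package.

def check_conflicts(requirements):
    """requirements: (library, package, exact_version) triples.
    Returns human-readable descriptions of any version conflicts."""
    pins = {}  # package -> (version, first library that pinned it)
    conflicts = []
    for lib, pkg, version in requirements:
        if pkg in pins and pins[pkg][0] != version:
            conflicts.append(
                f"{lib} wants {pkg}=={version}, but "
                f"{pins[pkg][1]} already pinned {pkg}=={pins[pkg][0]}"
            )
        else:
            pins.setdefault(pkg, (version, lib))
    return conflicts

reqs = [
    ("lib_one", "package_a", "1.0"),
    ("lib_two", "package_a", "2.0"),  # clashes with lib_one's pin
]
result = check_conflicts(reqs)
print(result[0])
```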

  • BiteCode_dev 10 months ago

    The whole explanation is here: https://www.bitecode.dev/p/a-year-of-uv-pros-cons-and-should

    The tl;dr is that it has a lot fewer modes of failure.

  • montebicyclelo 10 months ago

    It brings way more to the table than just being fast, like people are commenting. E.g. it manages Python for your projects, so if you say you want Python 3.12 in your project, and then you do 'uv run python myscript.py', it will fetch and run the version of Python you specified, which pip can't do. It also creates lock files, so you know the exact set of Python package dependencies that worked, while you specify them more loosely. Plus a bunch of other stuff.

  • globular-toast 10 months ago

    The only advantage over pip is it's faster. But the downside is it's not written in Python.

    The real point of uv is to be more than pip, though. It can manage projects, so basically CLI commands to edit your `pyproject.toml`, update a lockfile, and your venv all in one go. Unlike earlier tools it implements a pretty natural workflow on top of existing standards where possible, but for some things there are no standards, the most obvious being lockfiles. Earlier tools used "requirements.txt" for this which was quite lacking. uv's lockfile is cross-platform, although, admittedly does produce noisier diffs than requirements.txt, which is a shame.

    • chippiewill 10 months ago

      As a straight pip replacement, yeah it's mostly just faster. Although it does have a few breaking changes that make it more secure (it has a more predictable way of resolving packages that reduce the risk of package squatting).

  • jonatron 10 months ago

    Faster.

    • maratc 10 months ago

      Ok, and what's the advantage for the people who don't have "my pip is too slow" problem?

      • nicolasp 10 months ago

        Wait times are in the order of tens of milliseconds instead of seconds. That makes a massive difference in how nice uv is to use vs pip.

        • maratc 10 months ago

          That’s just the same “my pip is too slow” problem which some people don’t have.

          I work in a place with 200 developers, and 99% of pip usage is in automated jobs that last an hour. Shaving a couple seconds off that will not provide any tangible benefit. However moving 200 people from a tool they know to one they don’t comes at a rather significant cost.

          • nickjj 9 months ago

            > Shaving a couple seconds off that will not provide any tangible benefit.

            It could be more than that.

            I switched from pip to uv today in a Dockerized project with 45 total dependencies (top level + sub-dependencies included).

            pip takes 38 seconds and uv takes 3 seconds, both uncached. A 10x+ difference is massive and if uv happens to be better suited to run on multiple cores it could be even more because my machine is a quad core i5 3.20ghz from 10 years ago.

            > I work in a place with 200 developers

            In your case, if you have 200 developers spending an hour on builds that could in theory be reduced down to 5 minutes per build. That's 11,000 minutes or 183 hours of dev time saved per 1 build. I know you say it's automated but someone is almost always waiting for something right?

          • dagw 10 months ago

            For what it's worth uv is fully compatible with pip. just replace 'pip --foo bar' with 'uv pip --foo bar'. One project I'm working on is 100% 'classic' pip based with no plans of moving, but I still use uv when working on it as it is completely transparent. Uv manages my venvs and python versions and makes things like switching between different versions of python and libraries much smoother, and I can still use the same pip commands as everybody else, it's just that all my pip commands run faster.

            • notatallshaw 10 months ago

              > For what it's worth uv is fully compatible with pip

              Depends what you mean by "fully": https://docs.astral.sh/uv/pip/compatibility/

              There's a number of places pip and uv diverge:

              * uv makes some design choices that aren't always strictly compatible with the spec

              * uv correctly implements the spec and it turns out pip, or the underlying library, didn't (I have worked on fixing a couple of these on the pip side)

              * uv doesn't support legacy features still in pip

              * Tool specific features or exact output diverge

              This is not a criticism, but I've seen some users get irate with uv because they were under the impression that it was making much stronger compatibility guarantees.

    • prashnts 10 months ago

      I can't stress how fast it is when using it on resource-constrained envs like a Pi Zero.

      I intend to use the system python there, but previously poetry would simply crash the whole Pi while installing itself.

__mharrison__ 10 months ago

I just taught a week long course, Advanced Python for Data Scientists. The first day we discussed how to use uv. The feedback was "this UV content is worth the price of the whole course".

Using uv is an easy sell to anyone who has worked with Python.

Great work Charlie and team.

bsdz 10 months ago

I feel for me, at least one nice thing about poetry over uv is, that if I have an issue or feature extension, I can just write my own plugin in pure Python. With uv, I'd need to learn Rust in addition to python/c/c++/etc.

I wonder what it would take to get poetry on par with uv for those who are already switching to it? Poetry is definitely very slow downloading multiple versions of packages to determine dependencies (not sure how uv works around this?). Does uv have a better dependency checker algorithm?

  • chippiewill 10 months ago

    In this day and age you don't usually have to download the packages to resolve the dependencies as PyPI can usually expose it (unless you need to install from sdist which is less common these days).

    Dependency resolution is slow because it's computationally very expensive. Because uv is written in Rust the resolution is just much, much faster. IIRC they use a Rust implementation of PubGrub, the resolution algorithm originally developed for Dart's pub package manager.
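
    To make "computationally expensive" concrete, resolution is a backtracking search over versions. A toy resolver over a made-up index (nothing like uv's actual PubGrub implementation, just the shape of the problem):

```python
# Tiny hypothetical package index: package -> version -> dependencies,
# where each dependency is (package, set_of_allowed_versions).
INDEX = {
    "app": {"1.0": [("web", {"1.0", "2.0"}), ("db", {"2.0"})]},
    "web": {"1.0": [("db", {"1.0"})],   # old web needs old db
            "2.0": [("db", {"2.0"})]},  # new web needs new db
    "db":  {"1.0": [], "2.0": []},
}

def resolve(todo, chosen):
    """Depth-first search for a version assignment satisfying all constraints."""
    if not todo:
        return chosen
    (pkg, allowed), rest = todo[0], todo[1:]
    if pkg in chosen:  # already picked: must satisfy this constraint too
        return resolve(rest, chosen) if chosen[pkg] in allowed else None
    for version in sorted(allowed & INDEX[pkg].keys(), reverse=True):
        solution = resolve(rest + INDEX[pkg][version], {**chosen, pkg: version})
        if solution is not None:
            return solution
    return None  # dead end: backtrack and try another version upstream

solution = resolve([("app", {"1.0"})], {})
print(solution)  # {'app': '1.0', 'web': '2.0', 'db': '2.0'}
```

    If web 2.0 had been incompatible, the resolver would backtrack and try web 1.0; with many packages and versions this search explodes, which is why a fast implementation matters.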

    • bsdz 10 months ago

      Yes I think I heard pypi started exposing dependency info so it makes sense to use that where possible.

      The dependency resolution computation is an interesting problem. I think poetry at some point switched to mypyc for compilation (although I can't find conclusive evidence for it now). From my experience, mypyc doesn't really improve performance much compared to say writing a c/c++ extension. Perhaps offloading dependency resolution in poetry to a native c library is a way to match uv.

  • wiseowise 10 months ago

    > I wonder what it would take to get poetry on par with uv

    Different laws of physics, to start with.

  • dagw 10 months ago

    > I wonder what it would take to get poetry on par with uv for those who are already switching to it?

    Poetry and uv have quite different philosophies. Poetry is incredibly opinionated about how you should do things, and trying to make poetry fit an existing project or combining poetry with other tools is quite likely to break. uv on the other hand is far more flexible and easier to make work with your current workflow. For me that was the main reason I gave up poetry, and in that aspect poetry will probably never be 'on par' with uv, since these aren't technical differences, but differences of philosophy.

    • raverbashing 10 months ago

      Curious what you found too opinionated about Poetry (not saying it doesn't happen)

      • dagw 10 months ago

        The way it assumes one and exactly one venv per project was a big one. The one that broke me however was trying to get it work with external build systems, specifically trying to compile C++ code as part of the build process. Another important one was that it doesn't play nicely with older code built on pip and pip based workflows. You basically have to start from scratch, while uv makes it much easier to slowly transition.

  • rmholt 10 months ago

    For me personally the killer uv feature is pyenv integration, which poetry doesn't do

    • bsdz 10 months ago

      Interesting - thanks. I use virtual environments and each has its own python version tied to it. Not sure if pyenv is useful to me but who knows perhaps one day. Good to know uv supports pyenv.

Euphorbium 10 months ago

Uv should replace pip for all I care.

  • ThibWebOP 10 months ago

    Well, seems like 100% what’s going to happen (for the majority of Wagtail users at least) if the current trend continues. I’m not sure if that’s a good thing to be frank. But we’ll have to adjust regardless.

    • darkfloo 10 months ago

      As a semi-casual user of python who had to battle w/ dependency management recently, can you elaborate on why that would not be a good thing? I thought about switching our project to uv but could not find the time necessary

      • ThibWebOP 10 months ago

        Sure – and I think it’s certainly proving to be a good thing so far! My concerns are more longer-term. I see two primarily:

        (1) As uv’s governance is driven by a for-profit company, I see incentives that will eventually compromise on its benefits.

        (2) Python packaging has historically been very fragmented, and more recently there’s been lots of work on standardization. That work will be impacted when users massively shift to one package installer.

        Neither of those things are clear negatives, but they’re worth being aware of.

        • chippiewill 10 months ago

          > That work will be impacted when users massively shift to one package installer.

          Charlie Marsh (who founded Astral that develops uv) is very engaged in the standardisation process around Python packaging. The whole idea around uv is to be something that follows the standards as much as possible. uv has been much more aggressive about conforming than the other package managers.

          • ThibWebOP 10 months ago

            yep, I really appreciate their current efforts, but still think it’s a point of concern. Feels risky to have so much of an ecosystem resting on so few people (bus factor, governance, etc). Hopefully with Astral being a for-profit business they’ll find ways for their work to be more sustainable than other package managers’ maintainers.

      • jonatron 10 months ago

        You may be overestimating the amount of time it takes to switch to uv.

        • blooalien 10 months ago

          Took me all of about 10 seconds after I decided to switch from Poetry and PipX. Been just learning it bit by bit as I go along and been really pleased with it thus far.

  • freeamz 10 months ago
    • cibyr 10 months ago

      What about it? RustPython is an alternative interpreter; it's not in the same category of thing as pip or uv.

emblaegh 10 months ago

Man, I’m so jealous of the insane praise that uv (and most other Astral tools) gets. I don’t think I’ve ever seen anything so unanimously lauded around here.

  • SilverSlash 10 months ago

    Guess people here don't talk much about cargo. I wouldn't be surprised to learn that cargo inspired uv. Rust with cargo showed for the first time that tooling _can_ be good, even for systems programming languages.

  • mvATM99 10 months ago

    With good reason honestly. They take all the best practices from existing tooling we had, discard the bad, and make it run blazingly fast.

    Ruff for me meant I could turn 4 pre-commit hooks (which you have to configure to be compatible with each other too) into just 1, and I no longer dread the "run Pylint and take a coffee break" moment.
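
    For reference, collapsing those hooks into Ruff looks roughly like this in .pre-commit-config.yaml (the rev shown is illustrative; pin it to a current release):

```yaml
repos:
  - repo: https://github.com/astral-sh/ruff-pre-commit
    rev: v0.6.9           # illustrative; pin to a current ruff-pre-commit tag
    hooks:
      - id: ruff          # linting (covers flake8/isort/pyupgrade-style rules)
        args: [--fix]
      - id: ruff-format   # formatting (replaces black)
```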

    I jumped ship to uv recently. Though I was skeptical at first, I don't regret it. It makes dependency management less of a chore, and just something I can quickly do now. Switching from Poetry was easy for me too; the only package I had issues with was pytorch, but that just required some different toml syntax.

  • weberer 10 months ago

    Like with any social media site, you also have to consider the possibility that not all comments are 100% organic.

    • emblaegh 10 months ago

      I’ve seen fishy-looking engagement on HN before, but I’m inclined to think uv’s praise is genuine. It reflects the collective relief of seeing an extremely long and painful journey finally come to an end (hopefully).

    • new_user_final 10 months ago

      I am giving you an organic comment. uv works just like magic. Python tooling is a mess; uv is here to save you.

  • orthoxerox 10 months ago

    Tailscale?

    • emblaegh 10 months ago

      It’s not really on my radar, but I’d be curious to know what other pieces of software get similar respect from their communities.

  • DangitBobby 10 months ago

    Second only to SQLite. HN loves SQLite.

zoobab 10 months ago

Plenty of packages still fail trying to spawn cmake, gcc et al.

UV does not solve all the hard problems.

Maybe switch to Pixi?

  • quleap 10 months ago

    I'm extremely satisfied with Pixi. It fixes almost all the issues I had with conda and mamba. It supports both conda and pypi (via uv) packages. I don't know if uv fixes pip's dependency management hell. I settled on conda packages because pip was such a mess.

  • Vaslo 10 months ago

    Switch to who?

    • KingMob 10 months ago

      Google's not that broken yet: https://pixi.sh/latest/

      • sureglymop 10 months ago

        That (main) page doesn't mention python once, so personally I was immediately wondering if this is an alternative to tools like uv or more generally tools like mise and asdf. It really isn't that clear so could you try to elaborate a bit?

        • tasuki 10 months ago

          Hah, your comment persuaded me to look at that page, and honestly I also can't tell what it even is. I think it's supposed to be an alternative to `mise` and `asdf`, but it mostly mentions various Python tools? And doesn't seem to have an overview of what's available to install through Pixi?

          Then I clicked a link and got to https://prefix.dev/ ...

          > pixi is a fast software package manager built on top of the existing conda ecosystem. Spins up development environments quickly on Windows, macOS and Linux.

          Oh, built on top of conda. I am so going to stay the hell as far away from that as possible!

        • KingMob 10 months ago

          Google works for you too, friend.

          • sureglymop 10 months ago

            In fact, I elaborated in my comment that not even the homepage of the project itself makes it clear. Why would google, an unrelated website make it any clearer...

            • KingMob 10 months ago

              The broader point you're missing is, HN commenters are not your personal AI summarizers.

              You're literally on the Pixi website, and you know its name if you want to look for forums, feed it into Perplexity, etc.

DHolzer 10 months ago

I recently checked out UV, and it's impressively fast. However, one challenge that keeps coming up is handling anything related to CUDA and Torch.

Last week, I started developing directly in PyTorch containers using just pip and Docker. With GPU forwarding on Windows no longer being such a hassle, I'm really enjoying the setup. Still, I can’t shake the feeling that I might be overlooking something critical.

I’d love to hear what the HN crowd thinks about this type of env.

  • fluidcruft 10 months ago

    I assume you've seen this:

    https://docs.astral.sh/uv/guides/integration/pytorch/

    If the platform (OS) solution works for you, that's probably the easiest. It doesn't for me because I work on multiple Linux boxes with differing GPUs/CUDAs. So I use the optional-dependencies solution, and it's mostly workable, but with an annoyance: uv sync forgets the --extras that have been applied in the venv, so if you "uv add" something it will uninstall the installed torch and install the wrong one until I re-run uv sync with the correct --extra again (uv add with --extra does something different). And honestly I appreciate not having hidden venv states, but it is a bit grating.
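
    For reference, the optional-dependencies approach from that guide looks roughly like this (CUDA version and index names are illustrative; see the linked docs for the exact current syntax):

```toml
[project.optional-dependencies]
cpu = ["torch>=2.3"]
cu124 = ["torch>=2.3"]

[tool.uv]
# the two extras are mutually exclusive
conflicts = [[{ extra = "cpu" }, { extra = "cu124" }]]

[tool.uv.sources]
torch = [
  { index = "pytorch-cpu", extra = "cpu" },
  { index = "pytorch-cu124", extra = "cu124" },
]

[[tool.uv.index]]
name = "pytorch-cpu"
url = "https://download.pytorch.org/whl/cpu"
explicit = true

[[tool.uv.index]]
name = "pytorch-cu124"
url = "https://download.pytorch.org/whl/cu124"
explicit = true
```

    Each machine then runs e.g. `uv sync --extra cu124` for its GPU flavor, which is exactly the step that has to be remembered after every `uv add`.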

    There are some ways to setup machine/user specific overrides with machine and user uv.toml configuration files.

    https://docs.astral.sh/uv/configuration/files/

    That feels like it might help, but I haven't figured out how to get it to pick/hint the correct torch flavor for each machine. Similar issues with paddlepaddle.

    Honestly I just want an extras.lock at this point but that feels like too much of a hack for uv maintainers to support.

    I have been pondering whether nesting uv projects might help so that I don't actually build venvs of the main code directly and the wrapper depends specifically on certain extras of the wrapped projects. But I haven't looked into this yet. I'll try that after giving up on uv.toml attempts.

  • jerrygenser 10 months ago

    I've used uv with pytorch and cuda fine. What problem have you had?

    I also use it in docker to build the container.

  • mbeex 10 months ago

    At least my kind of problems were solved by

    https://docs.astral.sh/uv/guides/integration/pytorch/#instal...

sireat 10 months ago

Coming from 10+ years of pip and also heavy venv user - uv seems pretty good.

First impressions of uv are quite nice, but how does one change Python versions once you have a project up?

I installed 3.13 with `uv python install python3.13`

I see a bunch of Python versions now with `uv python list` (uv even found my old Anaconda 3.9 install from way back)

But how would I switch to 3.13?

LLM hallucinates with `uv venv use 3.13` but that command does not work.

I see from https://docs.astral.sh/uv/concepts/projects/config/#python-v... that one can specify the version in pyproject.toml, but should not there be a command line to switch?

est 10 months ago

if an app or tool has a huge speed advantage, then I'd choose it no matter what.

  • XorNot 10 months ago

    It's the integrated python version management which sold me.

    Took a huge chunk of complexity out of bootstrapping projects which I was otherwise handling myself.

    • mijoharas 10 months ago

      This is the one thing that took a while for me to get to grips with. I tend to use `asdf` for my python version management, which I want to continue to use since I use it for most other languages.

      It'd be nice if we could have a shim for `uv` that allows it to play nice with `asdf` (there is an `asdf-uv` but seems to be about installing different versions of `uv` when instead I want to install different versions of python _via_ uv).

      • sureglymop 10 months ago

        The tool mise-en-place seems to have this. It is a replacement for asdf that I think can be used as a drop in replacement (compatible CLI interface).

        • jdxcode 10 months ago

          Yes, you can sync python versions bidirectionally between uv and mise. I intend to make further, deeper integrations with uv directly into mise core, in order to offload python complexity out of mise and into uv where possible.

          • sureglymop 10 months ago

            Thank you for creating mise! My only gripe with it is that it can do so much that I keep having to revisit certain parts of the docs :) but it is a good thing, it's nice if a tool takes care of many related little annoyances. I made a cheat sheet for myself and that helps.

      • adammarples 10 months ago

        I can't see any reason really to keep using asdf for python when a better alternative now exists, unless you just don't want to learn the new syntax?

        • marmarama 10 months ago

          If all you do is write Python then sure, but for the rest of us that have to run code written in 7 different languages within our project, written by 7 different teams, playing nicely with asdf is non-negotiable.

          I've had it with version managers that only target a single language or tool, the cognitive load is too high if there's more than a couple of languages in the mix.

          What would be really nice is an asdf-like single package manager with language-specific plugins. That would save me a bunch of headaches.

globular-toast 10 months ago

Been using Python for 20 years and tried just about every tool related to packaging over the years. The only ones that worked well (IMO) were pip, pip-tools and venv. uv finally replaces all of them.

But being written in Rust means I'm having to also finally get somewhat proficient in Rust. Can any Rust developers comment on the quality of the uv codebase? I find it surprisingly procedural in style, with stuff like hundreds of `if dry_run` type things scattered throughout the codebase, for example. Is this normal Rust style?

brokegrammer 10 months ago

I switched from Poetry to uv last year. I like the speed and how it stores virtual envs in a .venv directory along with the project by default, whereas Poetry stores it in a separate directory in your home directory by default, which makes it hard to work with tools that only discover virtual envs in the project root.

uv tool is also a great replacement for pipx.

I think it's the way to go for Python dependency management in 2025.

aosaigh 10 months ago

I’m still using pip, what am I missing?

  • drexlspivey 10 months ago

    1) Deterministic environments 2) not having to manage python installations 3) 100x speed boost

  • OtherShrezzing 10 months ago

    Poetry and uv both offer better dependency management, and project isolation out of the box. If you work in a team, or on more than one python project, then it's worth spending a day to install & adopt either one of these systems.

    • KORraN 10 months ago

      Both dependency management and project isolation are available in a standard Python 3 installation, also out of the box, without any 3rd party tool: `pip` and `python3 -m venv`. Admittedly, they work slower, but fast enough for me, especially since I don't run these commands every hour.
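      The stdlib-only workflow described here looks roughly like this (package name is just an example, and installing anything still needs network access):

        # Stdlib-only isolation: nothing here is a third-party tool.
        python3 -m venv .venv                      # create the environment
        . .venv/bin/activate                       # activate it (POSIX shells)
        python -m pip install requests             # installs into .venv only
        python -c 'import sys; print(sys.prefix)'  # prints a path inside .venv

      The trade-off versus uv is exactly the one stated: slower resolution and no lockfile, but zero extra tooling.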

      • oblio 9 months ago

        Only if you need a single Python installation and you never install any tools written in Python.

  • stavros 10 months ago
jimmydoe 10 months ago

I don't understand the chart. Does it say Wagtail suddenly had a lot more uv traffic, while pip and Poetry did not drop much? What does that mean? A new batch of users arriving on uv? Or did the behavior of a new uv version disrupt the chart?

zachwill 10 months ago

I’ve been primarily a Python developer since 2012 and recently switched to uv. The ability to manage dependencies, venv, and multiple Python versions makes it best-in-class now. It really is a fantastic tool.

yu3zhou4 10 months ago

Well deserved, I wish more cloud providers had it preinstalled

rob 10 months ago

uv, ruff.... Astral doesn't miss. Excited to see what else they can bring to the Python world.

Had an issue running something on the latest Python version installed (3.13) but only needed to 'downgrade' to 3.11 to run that particular script. Just a simple:

`uv run --python 3.11 --with openai-whisper transcribe.py`

And no need to mess with anything else.

bsdice 10 months ago

The biggest issue I have is not Python's dependency hell with its unversioned libraries, but supply chain attacks. Also regressions introduced by new versions all the time.

That is why for projects I resolve everything by hand, add all coarsely audited 3rd party libraries to ./lib/, and the main entry file then does this:

    #!/usr/bin/env -S /bin/sh -c "_top_dir=\"\$(dirname \"\$(realpath -s \"\$0\")\")\"; cd \"\$_top_dir\"; exec \"\$_top_dir/python/install/bin/python3\" -W once::DeprecationWarning -X dev \"\$0\" \"\$@\""

    import os
    import sys

    # Insert ./lib/ in front of search path
    sys.path.insert(0, os.path.join(os.path.dirname(__file__), "lib"))

    ...

I like the excellent standalone CPython by indygreg, now under astral-sh's GitHub organization. Unpack it as-is into ./python/ and done. Arch Linux, by contrast, would just roll forward to whatever version is latest, introducing new warnings every month or two and havoc on any new major version.

Project is fully portable, copy anywhere that has glibc, run.
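The ./lib/ trick above works because sys.path is scanned in order, so a prepended directory shadows anything installed system-wide. A tiny self-contained sketch of that mechanism (the module name is made up):

    import os
    import sys
    import tempfile

    # Create a throwaway "lib" directory holding a vendored module.
    lib_dir = tempfile.mkdtemp()
    with open(os.path.join(lib_dir, "vendored_demo.py"), "w") as f:
        f.write("SOURCE = 'local lib'\n")

    # Prepending means this copy wins over any same-named module
    # elsewhere on the search path.
    sys.path.insert(0, lib_dir)
    import vendored_demo

    print(vendored_demo.SOURCE)  # prints: local lib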

toenail 10 months ago

`path/to/venv/bin/uv pip install foo` installs to the global venv, not `path/to/venv`. Sadly that's a deal breaker for me; everything else looked great.

rubenvanwyk 10 months ago

May uv please keep on eating the python world. It's so good.

user9999999999 10 months ago

is it me or is there a new python dependency manager every year?

  • nikisweeting 10 months ago

    nah, other than uv it's just poetry, pdm, and pipenv over the last decade, and uv is so dominant I don't think anyone else will try making another one for a while

  • jessekv 10 months ago

    Things were bad and really needed fixing. Several attempts were made.

  • calmoo 10 months ago

    uv is definitely the final one, it's 100x better.

technopol 10 months ago

uv sounds great! For those still using Python v2, how well does it work? pip used to be a pain when having to manage both Python v2 and v3 projects and tools.

  • rglullis 10 months ago

    If you are still on 2.7, packaging is the least of your problems

  • stavros 10 months ago

    Unfortunately, I don't think many things nowadays are tested with a 15-year-old version of a language.

    I was one of the last holdouts, preferring to keep 2.7 support if it wasn't too much hassle, but we have to move on at some point. Fifteen years is long enough support.

Tewboo 10 months ago

It's interesting to see uv downloads surpassing Poetry for Wagtail users. Could be a sign of a growing preference for speed and simplicity.

karel-3d 10 months ago

I don't understand a word of the headline, I guess I am not the intended audience.
