Pipx – Install and Run Python Applications in Isolated Environments
pipxproject.github.io

I wonder why I need all these tools like Pipenv, poetry, pyenv and whatever other abstraction there is, when I can use Docker to perfectly isolate all dependencies, including OS packages?
Many of those tools have important functions other than dependency isolation. Pyenv is perhaps the only tool mentioned that could be replaced by Docker, and even then pyenv has a marginal usability advantage: you can set a Python version or virtualenv to activate automatically in a project folder - no need to start a container.
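For example (a sketch - the version number is a placeholder, and pyenv must already be installed):

```
$ pyenv install 3.8.0
$ cd myproject
$ pyenv local 3.8.0     # writes a .python-version file in the folder
$ python --version      # auto-activates whenever you cd here
Python 3.8.0
```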
Poetry is primarily a packager. Pipx installs Python applications like gita, poetry, youtube-dl etc. as user commands. Neither can be replaced by Docker.
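A sketch of that workflow (assuming pipx itself is already installed):

```
$ pipx install youtube-dl   # gets its own private virtualenv
$ pipx list                 # shows installed apps and their venvs
$ youtube-dl --version      # on your PATH as a normal user command
```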
Finally, I never had a good development experience inside a docker container. Pyenv, pipx and poetry play much nicer with the system development tools and give a better development experience.
If you want to easily read or write files anywhere, or to edit/develop those scripts more easily.
Not everything is a web-accessible service, and sometimes you do care about short startup time - for example, if you have a file conversion script that you need to run on thousands of files.
You can kinda-sorta fake it with "docker exec", but it is really awkward, and at some point pyenv etc. is just easier.
1. Run on 1,000 or 100,000 files: `docker run -it --entrypoint=bash -v "${PWD}":/tmp python:3.7`
You can simply mount your current directory into Docker.
2. Docker doesn't ONLY support web-accessible services:
For example, you can run the aws CLI without a local Python install, with this shell function:
aws() { docker run --rm --entrypoint="aws" infrastructureascode/aws-cli:1.16.309 "$@"; }
3. When you need to debug python code live with your IDE: Check these out https://code.visualstudio.com/docs/containers/debug-common https://www.jetbrains.com/help/pycharm/using-docker-as-a-rem...
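Point 1 can also be sketched as a single batch run instead of an interactive shell - one container for the whole job, with the current directory mounted (convert.py and the .foo inputs are hypothetical):

```
$ docker run --rm -v "${PWD}:/work" -w /work python:3.7 \
    sh -c 'for f in *.foo; do python convert.py "$f"; done'
```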
For 1, this runs a new docker container for each file. That is about 0.3 seconds per launch on my PC - or 5 minutes for 1000 files. And mounting only the current directory is not enough; you may want to "convert input.foo /srv/webroot/output.jpg".
So you need a better script: one which keeps the container running and uses "docker exec", rewrites command lines to absolute paths, relaunches the container as needed to map more directories, and eventually shuts it down. Once you write it, you'll likely end up with something far harder than just setting up pyenv.
2. Yes, I know - see above. That alias will be super annoying: things like "aws s3 cp" would not work, and it would not even see your credentials. So you need to make it longer and more complex, and then it is harder than just a venv install.
3. Neat! That does look workable if your dependencies are very complex. In most cases, however, I'd say venv is still nicer: you can use pydoc3 and graphics, don't have to worry about mapping input/output files, can use related command-line tools, and so on.
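A minimal sketch of the wrapper described in point 1. The container name ("pytools"), the image, and the whole-root mount are assumptions; only the path rewriting runs without Docker.

```shell
#!/bin/sh
# Rewrite relative path arguments to absolute ones, since the container
# resolves paths against its own filesystem. Pure shell - no Docker needed.
abs_args() {
  for a in "$@"; do
    case "$a" in
      -*|/*) printf '%s\n' "$a" ;;             # flags and absolute paths pass through
      *)     printf '%s/%s\n' "$(pwd)" "$a" ;; # relative path -> absolute
    esac
  done
}

# Keep one container alive and funnel every command through "docker exec",
# so you pay the ~0.3 s startup cost once instead of per file.
run_in_container() {
  if [ -z "$(docker ps -q -f name=pytools)" ]; then
    docker run -d --name pytools -v /:/host python:3.7 sleep infinity
  fi
  docker exec -w "/host$(pwd)" pytools "$@"
}
```

Even this sketch skips the hard parts mentioned above: shutting the container down, and translating the rewritten paths onto the /host prefix.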
Does this work with pyenv?
It does work with pyenv, but it can be slightly tricky. I use a setup recommended by Jacob Kaplan-Moss [1] that combines pyenv, pipx and poetry.
Pyenv is used to manage different versions of Python and virtual environments (via pyenv-virtualenv [2]). Pipx itself is accessible only from the Python version or virtual environment it is installed into. However, the programs you install with pipx (like gita or poetry) get their own virtual environments and run reliably regardless of whether pipx is accessible or which environment you have activated with pyenv.
It's easiest to install pipx into the pyenv Python version you have set as global. That ensures you have access to pipx most of the time; otherwise you can spawn a temporary shell with the global Python to reach pipx.
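A sketch of that setup (the version number is a placeholder):

```
$ pyenv global 3.8.0
$ python -m pip install --user pipx
$ pipx install poetry   # poetry lives in its own venv, so it keeps
                        # working even if you later switch pyenv versions
```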
[1] https://jacobian.org/2019/nov/11/python-environment-2020/ [2] https://github.com/pyenv/pyenv-virtualenv
What happens to your pipx installed software when the global Python gets updated? (This is my constant headache with Python tooling.)