Python -m pip vs pipx install vs conda install & python environments

Previously we had a conversation about always using python -m pip to call pip. My takeaway from that conversation was that python -m pip ensures you are using the version of pip associated with your current, active environment. But there is also pipx, which could be used to install some tools like black, pytest, sphinx??, etc. “globally”.
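As a quick illustration of why that matters (just a sketch, assuming python3 is on your PATH): python -m pip runs the pip module that belongs to that specific interpreter, and you can ask pip to report which environment it lives in:

```shell
# Ask pip to identify itself through a specific interpreter. The
# "from /path/..." it prints points inside that interpreter's environment,
# which is why "python -m pip install" always targets the active env.
python3 -m pip --version
```

Running the same line inside an activated venv or conda env shows the reported path flip to that environment's site-packages.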

My brain is struggling a bit with how pipx relates (or doesn’t) to calling python -m pip install, and also with how pipx would interact with a conda environment (or not).

If my question isn’t clear, please let me know. It’s likely because pipx is straining my mental model of how pip / venv environments work versus conda environments. conda is of course NOT the same as pip, but is pipx related to pip?

Yes, my brain is exploding a bit here.

Here’s my understanding:

  1. pipx: Used if you are installing one package/application (similar to brew on macOS): pipx install PACKAGE (do not need python -m)
  2. Global installations: In general, I would discourage that practice. It muddies the water when determining if something is or is not a requirement for a virtual environment or conda environment. Personally, I add these libraries (pytest, sphinx, black, etc.) as needed to my requirements.txt or environment.yml files.
  3. python -m pip install ... is a reasonable practice though often unnecessary when inside a virtual environment. My typical workflow when using venv and pip:
python -m venv name_of_environment  # I typically use .venv as the name
source name_of_environment/bin/activate
pip install -U pip  # upgrade pip (you could also use python -m pip install -U pip)
pip install -r requirements.txt  # install my dependency packages or install pyproject.toml listed dependencies
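If you go the pyproject.toml route mentioned in that last step, those dev tools can be declared as an extras group instead of being installed globally — a minimal sketch (the package name, version, and dependency pins here are hypothetical):

```toml
# pyproject.toml (fragment)
[project]
name = "my-package"            # hypothetical name
version = "0.1.0"
dependencies = ["requests"]    # runtime dependencies only

[project.optional-dependencies]
dev = ["pytest", "black", "sphinx"]
```

With this in place, pip install -e ".[dev]" pulls the runtime dependencies plus the dev tools into whichever environment is active.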

This is super helpful, Carol. And I was also going to ask whether venv is the preferred environment tool, given it’s built into Python.

I am a long-time conda user.

So to follow up, I think you are suggesting that we can keep pipx out of the equation for our tutorials.

I was planning on introducing both conda and venv as environment options.

So my next question is -

If someone is a conda user (I use conda for everything!), how would that workflow fit into a packaging build if you use a requirements.txt file OR a pyproject.toml file calling extras like

python -m pip install .[dev]
or
python -m pip install -r requirements.txt

Would you also provide a conda environment.yml file for a user to install a package-name-dev environment, BUT then pip install packages?

So the workflow would look like this for a conda user:

conda env create -f environment.yml
conda activate env_dev
pip install -U pip
pip install ".[dev]"  # extras from pyproject.toml (quoted so zsh doesn't glob), or you could use requirements.txt
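For reference, the environment.yml in that workflow might look something like this — just a sketch (the name, channel, and Python pin are hypothetical), with conda providing the interpreter and pip handling the rest:

```yaml
# environment.yml (hypothetical dev environment)
name: env_dev
channels:
  - conda-forge
dependencies:
  - python=3.11   # conda manages the interpreter
  - pip           # ensure pip belongs to this env
```

After conda activate env_dev, the pip install ".[dev]" step then installs the package and its extras into this conda env’s site-packages.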

I hope my question makes sense. I’m trying to sort out what a conda user would do vs a venv user, as mixing pip and conda as I’m doing above also feels a bit dangerous.

+1 to all that Carol said. I’d like to add that, for conda users, a similar approach I use is:

conda create --name MYENV --file requirements.txt --file requirements-dev.txt some_non_python_package
conda activate MYENV
pip install -e . --no-deps --force-reinstall

We don’t need the pip upgrade part b/c a fresh env will always pull the latest one. And if the extra development packages and/or the dependencies aren’t listed in a requirements text file, one can let pip do all the work in the last line by removing --no-deps and adding any extras using the standard Python syntax like you did above. The only reasons to use conda in that scenario, IMO, would be:

  1. Creating an environment that requires non-Python packages to build
  2. Managing different Python versions a bit more easily

A good example of that is the build_latest.yml vs miniconda.yml. Both are tested here for the sake of coverage, but if you only have enough resources to maintain one option, which one would it be? IMO the conda one, b/c it is much easier to specify the dependencies and build the environment without having to think: which Linux distro am I on? Brew or system Python? Etc.

However, note that a specific copy-n-paste incantation that covers all cases is not possible. It depends on how the project is structured, whether there are non-Python dependencies, etc. Hopefully pyproject.toml with external dependency declaration will save us from this one day.


Thank you @ocefpaf!!

So in this case -

  1. Mixing pip and conda is OK.

AND

  2. You’re using --force-reinstall - is this just in case the user already did a pip install . or a pip install packagename? So it makes sure it’s installed in editable mode even if it’s already installed?

Do I have that correct?

Right now I’m working on a pure Python package tutorial, so we don’t need to worry about too much crazy project structure (I think??). BUT I wanted to show users both a venv and a conda example so they can pick. Some may be like me, always using conda; others may be venv users since it comes out of the box with Python!

Those examples are helpful - many thanks!

If you are using conda just to install Python itself and the non-Python deps that you need in the build, then yes, mixing conda and pip is fine. There are other scenarios where it is OK too, but then things can get more complicated.

And yes, the force reinstall is b/c I have existing envs locally and it saves me some trouble. It is not needed in a tutorial or in CIs. I copied and pasted that without thinking.


From my perspective, I would say that “yes,” venv is preferred, though there are folks who still choose to use virtualenv - I would venture more so in web dev than in science.

So to follow up, I think you are suggesting that we can keep pipx out of the equation for our tutorials.

I think it is fine to mention. However, I view pipx more as an installation tool than a packaging tool. Personally, I find it most useful for installing a single pure-Python application.
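To demystify pipx a bit: conceptually, it just automates “one dedicated venv per application, with the app’s entry point linked onto your PATH”. A rough sketch of those moving parts (assuming python3 is on your PATH; the directory name is made up, and the actual install line is left as a comment since it needs network access):

```shell
# Roughly what "pipx install black" does behind the scenes:

# 1. Create an isolated venv dedicated to that single application.
python3 -m venv pipx-demo-venv

# 2. Install the app into that venv, e.g.:
#    pipx-demo-venv/bin/python -m pip install black

# 3. Symlink the app's console script (pipx-demo-venv/bin/black) into a
#    directory on PATH such as ~/.local/bin, so the command works "globally"
#    without touching any project environment.

# The per-app venv has its own interpreter and site-packages:
pipx-demo-venv/bin/python --version
```

This is also why pipx is orthogonal to conda: the app’s venv is self-contained, so it neither reads from nor writes to whatever conda environment happens to be active.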

If someone is a conda user (I use conda for everything!), how would that workflow fit into a packaging build if you use a requirements.txt file OR a pyproject.toml file calling extras like

python -m pip install .[dev]
or
python -m pip install -r requirements.txt

Would you also provide a conda environment.yml file for a user to install a package-name-dev environment, BUT then pip install packages?

My personal workflow would be to list all the packages that I wish to install in the environment.yml file, with the pip-installable ones listed under the pip: section of that same environment.yml file.

If I were doing iterative development, one could pip install or conda install additional packages into a conda environment. Personally, I deactivate my conda environment, edit the environment.yml, and then recreate and activate the revised conda environment. (I’m a creature of habit, and lean toward “no surprises” in my workflows.)


OK, thank you @willingc @ocefpaf - I really appreciate this. I’m trying to think about the lightest-weight packaging workflow solution, and I think it would be good to accommodate both conda and venv solutions (that seems like the way to go!).