I think it's quite problematic on #Linux to start off with regular ol' #Python that came with the system (i.e. /usr/bin/python), then install some packages (e.g. from the #AUR if you're on #ArchLinux) that pull in Python libraries against it, and then start using something like #Conda or #Miniconda, whereby subsequent package installations or updates may install those libraries into the Conda environment (i.e. ~/miniconda3/python) and not the system one, so there's some overlap between the two. I'm wondering what the best way forward is from this point, especially since, as of some time ago, it's no longer possible to do a raw pip install <package> anymore for whatever reason.
Unus Nemo
in reply to Mika
in reply to Mika • •@Mika Your best bet is to use python3 -m venv. To setup a default virtual environment that you use on your system for scripting your system with:
Invoking
... show morecd
by itself with no arguments will place you in your home directory. The --prompt argument can be whatever you want displayed on your shell prompt to remind you what venv you are currently using. The .default argument is the directory where your venv will be created. The --system-site-packages argument makes your system installed packages available in the venv. The --upgrade-deps argument is optional and is only required the first time you create a venv. It is used to upgrade pip and the main dependencies in the venv when creating it. If you do not use this option you will be notified that pip is out of date and that you should upgrade it which you could do at that time. Using thi@Mika Your best bet is to use python3 -m venv. To setup a default virtual environment that you use on your system for scripting your system with:
Invoking
cd
Invoking cd by itself with no arguments will place you in your home directory. The --prompt argument can be whatever you want displayed on your shell prompt to remind you which venv you are currently using. The .default argument is the directory where the venv will be created. The --system-site-packages argument makes your system-installed packages available inside the venv. The --upgrade-deps argument is optional and only needed the first time you create a venv: it upgrades pip and the core dependencies as the venv is created. If you skip it, you will be told that pip is out of date and should be upgraded, which you could do at that point; using the argument just saves a step.
Then update your .bashrc or .bash_profile to activate this venv when you log in. When working on a project, create a dedicated venv for that project along with a requirements.txt file. The venv itself should not be included in your git repo (add an ignore entry for it).
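For the login part, a couple of lines like these at the end of your .bashrc would do it (a minimal sketch; it assumes the venv lives in ~/.default as created above):

# Activate the default venv on login, if it exists
if [ -f "$HOME/.default/bin/activate" ]; then
    . "$HOME/.default/bin/activate"
fi

A per-project setup might look like this (also a sketch; the .venv directory name and the "myproject" prompt are just examples):

# Create and activate a project-local venv, then install its pinned deps
python3 -m venv --prompt myproject .venv
. .venv/bin/activate
pip install -r requirements.txt
echo '.venv/' >> .gitignore   # keep the venv out of the repo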
When the system's Python is upgraded, you will need to run:
python -m venv --upgrade --prompt Default .default
Using this approach you will be able to use both system Python libraries and pip with no conflict. It also avoids pulling in a lot of math and science libraries that you may not actually require. I typically use a Conda env in specific projects where I need the science and math libraries; I do not need them for system scripts, so I keep them out of my default venv.
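For one of those projects, the setup might look something like this (a sketch; the environment name and package list are only examples):

conda create -n sci-project numpy scipy
conda activate sci-project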
For more information on python -m venv: Python Venv
Unus Nemo
in reply to Mika
@Mika What this really does is allow you to install either from your OS's repo or via pip without complications. The caveat is that this has to be done on a per-user basis: what you install via pip will not be installed for other users of the system (if you have multiple users), though some would see that as a plus. Install everything that should be available to every user, including system users such as apache (though a Django, or other Python framework, project should have its own venv), via your OS's repo, and install the things only you require via pip.
Sometimes you want to install a utility you see on GitHub, and it says to use pip, but when you try to install it globally, pip informs you that this would break your system and refuses. This setup works around that issue: it keeps all of your pip installs in your venv rather than system-wide, so nothing can break the system-wide installation by requiring a different version of a dependency.
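To make the contrast concrete (a sketch; some-utility stands in for whatever the GitHub page tells you to pip install):

# On a recent distro, outside any venv, pip refuses with an
# "externally-managed-environment" error:
pip install some-utility

# With the default venv active, the same command installs into ~/.default:
. "$HOME/.default/bin/activate"
pip install some-utility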
As a rule of thumb, if I can install it via my system's repo, I do. If it is only available via pip, as is the case for a lot of modules that never get offered in a distro's repo, then I use pip.