I think it's quite problematic on #Linux to start off with the regular ol' #Python that came with the system (i.e. /usr/bin/python), then install some packages (e.g. from the #AUR if you're on #ArchLinux) that pull in Python libraries through it, and then start using something like #Conda or #Miniconda, whereby subsequent package installations or updates may be installing those libraries into the Conda environment (i.e. ~/miniconda3/python) rather than the system one, so the two sets of libraries overlap. I'm wondering what's the best way of moving forward from this point, especially since, some time ago, it stopped being possible to do a raw pip install <package> for whatever reason.
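One way to see the overlap concretely is to check which interpreter each command actually resolves to and which environment it belongs to (the paths in the comments are typical examples, not guaranteed on any given system):

```shell
# Which python3 is first on PATH right now?
command -v python3                           # e.g. /usr/bin/python3 or ~/miniconda3/bin/python3
# Which environment does it belong to?
python3 -c 'import sys; print(sys.prefix)'   # root directory of the active environment
# Which site-packages would pip install into (if pip is available)?
python3 -m pip --version                     # prints pip's version and its site-packages path
```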
in reply to Mika

You can manage all Python dependencies with uv in a reproducible, project-specific way. Then there's no need to use Conda or the AUR.

in reply to Unus Nemo

Wow, this is incredibly detailed and has taught me a lot. I was of course aware of venv and do use it on a per-project basis, but I wasn't aware that it could be used this way. I guess, as long as whatever's been installed on the system (Python-library-wise) is also found in the (new) environment, all should be good and I could remove all the Python/pip libraries installed on the system?
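One way to sketch that idea: the stdlib venv module has a real --system-site-packages flag, which lets a new environment see the system's installed libraries while keeping its own installs separate (the path ~/envs/main below is just an example):

```shell
# A venv that can still see the system's site-packages.
python3 -m venv --system-site-packages ~/envs/main
# Packages installed via this environment's pip go into the venv,
# while system-installed libraries remain importable from it.
~/envs/main/bin/python -c 'import sys; print(sys.prefix)'   # points at ~/envs/main
```

That can serve as a bridge: once everything you need is installed inside the environment itself, the system-level pip packages can be retired.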