Safely managing system/local/project Python packages?
The context
I'm currently running a Gentoo desktop and want to move towards doing proper Python development on my system.
I tried to install some packages, namely docker-compose, through Gentoo's emerge system. Because docker-compose depends on an older version of the requests module, my system updates became hairy and difficult to resolve.
I then switched to pip and followed a recommendation to use the --user flag for pip installs to avoid conflicts with system-wide tools and libraries. I installed docker-compose at the user level and all was well until I added Pipenv to the mix for managing dependencies for individual projects. Pipenv doesn't like the --user flag: when it is set in pip.conf, pip will refuse to install dependencies into Pipenv's virtualenv.
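For reference, here is roughly the pip.conf setting in question (the path and exact contents are a sketch of a typical Linux setup, not a verbatim copy of my file):

```ini
; ~/.config/pip/pip.conf
; Makes every "pip install" behave as "pip install --user".
; Inside a virtualenv, pip then refuses to install, because user
; site-packages are not visible there -- which is what breaks Pipenv.
[install]
user = true
```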
Getting to the point...
I would like a way to isolate the different levels of Python packages I need on my system: leave the system dependencies in place as set by Gentoo's emerge tool, have a place to install tools and libraries that my primary user should have access to, and finally have somewhere to store libraries needed only for a specific project. As my machine has only a single user, I'm not worried about sharing the "user" dependencies between users, but I do want to be able to access them from any directory.
I have looked into virtualenvs, but they seem better suited to the "per-project" level than the "user" level. From what I can tell, only one virtualenv can be active at a time, so if I install a tool like docker-compose in an always-active virtualenv in my home directory and then switch to a new virtualenv for a project, I lose access to docker-compose and everything else installed in the "user"-level virtualenv.
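To illustrate the "one active at a time" point, here is a small standard-library sketch for checking which environment a given interpreter is running in: inside a virtualenv, sys.prefix diverges from sys.base_prefix (the function name active_env is my own, not a standard API).

```python
import sys

def active_env():
    """Report whether this interpreter is a virtualenv or a base install.

    In a virtualenv, sys.prefix points at the environment directory,
    while sys.base_prefix still points at the interpreter the
    environment was created from.
    """
    base = getattr(sys, "base_prefix", sys.prefix)
    if sys.prefix != base:
        return "virtualenv: %s" % sys.prefix
    return "base interpreter: %s" % sys.prefix

print(active_env())
```

Running this from my home directory versus from inside an activated project virtualenv shows exactly the switch I'm describing: the project environment replaces, rather than stacks on top of, whatever was active before.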
In short, what options do I have for managing different "levels" of Python dependencies? What are the best practices for ensuring that I don't tread on my system dependencies from the user level, or on my user dependencies from the project level? How do you safely isolate these levels, if you do it at all?
This is my first Stack Exchange post, so please let me know if I can clarify or improve this question. I have tried to do research beforehand, but feel as though I have come up short.
python gentoo virtualenv
Something like chroot, namespace containers (systemd-nspawn) and/or virtual machines (KVM/qemu) could work, depending on the level of separation you need from the system OS.
– Mioriin
Aug 10 at 5:02
edited Aug 8 at 3:29
slm♦ 238k65491662
asked Aug 7 at 21:17
Matt Mazzanti 111