Linux – How to make Python modules available to all users

Tags: linux, pip, python

I work in a multi-user setting and am relatively new to Python. The machines in question run Ubuntu 16.04, and we are using Python 2.7. I have personally installed several additional modules, such as tensorflow, keras, and some other related modules, and I believe I used pip (pip 18.0 from /usr/local/lib/python2.7/) to install them (sudo pip install X).

I've been successfully running Python scripts for months using all of these modules, but another user has been unable to run any Python code that uses any of the modules I installed. Even a one-line script that does nothing but import a module, such as:

import tensorflow

fails to run, generating an ImportError (I'll only paste the last couple of lines):

  File "/usr/local/lib/python2.7/dist-packages/tensorflow/python/ops/variable_scope.py", line 24, in <module>
    import enum  # pylint: disable=g-bad-import-order
ImportError: No module named enum

What did I do wrong that's making these modules invisible to other users, and how can I fix it?
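For reference, here is the minimal check I've been using to compare our environments (assuming both accounts can start the interpreter):

```python
import sys

# Print every directory this interpreter searches for modules.
# Run this as each user and diff the output: a module is only
# visible to a user if the directory it was installed into
# appears in that user's sys.path.
for path in sys.path:
    print(path)
```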

Best Answer

I strongly recommend avoiding pip in a production context. Modules installed that way aren't updated during system updates, which may lead to vulnerabilities that never get patched.

Nevertheless, by default pip installs the module only under the calling user's $HOME. To make pip install "system-wide", use the --system switch.
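A minimal sketch of the difference, assuming a Debian/Ubuntu build of pip (`somepackage` is a placeholder name):

```shell
# Default on Debian/Ubuntu pip: installs under the calling user's
# ~/.local/lib/python2.7/site-packages, invisible to other accounts.
pip install somepackage

# Force a system-wide install into /usr/local/lib/python2.7/dist-packages,
# importable by every user. Note --system is Debian-specific (see the
# edit below), not a standard pip option.
sudo pip install --system somepackage
```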

For more details, see the output of pip install --help.
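One way to check where a package actually landed is `pip show`; pip itself is used as the example here only because it is always installed:

```shell
# The "Location:" line reports the directory the package was installed
# into. Under /usr/local/lib/.../dist-packages it is system-wide;
# under ~/.local it is visible to a single user only.
pip show pip | grep Location
```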

EDIT: --system appears to be a Debian-specific option:

binarym@avalon:/tmp/python-pip-18.1/debian/patches$ grep -- '--system' *
set_user_default.patch:+:ref:`--system <install_--system>` option to ``pip install``.
set_user_default.patch:+            '--system',

Thanks @co2f2e for your comment.
