
Best practices for Python deployment -- multiple versions, standard install locations, packaging tools etc

Many posts on different aspects of this question but I haven't seen a post that brings it all together.

First a subjective statement: it seems like the simplicity we experience when working with the Python language is shot to pieces when we move outside the interpreter and start grappling with deployment issues. How best to have multiple versions of Python on the same machine? Where should packages be installed? distutils vs. setuptools vs. pip etc. It seems like the Zen of Python is being abused pretty badly when it comes to deployment. I'm feeling eerie echoes of the "DLL hell" experience on Windows.

Do the experts agree on some degree of best practice on these questions?

Do you run multiple versions of Python on the same machine? How do you remain confident that they can co-exist -- and the newer version doesn't break assumptions of other processes that rely on the earlier version (scripts provided by OS vendor, for example)? Is this safe? Does virtualenv suffice?

What are the best choices for locations for different components of the Python environment (including 3rd party packages) on the local file system? Is there a strict or rough correspondence between locations for many different versions of Unixy and Windows OS's that can be relied upon?

And the murkiest corner of the swamp -- what install tools do you use (setuptools, distutils, pip, etc.), and do they play well with your choices re: file locations, Python virtual environments, Python path, etc.?

These sound like hard questions. I'm hopeful the experienced Pythonistas may have defined a canonical approach (or two) to these challenges. Any approach that "hangs together" as a system that can be used with confidence (feeling less like separate, unrelated tools) would be very helpful.


I've found that virtualenv is the only reliable way to configure and maintain multiple environments on the same machine. It even has a way of packaging up an environment and installing it on another machine.
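
If you want to confirm which environment a script is actually running in, here is a minimal sketch. It assumes the environment was created with virtualenv, which records the original interpreter's prefix in sys.real_prefix (the stdlib venv in later Python versions uses sys.base_prefix instead):

    import sys

    # virtualenv replaces the running interpreter's sys.prefix and records the
    # original one in sys.real_prefix; a plain system interpreter lacks that attribute.
    in_virtualenv = hasattr(sys, 'real_prefix')

    print('interpreter: %s' % sys.executable)
    print('prefix:      %s' % sys.prefix)
    if in_virtualenv:
        print('running inside a virtualenv (real prefix: %s)' % sys.real_prefix)
    else:
        print('not running inside a virtualenv')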

For package management I always use pip since it works so nicely with virtualenv. It also makes it easy to install and upgrade packages from a variety of sources, such as git repositories.
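
To see what actually ended up installed in whichever environment is active, a small sketch using pkg_resources (it ships with setuptools/distribute, which pip pulls in anyway):

    import pkg_resources

    # Enumerate every distribution visible to the active interpreter -- the output
    # differs depending on whether this runs inside or outside a virtualenv.
    for dist in sorted(pkg_resources.working_set, key=lambda d: d.project_name.lower()):
        print('%s %s  (%s)' % (dist.project_name, dist.version, dist.location))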


I agree this is quite a broad question, but I’ll try to address its many parts anyway.

About your subjective statement: I don’t see why the simplicity and elegance of Python would imply that packaging and deployment matters should suddenly become simple. Some things related to packaging are simple, others are not, and others could be. It would be best for users if we had one complete, robust and easy packaging system, but it hasn’t turned out that way. distutils was created and then its development paused, setuptools was created and added new solutions and new problems, distribute was forked from setuptools because of social problems, and finally distutils2 was created to make one official, complete library. (More in “Differences between distribute, distutils, setuptools and distutils2?”) The situation is far from ideal for developers and users, but we are working on making it better.

How best to have multiple versions of Python on the same machine? Use your package manager if you’re on a modern OS, use “make altinstall” if you compile from source on UNIX, or use the similar non-conflicting installation scheme if you compile from source on Windows. As a Debian user, I know that I can call the individual versions by using “pythonX.Y”, and that which versions the default commands (“python” and “python3”) point to is decided by the Debian developers. A few OSes have started to break the assumption that python == python2, so there is a PEP in progress to bless or condemn that: http://www.python.org/dev/peps/pep-0394/ Windows seems to lack a way to use one Python version as default, so there’s another PEP: http://www.python.org/dev/peps/pep-0397/
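
Scripts that must not silently run under the wrong interpreter can also check for themselves instead of trusting whatever “python” resolves to on a given machine; a minimal sketch:

    import sys

    # Refuse to run under an interpreter that is too old, instead of trusting
    # whatever "python" happens to resolve to on this machine.
    if sys.version_info[:2] < (2, 6):
        sys.exit('this script needs Python 2.6 or newer, not %s' % sys.version.split()[0])

    print('running Python %s from %s' % (sys.version.split()[0], sys.executable))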

Where should packages be installed? Using distutils, I can install projects into my user site-packages directory (see PEP 370 or docs.python.org). What exactly is the question?
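
If the question is “where would things land on this machine”, a short sketch that prints the PEP 370 per-user directories and the system-wide site-packages directory (sysconfig needs Python 2.7/3.2 or newer):

    import site
    import sys
    import sysconfig

    # Per-user installation scheme from PEP 370 (~/.local/... on most Unix systems,
    # %APPDATA%\Python\... on Windows).
    print('user base:     %s' % site.USER_BASE)
    print('user site:     %s' % site.USER_SITE)
    print('user site on:  %s' % site.ENABLE_USER_SITE)

    # System-wide destination for pure-Python packages of this interpreter.
    print('site-packages: %s' % sysconfig.get_paths()['purelib'])
    print('prefix:        %s' % sys.prefix)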

Parallel installation of different versions of the same project is not supported. It would need a PEP to discuss the changes to the import system and the packaging tools. Before someone starts that discussion, using virtualenv or buildout works well enough.

I don’t understand the question about the location of “components of the Python environment”.

I mostly use system packages (i.e. the Aptitude package manager on Debian). To try out projects, I clone their repositories. If I need something that’s not available with Aptitude, I install it into my user site-packages directory (or put a .pth file there pointing at the repository). I don’t need a custom PYTHONPATH, but I have changed the location of my user site-packages with PYTHONUSERBASE. I don’t like the magic and the eggs concept in setuptools/distribute, so I don’t use them. I’ve started to use virtualenv and pip for one project though (they use setuptools under the covers, but I made a private install so my global Python does not have setuptools).
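
As a rough sketch of that .pth trick (the repository path and file name here are made up): every line of a .pth file sitting in a site directory that names an existing directory is appended to sys.path at interpreter startup, and site.USER_SITE honours PYTHONUSERBASE if it is set.

    import os
    import site

    # Hypothetical path to a cloned repository that should be importable
    # without installing it; adjust to taste.
    repo = os.path.expanduser('~/src/someproject')

    # Drop a one-line .pth file into the user site-packages directory (PEP 370);
    # the directory it names will be added to sys.path at startup.
    if not os.path.isdir(site.USER_SITE):
        os.makedirs(site.USER_SITE)

    with open(os.path.join(site.USER_SITE, 'someproject.pth'), 'w') as f:
        f.write(repo + '\n')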


One resource for this area is the book Expert Python Programming by Tarek Ziade. I'm ambivalent about the quality of the book, but the topics covered are just what you're focusing on.
