
When to use pip requirements file versus install_requires in setup.py?

I'm using pip with virtualenv to package and install some Python libraries.

I'd imagine what I'm doing is a pretty common scenario. I'm the maintainer on several libraries for which I can specify the dependencies explicitly. A few of my libraries are dependent on third party libraries that have transitive dependencies over which I have no control.

What I'm trying to achieve is for a pip install on one of my libraries to download/install all of its upstream dependencies. What I'm struggling with in the pip documentation is if/how requirements files can do this on their own or if they're really just a supplement to using install_requires.

Would I use install_requires in all of my libraries to specify dependencies and version ranges and then only use a requirements file to resolve a conflict and/or freeze them for a production build?

Let's pretend I live in an imaginary world (I know, I know) and my upstream dependencies are straightforward and guaranteed to never conflict or break backward compatibility. Would I be compelled to use a pip requirements file at all or just let pip/setuptools/distribute install everything based on install_requires?

There are a lot of similar questions on here, but I couldn't find any that were as basic as when to use one or the other, or how to use them both together harmoniously.


My philosophy is that install_requires should indicate a minimum of what you need. It might include version requirements if you know that some versions will not work; but it shouldn't have version requirements where you aren't sure (e.g., you aren't sure if a future release of a dependency will break your library or not).
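For example, a stripped-down setup.py along these lines (the package name and the version floors are made up for illustration) might look like:

from setuptools import setup

setup(
    name="mylibrary",              # hypothetical package name
    version="0.1.0",
    packages=["mylibrary"],
    install_requires=[
        "SQLAlchemy",              # required, but no pin: no version is known to break us
        "six>=1.9",                # illustrative: suppose releases below 1.9 are known not to work
    ],
)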

Requirements files, on the other hand, should indicate what you know does work, and may include optional dependencies that you recommend. For example, you might use SQLAlchemy but suggest MySQL as the backend, and so put MySQLdb in the requirements file.
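Continuing the made-up example above, such a requirements file might look like this (the exact pins are purely illustrative):

# requirements.txt -- a known-good, tested combination (pins are illustrative)
SQLAlchemy==1.0.12
six==1.10.0
# optional but recommended backend driver (provides the MySQLdb module)
MySQL-python==1.2.5

Installing with pip install -r requirements.txt reproduces the exact combination you tested, while installing the package itself only enforces the looser install_requires constraints.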

So, in summary: install_requires is there to keep people away from things that you know don't work, while requirements files lead people towards things you know do work. One reason for this is that install_requires requirements are always checked and cannot be disabled without actually changing the package metadata, so you can't easily try a new combination. Requirements files are only used when you explicitly install from them.


Here's what I put in my setup.py:

from setuptools import setup

# this grabs the requirements from requirements.txt
REQUIREMENTS = [i.strip() for i in open("requirements.txt").readlines()]

setup(
    .....
    install_requires=REQUIREMENTS,
)
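One thing to keep in mind with this approach: requirements.txt has to be shipped alongside setup.py (e.g., included in your sdist), otherwise the open() call will fail at install time.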


The Python Packaging User Guide has a page about this topic; I highly recommend you read it:

  • install_requires vs Requirements files

Summary:

install_requires is there to list the dependencies of the package that absolutely must be installed for the package to work. It is not meant to pin the dependencies to specific versions, but ranges are accepted, for example install_requires=['django>=1.8']. install_requires is observed by pip install name-on-pypi and other tools.

requirements.txt is just a text file that you can choose to run pip install -r requirements.txt against. It's meant to have the versions of all dependencies and subdependencies pinned, like this: django==1.8.1. You can create one with pip freeze > requirements.txt. (Some services, like Heroku, automatically run pip install -r requirements.txt for you.) pip install name-on-pypi does not look at requirements.txt, only at install_requires.


I only ever use a setup.py with install_requires, because then there is only one place to look. It is just as powerful as having a requirements file, and there is no duplication to maintain.
