There is a war going on: a war between those who say Python requirements should be explicit and those who say requirements should be implicit. Before I continue, I'm going to be talking about requirements.txt, not setup.py. The difference between explicit and implicit requirements comes down to whether the line says Django==1.9.7 or just Django, respectively. Going deeper, you could also say that pinning dependencies of dependencies is explicit, and that you can loosely pin, like Django<1.10.

The advantage of explicit requirements is that you get a repeatable environment, especially if you're also pinning dependencies of dependencies. The advantages of implicit requirements are readability and automatic security upgrades.
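To make the distinction concrete, here is roughly what the styles look like in a requirements file (package names and versions are only illustrative):

# implicit: readable, and you pick up new releases automatically
Django
requests

# explicit: repeatable, down to the exact version
Django==1.9.7
requests==2.10.0

# loose pin: somewhere in between
Django<1.10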
Here at TabbedOut, we've developed a technique that works very well, and I'd like to share it: use pip-tools to manage your requirements. You get the best of both worlds, at the expense of some extra boilerplate. Here's how we do it:
- Be in a virtualenv
- Use our Makefile boilerplate (see below)
- pip install pip-tools
- Write a “sloppy” requirements.txt using implicit requirements, but name it requirements.in
- Run make requirements.txt
- Check all this into your codebase
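In shell terms, the first-time setup is roughly this sketch, assuming the Makefile boilerplate shown below (the package in requirements.in is just an example):

pip install pip-tools
echo "Django<1.10" > requirements.in
make requirements.txt
make install
git add Makefile requirements.in requirements.txt
git commit -m "Manage requirements with pip-tools"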
Advantages
- requirements.in is easy to maintain
- requirements.txt has pinned versions, so your virtualenv matches your collaborators and production
- You automatically get patches and security fixes when you run make requirements.txt, and there are no surprises because it goes through your code review process
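For example, a routine upgrade picked up by make requirements.txt shows up in review as a small, ordinary diff (package names and version numbers here are hypothetical):

--- a/requirements.txt
+++ b/requirements.txt
@@ -1,3 +1,3 @@
-Django==1.9.7
+Django==1.9.8
 pytz==2016.4
 requests==2.10.0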
Tips
- Try to loosely pin requirements in your requirements.in, though it doesn't matter that much because you'll catch any major version change when it shows up in requirements.txt.
- Specifying an exact version in requirements.in is an anti-pattern, and you should document why you did it. Often it's because of a bug or a backwards-incompatible change.
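Put together, a requirements.in that follows these tips might look like this (package names, versions, and the issue URL are made up for illustration):

# top-level dependencies only, loosely pinned
Django<1.10
djangorestframework

# exact pin: document why, and remove it once the upstream bug is fixed
# see https://github.com/example/somepackage/issues/123
somepackage==0.4.2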
Makefile boilerplate
Here’s what a Makefile might contain:
help: ## Shows this help
	@echo "$$(grep -h '#\{2\}' $(MAKEFILE_LIST) | sed 's/: #\{2\} / /' | column -t -s ' ')"

install: ## Install requirements
	@[ -n "${VIRTUAL_ENV}" ] || (echo "ERROR: This should be run from a virtualenv" && exit 1)
	pip install -r requirements.txt

.PHONY: requirements.txt
requirements.txt: ## Regenerate requirements.txt
	pip-compile --upgrade --output-file $@ requirements.in
- help: This is just a fast way of making your Makefile self-documenting.
- install: Nowadays you need Python and non-Python requirements, and putting it all in one make target makes it easier for developers to jump into a project (see the sketch after this list).
- .PHONY: When you run make requirements.txt, you want it to run every time, not just when requirements.in changes, because new versions may have been uploaded to PyPI. I always group my .PHONY with its target. Even though it adds more lines, your Makefile will be more maintainable because you're not trying to keep an up-to-date list somewhere off the screen.
- requirements.txt: Why make requirements.txt over make requirements? Because best practice dictates that if the output of a make target is a file, that file should also be the name of the target. That way you can use the automatic variable $@, and it's explicit, even at the cost of needing the .PHONY.
- --upgrade: Without this, pip-tools doesn't actually upgrade your dependencies.
- --output-file $@: pip-tools does this by default, but explicit is better than implicit. I would prefer to do pip-compile --upgrade requirements.in > $@, but pip-tools 1.6 does a poor job of dealing with stdout (see below).
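Here's what the install bullet above is getting at; the npm step is just an illustration of a non-Python requirement, not part of our actual boilerplate:

install: ## Install Python and non-Python requirements
	@[ -n "${VIRTUAL_ENV}" ] || (echo "ERROR: This should be run from a virtualenv" && exit 1)
	pip install -r requirements.txt
	npm install  # hypothetical front-end tooling step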
Caveats
- When you change requirements.in, you do have to remember to run make requirements.txt, but you could automate that with a git hook or CI process (a sketch follows this list). In practice, we've found that remembering to run make requirements.txt is fine.
- pip-tools==1.6 does not work with the latest pip (8.1.2). See #358.
- pip-tools==1.6 has a poor understanding of how stdin and stdout are supposed to work. Hopefully this gets fixed soon, but it's only a minor annoyance. #362 #360 #353 #104
- The compilation step can depend on your platform. I've only noticed this with ipython, which needs packages for interacting with the terminal, like gnureadline. It hasn't been trouble for us, but it could be for you. A workaround is to run the process in a Docker container.
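A sketch of the CI check mentioned in the first caveat (this isn't from our setup; adapt as needed):

#!/bin/sh
# Fail the build if requirements.txt is stale relative to requirements.in.
set -e
pip-compile --output-file requirements.txt requirements.in
git diff --exit-code requirements.txt || {
    echo "requirements.txt is out of date; run 'make requirements.txt'" >&2
    exit 1
}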
Sample Scenarios
If you need more convincing, here are some problems this approach solves for us:
I thought I was getting xyzpackage version 3, why is version 2 getting installed? pip-tools flattens all your requirements and annotates which package specified what. So in requirements.txt, you'll see xyzpackage==2.4  # via scumbagpackage and know that scumbagpackage was responsible.
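For instance, with a requirements.in that lists only scumbagpackage (both package names are made up, as in the scenario above), the annotation shows up like this:

$ cat requirements.in
scumbagpackage

$ pip-compile --output-file requirements.txt requirements.in
$ grep xyzpackage requirements.txt
xyzpackage==2.4    # via scumbagpackage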
What packages am I actually using? In a large project, your requirements.txt will balloon as you run into bugs and start pinning dependencies of dependencies. Then one day you'll realize you don't know what packages you're actually using. With a much simpler requirements.in, there's less to sort through, and fully pinned packages stick out like sore thumbs.
"It works for me." Sometimes a project will work only for you. You check your installed versions against requirements.txt and they match. But what you didn't realize is that a dependency of a dependency broke something. Since pip-tools freezes everything, you'll have the same version of every package installed as everyone else. And if something does break, you'll have the history to trace down what changed.
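Tracing that history is just ordinary git work on the pinned file; for example (the tag name is hypothetical):

# what changed in the pins, and when?
git log -p --follow -- requirements.txt

# compare the pins between a known-good release and now
git diff v1.2.0 -- requirements.txt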