Continuous delivery of a Django app from Travis CI to PythonAnywhere

Reggae CDMX is a calendar for all reggae and dub related events in Mexico City. I am currently rewriting it as a Django web app, so the site itself is not available yet. But what is already working is the continuous delivery pipeline from GitHub via Travis CI to PythonAnywhere.

(This is not an introduction to Travis CI, PythonAnywhere, or Git.)

Here is how I set it up:

1. Deploy Django project

PythonAnywhere's guide for Deploying an existing Django project on PythonAnywhere explains everything to manually set up the web app.

For reference, the Reggae CDMX code is checked out to

/var/www/sites/reggae-cdmx.com

and the virtual environment lives at

~/.virtualenvs/reggae-cdmx/

2. Prepare Git push deployment

PythonAnywhere has a comprehensive guide to set up Git push deployments.

My bare repository is located at

~/bare-repos/reggae-cdmx.git

The post-receive hook looks like this:

#!/bin/bash
# ~/bare-repos/reggae-cdmx.git/hooks/post-receive

BASE_DIR=/var/www/sites/reggae-cdmx.com
PYTHON=$HOME/.virtualenvs/reggae-cdmx/bin/python
PIP=$HOME/.virtualenvs/reggae-cdmx/bin/pip
MANAGE=$BASE_DIR/manage.py

echo "=== configure Django ==="
export DJANGO_SETTINGS_MODULE=config.settings.production

echo "=== create base directory ==="
mkdir -p $BASE_DIR

echo "=== checkout new code ==="
GIT_WORK_TREE=$BASE_DIR git checkout -f

echo "=== install dependencies in virtual environment ==="
$PIP install -q --upgrade -r $BASE_DIR/requirements/production.txt

echo "=== compress and collect static files ==="
$PYTHON $MANAGE compress --force
$PYTHON $MANAGE collectstatic --no-input

echo "=== update database ==="
$PYTHON $MANAGE migrate --no-input
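
For completeness, the one-time setup of the bare repository on PythonAnywhere looks roughly like this (a sketch; the touch merely stands in for saving the post-receive script shown above):

```shell
# create the bare repository that receives the pushes
mkdir -p ~/bare-repos
git init --bare ~/bare-repos/reggae-cdmx.git

# save the post-receive script shown above into hooks/ (touch is a
# placeholder for that step) and make it executable -- Git silently
# skips hooks that are not executable
touch ~/bare-repos/reggae-cdmx.git/hooks/post-receive
chmod +x ~/bare-repos/reggae-cdmx.git/hooks/post-receive
```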

3. Custom deployment with Travis CI

I set up the repository in Travis CI for automatic builds on pull requests and branch changes. In order to deploy to PythonAnywhere, I use Travis's Custom deployment.

All Travis-related files live in the .travis subdirectory of the Django project. This location is, of course, completely arbitrary.

~ $ cd ~/code/reggae-cdmx/
reggae-cdmx $ mkdir .travis
reggae-cdmx $ cd .travis

Create SSH keys

git push uses SSH, so I need a pair of SSH keys.

.travis $ ssh-keygen -t rsa -b 4096 -C 'hallo@example.com' -f deploy_key

Copy the public key to the PythonAnywhere account (see PythonAnywhere: SSH access).

.travis $ ssh-copy-id -i deploy_key flowfx@ssh.pythonanywhere.com

Encrypt SSH key and add it to the repository

Travis offers a tool to encrypt files, which makes it possible to add the SSH private key to the Git repository. See Encrypting files for a complete how-to.

First, I encrypt the deploy key,

.travis $ travis login
.travis $ travis encrypt-file deploy_key --add

then add it to the Git repository.

.travis $ git add deploy_key.enc

Last, I make sure the decrypted key is never pushed to the public GitHub repository:

reggae-cdmx $ echo 'deploy_key' >> .gitignore

Configure Travis CI

Here is a simplified version of the .travis.yml configuration file used for Reggae CDMX. The before_install part is added automatically by the travis encrypt-file deploy_key --add command. The ssh_known_hosts line is also required for push deployment via Git/SSH.

Hopefully, the rest is documented sufficiently by the comments.

# .travis.yml
language: python
cache: pip
python:
- 3.6
addons:
  # add PythonAnywhere server to known hosts
  ssh_known_hosts: ssh.pythonanywhere.com
before_install:
  # decrypt ssh private key
  - openssl aes-256-cbc -K $encrypted_xxxxxxxxxxxx_key -iv $encrypted_xxxxxxxxxxxx_iv -in .travis/deploy_key.enc -out deploy_key -d
install: pip install -r requirements/testing.txt
script:
  # run test suite
  - pytest --cov
after_success:
  # start ssh agent and add private key
  - eval "$(ssh-agent -s)"
  - chmod 600 deploy_key
  - ssh-add deploy_key
  # configure remote repository
  - git remote add pythonanywhere flowfx@ssh.pythonanywhere.com:/home/flowfx/bare-repos/reggae-cdmx.git
  # push master branch to production 
  - git push -f pythonanywhere master
  # reload PythonAnywhere web app via the API
  - python .travis/reload-webapp.py
after_deploy:
  # update coveralls.io
  - coveralls
notifications:
  # spare me from email notifications
  email: false

Reload web app

The after_success step includes the call to .travis/reload-webapp.py, which is a Python script to reload the web app via the PythonAnywhere API. This is more or less copied directly from the documentation.

# .travis/reload-webapp.py
"""Script to reload the web app via the PythonAnywhere API.

"""
import os
import requests

my_domain = os.environ['PYTHONANYWHERE_DOMAIN']
username = os.environ['PYTHONANYWHERE_USERNAME']
token = os.environ['PYTHONANYWHERE_API_TOKEN']

response = requests.post(
    'https://www.pythonanywhere.com/api/v0/user/{username}/webapps/{domain}/reload/'.format(
        username=username, domain=my_domain
    ),
    headers={'Authorization': 'Token {token}'.format(token=token)}
)
if response.status_code == 200:
    print('All OK')
else:
    print('Got unexpected status code {}: {!r}'.format(response.status_code, response.content))

Set environment variables

To make all this actually work, you need to set some environment variables in the Travis project settings. Namely PYTHONANYWHERE_DOMAIN, PYTHONANYWHERE_USERNAME and PYTHONANYWHERE_API_TOKEN.

Also, don't forget to set DJANGO_SECRET_KEY!

Summary

These are the resources you need:

PythonAnywhere

Travis CI

Future

I need to look into Travis's Script deployment, which looks like a much cleaner way to run the deployment commands.
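
Based on the Travis docs, that could look roughly like this (a sketch, untested; deploy.sh is a hypothetical script collecting the push and reload steps from after_success above):

```yaml
# sketch: replace the after_success deployment with a deploy phase
deploy:
  provider: script
  script: bash .travis/deploy.sh
  on:
    branch: master
```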

Comment!

If you find the one error that I missed, please tell me about it!

Use Django in-memory file storage with pytest

In my current project, I create PDF files from Jinja2/LaTeX templates. In each test run, several PDFs are created and saved to disk. How do you test this without filling up the hard drive?

I use an in-memory data storage. For Django there is a package that makes it really easy: dj-inmemorystorage.

A non-persistent in-memory data storage backend for Django.

Using pytest fixtures:

# tests/conftest.py
import pytest

@pytest.fixture
def in_memory(settings):
    """Use the non-persistent in-memory storage backend.

    The settings fixture comes from pytest-django and restores the
    original value after each test.
    """
    settings.DEFAULT_FILE_STORAGE = 'inmemorystorage.InMemoryStorage'

That's it. When using this in_memory fixture in a test function, the files will never be written to disk.

New podcast episodes, July 2017

After many months, I published two new podcast episodes this week.

Mexico

Back in January, I had recorded the second episode of Tacos und Limetten. It turned into two and a half fascinating hours about the history and society of Mexico. And there are travel tips, too!

C3S

The C3S podcast is also back. In the new episode, m.eik tells me about the last general assembly and the intricacies of German legislation on collecting societies.

Install an extra LaTeX font package on PythonAnywhere

This week I installed a LaTeX font package in my PythonAnywhere account. The TL;DR: just install it manually.


I've been using LaTeX on and off for more than 10 years now. But I never dove into the depths of the system; typesetting mathematical expressions was fun enough for me.

So I rely on Google a good bit. But that's fine, as there are always new things to discover. Plus: the PythonAnywhere support is superb! I had a very helpful email exchange with Giles. Thanks!

With the TeX Live distribution, both on my Mac and on the Ubuntu servers that power PythonAnywhere, the way to install packages nowadays is the tlmgr command, which I can only assume to be an abbreviation of TeX Live manager.

$ tlmgr install roboto

Well, that didn't work, because I am neither root nor allowed to sudo. It turns out there is a tlmgr user mode which, after initializing a local user tree

$ tlmgr init-usertree

allows me to install additional LaTeX packages into my home directory under ~/texmf. So:

$ tlmgr --usermode install roboto

should have done it. Unfortunately, I got the following error message:

$ tlmgr --usermode install roboto
/usr/bin/tlmgr: could not find a usable xzdec.
/usr/bin/tlmgr: Please install xzdec and try again.

Thanks to Giles, it was easy to build and install the missing xz tools from source:

$ wget https://tukaani.org/xz/xz-5.2.3.tar.gz
$ tar xf xz-5.2.3.tar.gz 
$ cd xz-5.2.3/
$ ./configure --prefix=$HOME/.local
$ make
$ make install

That actually worked. But then tlmgr spat out this:

Unknown directive ...containerchecksum
06c8c1fff8b025f6f55f8629af6e41a6dd695e13bbdfe8b78b678e9cb0cfa509826355f4ece20d8a99b49bcee3c5931b8d766f0fc3dae0d6a645303d487600b0..., please fix it! at /usr/share/texlive/tlpkg/TeXLive/TLPOBJ.pm line 210, <$retfh> line 5761.

This was a clear case for Google, and Google didn't disappoint: the installed version of TeX Live is from 2013 and doesn't work with the current package repositories.

$ tlmgr --version
(running on Debian, switching to user mode!)
tlmgr revision 32912 (2014-02-08 00:49:53 +0100)
tlmgr using installation: /usr/share/texlive
TeX Live (http://tug.org/texlive) version 2013

After setting the repository to an old archived one,

$ tlmgr option repository ftp://tug.org/historic/systems/texlive/2013/tlnet-final

the TeX Live installation was happy.

But, it turns out, the 2013 repository doesn't even include the roboto font package. So…

TL;DR

… in the end, I installed the font by hand, following the instructions here: http://www.ctan.org/tex-archive/fonts/roboto/.

$ wget http://mirror.ctan.org/install/fonts/roboto.tds.zip
$ cd ~/texmf
$ unzip ~/roboto.tds.zip
$ texhash
$ updmap --enable Map=roboto.map

And now I have the roboto font available for pdflatex on PythonAnywhere!
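
To actually use the font in a document, loading the package should be enough. A minimal sketch, assuming the sfdefault option documented by the roboto package (which makes Roboto the main text font):

```latex
\documentclass{article}
\usepackage[T1]{fontenc}
% sfdefault makes Roboto the default text font (per the package docs)
\usepackage[sfdefault]{roboto}

\begin{document}
Hello from Roboto on PythonAnywhere!
\end{document}
```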

Populate your Django test database with pytest fixtures

I'm working on a side project that uses data from an external API. For performance reasons I store this data in a local database. But when running pytest, all my tests start with a clean database. That's not good, as many of the tests need the data, and fetching it from the API every time is very time-consuming.

Of course, Django has a solution for this, confusingly called fixtures, and pytest has a way to use Django fixtures in a custom pytest fixture to populate the database with initial test data.

Because it took me a while to find this, I document it here. It works like this:

Dump the data

Using Django's own dumpdata management command, you dump all or selected tables from your local database into a JSON file in a subfolder of the app named fixtures. My Django app is called potatoes, and I want the data for my two models Potato and SturdyPotato.

$ ./manage.py dumpdata potatoes.Potato potatoes.SturdyPotato -o potatoes/fixtures/potatoes_data.json

Load the data

The corresponding loaddata command can be used in an override of pytest-django's django_db_setup fixture to load the data into the test database.

# tests/conftest.py

import pytest

from django.core.management import call_command

@pytest.fixture(scope='session')
def django_db_setup(django_db_setup, django_db_blocker):
    with django_db_blocker.unblock():
        call_command('loaddata', 'potatoes_data.json')

Use pytest fixture

Now, in every test that needs it, I use this session-scoped fixture, and the test data is available.

# tests/test_models.py

from potatoes.models import Potato

def test_my_potatoes(db, django_db_setup):
    # GIVEN a full database of potatoes, as provided by the django_db_setup fixture
    all_my_potatoes = Potato.objects.all()
    # THEN the fixture data is available
    assert all_my_potatoes.count() > 0

Disable Django/Python logging with pytest fixture

Yesterday, I added Sentry error tracking to my Django app and configured it to register every log entry with level INFO and above. Now, every time I ran my test suite, events I didn't really care about were logged to Sentry. Naturally, I wanted to disable the default logging behavior for tests.

StackOverflow, naturally, provides part of the answer:

logging.disable(logging.CRITICAL)

will disable all logging calls with levels less severe than or equal to CRITICAL.

(http://stackoverflow.com/a/5255760)

But how to run this on every test? Pytest to the rescue! I use an autouse fixture:

  • if an autouse fixture is defined in a conftest.py file then all tests in all test modules below its directory will invoke the fixture.

And this is what I put into my conftest.py files:

import logging

import pytest

@pytest.fixture(autouse=True)
def disable_logging():
    """Disable logging in all tests."""
    logging.disable(logging.INFO)

That's it. Love it!
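
To see what logging.disable actually does, here is a quick stdlib-only sketch (no pytest or Sentry involved; the ListHandler is just for demonstration):

```python
import logging

records = []

class ListHandler(logging.Handler):
    """Collect the level names of all records that get through."""
    def emit(self, record):
        records.append(record.levelname)

logger = logging.getLogger('demo')
logger.setLevel(logging.DEBUG)
logger.addHandler(ListHandler())

logger.info('before')          # passes: nothing is disabled yet
logging.disable(logging.INFO)  # suppress INFO and everything less severe
logger.info('hidden')          # filtered out now
logger.warning('still here')   # WARNING is more severe than INFO, so it passes

print(records)  # → ['INFO', 'WARNING']
```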

How to install a Python virtual environment on macOS

This one is for my amazing designer Angélica. I should have written it before I failed to install a Python virtual environment on her machine this week.

First of all: trust me, when I tell you that you want to use a virtual environment for your Python work. Second: there are many ways to install and use virtual environments. This one works for me(TM).

Install Homebrew

Homebrew is a package manager for macOS that allows us to install a current version of Python, e.g. Python 3.6 at the moment. This is what we want.

Start Terminal.app, copy and paste the installation command from the Homebrew website into it, and hit enter.

Install Python3

With Homebrew installed, stay in Terminal.app and install Python 3 using this command:

$ brew install python3

Now, the commands python3 and pip3 are available on the command line. You can check the installed Python version with

$ python3 --version
Python 3.6.0

Install virtualenvwrapper

Next, use pip3 to install the virtualenvwrapper tool (Official documentation) that makes working with virtual environments easy. I don't even know how much easier, because I only ever use virtualenvwrapper.

$ pip3 install virtualenvwrapper

Open the file /Users/<your_username>/.bashrc in your text editor (like SublimeText) and add the following lines at the bottom. (If your shell doesn't read .bashrc on startup, as Terminal.app by default does not, put them in ~/.bash_profile instead.)

export WORKON_HOME=$HOME/.virtualenvs
source /usr/local/bin/virtualenvwrapper.sh

Quit and restart Terminal.app.

Create a virtual environment

Finally, we can create a virtual environment. Go into your project directory (e.g. ~/code/secret_project/),

$ cd code/secret_project/

type into the terminal,

$ mkvirtualenv --python=python3.6 secret_project

and hit enter. This creates a new virtual environment in

~/.virtualenvs/secret_project

Activate virtual environment

You activate it with

$ workon secret_project

and, when you are done, you leave it again with deactivate.