Deploying Your Flask Website with Python and Fabric

By the end of this post, you will be able to deploy your local code to any number of remote servers with a single command. This assumes, of course, that you have configured your web server with uWSGI to host your Flask website and that you have shell access to this server. Before proceeding, you should at least understand how to host a Flask site; otherwise much of this might not make sense, as most details relating to hosting are omitted here (particularly the idiosyncrasies of using a virtualenv in your hosting environment).

If you do not have shell access to your web server, maybe someone else can help, or maybe it is time to check out one of the myriad cloud hosting services out there (e.g. Amazon Web Services, which I use to host this blog, among other sites) and make the jump. You will learn a lot more this way and be able to do much more interesting things!

So, we have a little Flask website that we wish to be able to develop locally and then later deploy to our live (perhaps production) site.

First, locally, we should install Fabric. This can be in a virtual environment or…
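With pip, that looks something like the following. I pin to a 1.x release here because the fabfile sketched later in this post uses the classic fabric.api interface, which was removed in Fabric 2:

```sh
pip install "Fabric<2.0"
```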

Well, let’s assume that we have a Flask site, example_site. We’re first going to need a setup.py.
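A minimal sketch of that setup.py might look like the following; the package name, version, and requirements are placeholders for whatever your project actually needs:

```python
from setuptools import setup

setup(
    name='example_site',
    version='0.1',
    # The site lives in a package directory with the same name.
    packages=['example_site'],
    include_package_data=True,
    install_requires=[
        'Flask',
    ],
)
```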

What we’re doing above is saying that our Flask site, example_site, is a package in a directory with the same name. We define our requirements, package name, etc. here.

After we have a nice setup file, we can start to script what’s called our fabfile. We must define two actions in the fabfile: a pack function and a deploy function.

pack() defines how to ball up our sources.

deploy() defines how to push it to the server and what to do once our sources are there.

I chose to configure my fabfile as shown below; the Fabric documentation covers the whole suite of features that are available to you.
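Here is a sketch of such a fabfile, following the classic Fabric 1.x pack/deploy pattern. The hostname, remote paths, and the final touch of the uWSGI config are assumptions (the touch relies on an emperor-style uWSGI setup that reloads an app when its config file changes); substitute your own values:

```python
# fabfile.py -- a sketch using the classic fabric.api (Fabric 1.x) interface.
from fabric.api import cd, env, local, put, run

# Assumed host; replace with your own server(s).
env.hosts = ['example.com']


def pack():
    # Ball up our sources as a gzipped source distribution (lands in dist/).
    local('python setup.py sdist --formats=gztar', capture=False)


def deploy():
    # Figure out the full name (name-version) of the sdist we just built.
    dist = local('python setup.py --fullname', capture=True).strip()
    # Upload the tarball to a temporary location on the remote server.
    put('dist/%s.tar.gz' % dist, '/tmp/example_site.tar.gz')
    # Unpack it and install it into the site's virtualenv on the server.
    run('mkdir -p /tmp/example_site')
    run('tar xzf /tmp/example_site.tar.gz -C /tmp/example_site')
    with cd('/tmp/example_site/%s' % dist):
        run('/var/www/example_site/env/bin/python setup.py install')
    # Clean up, then touch the app's uWSGI config so the emperor
    # notices the change and reloads the application.
    run('rm -rf /tmp/example_site /tmp/example_site.tar.gz')
    run('touch /var/www/example_site/example_site.ini')
```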

Once we are satisfied that our deploy scripts look good, we are ready to rock!

We can execute our pack and deploy in a single command:
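With a fabfile like the sketch above, that single command is:

```sh
fab pack deploy
```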

That’s it! Fabric will then ball up your sources, upload them to your remote server, and execute a script that handles those sources on the remote end. Once this is configured properly, developing is bliss. No matter what solution you choose, Fabric or otherwise, easy deployment of local code is incredibly important for your web project.

It allows you to focus on code rather than on operations that are repeatable and trivial.

Configuring multiple Flask sites with uWSGI and nginx on an Amazon EC2 Instance

For the lazy, we have a shell script that will do all this for you on GitHub!

While trying to configure our new Amazon EC2 instance, we found that setting up Flask with uWSGI behind nginx was a little too cumbersome and somewhat poorly documented. There are a few great writeups on how to configure this, but they fell short of producing a working configuration on Amazon’s Linux AMI, which is what we will try to describe here.

After creating a new EC2 instance and logging in, we will need to install nginx, uWSGI and required libraries.
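On the Amazon Linux AMI, that looks roughly like the following. nginx comes from yum, while uWSGI is installed through pip and needs the Python headers and a compiler to build; the exact package names may differ slightly on your AMI version:

```sh
sudo yum install -y nginx git gcc python-devel python-pip
sudo pip install uwsgi virtualenv
```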

Now we need a Flask project! Let’s just grab the latest simple repo from GitHub:
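Something along these lines, where the GitHub path below is a placeholder for the actual owner of the simple repository (or your own fork):

```sh
git clone https://github.com/<owner>/simple.git
```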

Alternatively, you can fork the project on GitHub and pull your fork instead; this is probably the more desirable option.

We will quickly set up a virtualenv in our new simple repository.
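For example, something like this; the venv directory and requirements file name are assumptions, so use whatever the repository actually ships:

```sh
cd simple
virtualenv venv
source venv/bin/activate
pip install -r requirements.txt
```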

Now let’s give it a test, shall we?
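The exact entry point depends on the repository’s layout, but a quick check with Flask’s built-in development server looks something like this (simple.py is a hypothetical module name here):

```sh
python simple.py
```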

Looking good! Let’s move it real quick to where we want it and fix the permissions to our liking.
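For instance, something like the following; the /var/www path and the nginx user/group are assumptions that match the uWSGI and nginx configuration later in this post:

```sh
sudo mv ~/simple /var/www/simple
sudo chown -R nginx:nginx /var/www/simple
```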

Next, let’s configure uWSGI. Following Tyler’s writeup, let’s make some directories.
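The directory names below are assumptions that the rest of this post sticks to: an available/enabled pair for per-app configs, plus a log directory.

```sh
sudo mkdir -p /etc/uwsgi/sites-available /etc/uwsgi/sites-enabled
sudo mkdir -p /var/log/uwsgi
```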

Now let’s create the uWSGI configuration file with sudo vim /etc/init/uwsgi.conf and give it the following content:
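A sketch of that Upstart job is below; it runs the uWSGI emperor over the sites-enabled directory created above and logs to /var/log/uwsgi. The paths are assumptions, so adjust them to your own layout:

```
# /etc/init/uwsgi.conf -- Upstart job for the uWSGI emperor (sketch).
description "uWSGI emperor"

start on runlevel [2345]
stop on runlevel [06]

# Watch sites-enabled for app configs, log centrally, and run as the
# nginx user/group. The uwsgi binary may live in /usr/local/bin instead,
# depending on how pip installed it.
exec /usr/bin/uwsgi --emperor /etc/uwsgi/sites-enabled --logto /var/log/uwsgi/emperor.log --uid nginx --gid nginx
```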

A few things to note here: we are setting up the general uWSGI emperor log and telling it to run as the nginx user for permission purposes.

With uWSGI configured, we can start it up with sudo start uwsgi.

Now we can begin to configure our simple fork, which runs this blog, by creating a new file with:
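Assuming the directory layout from above, the file lives in sites-available (the file name itself is an assumption):

```sh
sudo vim /etc/uwsgi/sites-available/simple.ini
```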

And we will configure it with the following content:
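A sketch of that per-app uWSGI config follows; the paths, module, callable, and socket location are assumptions and need to match your checkout and the nginx config below:

```ini
; /etc/uwsgi/sites-available/simple.ini (sketch)
[uwsgi]
; Where the app lives and which virtualenv to use.
chdir = /var/www/simple
home = /var/www/simple/venv
; Hypothetical module and callable for the simple app.
module = simple
callable = app
; Socket that nginx will talk to, owned by the nginx user.
socket = /var/www/simple/simple.sock
chmod-socket = 664
uid = nginx
gid = nginx
```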

Now we need to link our configuration file to the enabled sites folder:
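Assuming the layout above, that is a single symlink:

```sh
sudo ln -s /etc/uwsgi/sites-available/simple.ini /etc/uwsgi/sites-enabled/simple.ini
```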

Finally, we can configure nginx to serve up our new blog.
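On the Amazon Linux AMI, the packaged nginx picks up extra server blocks from /etc/nginx/conf.d, so we can create a new file there (the file name is an assumption):

```sh
sudo vim /etc/nginx/conf.d/simple.conf
```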

And we will configure nginx to serve up content from our uWSGI emperor like so:
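Here is a sketch of that server block; the server_name, static path, and socket are assumptions and must match your own setup and the uWSGI config above:

```nginx
# /etc/nginx/conf.d/simple.conf (sketch)
server {
    listen 80;
    server_name example.com;

    # Serve static assets directly; path is an assumption.
    location /static {
        alias /var/www/simple/static;
    }

    # Everything else goes to the uWSGI app over the unix socket.
    location / {
        include uwsgi_params;
        uwsgi_pass unix:/var/www/simple/simple.sock;
    }
}
```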

Save the file and let’s fire up nginx; we are ready for launch!
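On the Amazon Linux AMI that is just the service command, plus chkconfig if you want nginx to come back after a reboot:

```sh
sudo service nginx start
sudo chkconfig nginx on
```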

We hope that you have found this post helpful. In a later post we hope to describe how to automate this on Amazon EC2 for automatic scaling of your server fleet.