Deep learning is the "new" trend, but beyond the hype, the related tools have become quite mature. Convolutional neural networks (CNNs) are particularly interesting and a very active research area in visual recognition. With the help of Docker, it is now very easy to set up a machine ready to train your models. In this post, I'll show how to do this very easily, as Google has already done the job for us! Indeed, you can find a Docker image of Tensorflow with lots of tools here

1- Install Docker

The procedure for installing Docker is very well explained on the Docker website. Once done, don't forget to add your current user to the docker group (which may already have been created during installation):

  1. Create the docker group: sudo groupadd docker
  2. Add the current user to the group: sudo usermod -aG docker $USER
  3. Log out and log back in

2- Build your own Docker image

For our first CNN training, we'll use Tensorflow along with Keras, which I find to be a great API. You can find the Docker image of Tensorflow already built by Google here. We can use this image to build our own image with all the tools we need. I used this Dockerfile (DockerfileTensorFlow.cpu) to build mine:
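A sketch of what such a Dockerfile can look like, assuming the official tensorflow/tensorflow base image; the package names, the entrypoint script name, and the image layout are assumptions, not the exact file from this post:

```dockerfile
# Sketch of DockerfileTensorFlow.cpu (illustrative, not the original file)
FROM tensorflow/tensorflow:latest

# Keras API and the SQLAlchemy ORM via pip
RUN pip install keras sqlalchemy

# SQLite and MySQL packages for the data layer, plus X11 client tools
RUN apt-get update && apt-get install -y \
    sqlite3 libsqlite3-dev \
    mysql-client \
    x11-apps \
    && rm -rf /var/lib/apt/lists/*

# Entrypoint script that decides between a shell and Jupyter Notebook
COPY run.sh /run.sh
RUN chmod +x /run.sh
ENTRYPOINT ["/run.sh"]
```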

First, the Keras API is installed with pip. As my data is stored in a SQLite database, we install it along with the MySQL packages. For data-model manipulation, SQLAlchemy is a great ORM that simplifies the job. Then, we install an X server to be able to display some things from the container on the host. And finally, I've added a bash script that is executed once the container is launched. This script simply checks an environment variable and launches the container with Jupyter Notebook if the variable is set to notebook:
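A minimal sketch of such an entrypoint script; the variable name EXEC_MODE and the Jupyter flags are assumptions:

```shell
# Write the entrypoint script described above (names are assumptions)
cat > run.sh <<'EOF'
#!/bin/bash
# Launch Jupyter Notebook when EXEC_MODE is "notebook", else a shell
if [ "$EXEC_MODE" = "notebook" ]; then
    exec jupyter notebook --ip=0.0.0.0 --no-browser --allow-root
else
    exec /bin/bash
fi
EOF
chmod +x run.sh
```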

Notice that I used the CPU version of the Tensorflow image. You could decide to use the GPU version, but my laptop is much more efficient on CPU than on GPU. The first difference is the base image: you'll have to use the latest-gpu tag instead of latest as I did. Furthermore, since we want to use the CUDA libraries, we must launch Docker with the command nvidia-docker instead of plain docker.

3- Start the container

I'm also a great fan of fabric, a CLI tool for SSH in Python. I'm not using it the usual way, which is to run scripts on multiple remote machines; instead, I like using it in local mode rather than having multiple bash scripts. Everything I need is then gathered in a single fabfile. For example, here is the task I use to build my Docker image:
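A sketch of such a fabfile task, assuming Fabric 1's local() helper (which matches the fab task:arg syntax used below); the image tag is an assumption, and the fallback shims simply let the snippet run where Fabric isn't installed:

```python
# fabfile.py -- build task sketch; image tag is an assumption
try:
    from fabric.api import local, task   # Fabric 1.x
except ImportError:                      # shims so the sketch runs without Fabric
    def task(f):
        return f
    def local(cmd):
        print(cmd)

IMAGE_TAG = "tensorflow-keras-cpu"       # assumed image name

def build_command(dockerfile="DockerfileTensorFlow.cpu", tag=IMAGE_TAG):
    # docker build with an explicit Dockerfile name, as in the post
    return "docker build -t {} -f {} .".format(tag, dockerfile)

@task
def build_docker_image():
    local(build_command())
```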

Once the image is built, I run it in the mode I've chosen, i.e. with or without Jupyter Notebook support:
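The run task can be sketched in the same style; the EXEC_MODE variable, the port, and the image tag are assumptions:

```python
# fabfile.py -- run task sketch; env variable and image tag are assumptions
try:
    from fabric.api import local, task   # Fabric 1.x
except ImportError:                      # shims so the sketch runs without Fabric
    def task(f):
        return f
    def local(cmd):
        print(cmd)

def run_command(mode=""):
    # -e passes the chosen mode to the entrypoint; 8888 exposes Jupyter
    return ("docker run -it --rm -p 8888:8888 "
            "-e EXEC_MODE={} tensorflow-keras-cpu".format(mode))

@task
def tensorflow_cpu(mode=""):
    local(run_command(mode))
```

Invoked as fab tensorflow_cpu:notebook, Fabric passes "notebook" as the mode argument.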

So, now that these tasks have been defined, all we have to do is:

  1. Build the image: fab build_docker_image
  2. Start the container: fab tensorflow_cpu:notebook

And you're done! All that's left is to try our first CNN with this setup.

4- CNN example

Deep learning has its own "Hello World", and it's called MNIST. MNIST is a dataset of handwritten digits, all labeled. It is so popular that almost every API, such as Keras, embeds it. The following is taken from the examples shipped with Keras.

The images are in black and white, so there is only one channel, but we need to flatten them to feed them to the API:
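The loading and flattening step can be sketched as follows, using Keras's built-in MNIST loader; the normalization by 255 is a common convention and an assumption here:

```python
from keras.datasets import mnist

# Download MNIST: 60,000 training and 10,000 test images of 28x28 pixels
(x_train, y_train), (x_test, y_test) = mnist.load_data()

# Flatten each single-channel 28x28 image into a 784-dimensional vector,
# and scale pixel values into [0, 1]
x_train = x_train.reshape(x_train.shape[0], 784).astype("float32") / 255
x_test = x_test.reshape(x_test.shape[0], 784).astype("float32") / 255
```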

The data is labeled, but each label is directly the value of the digit it represents. However, we need to split the labels into 10 different classes. This is done easily in Keras with the help of
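Presumably this refers to Keras's to_categorical utility, which one-hot encodes integer labels; a small standalone example:

```python
import numpy as np
from keras.utils import to_categorical

labels = np.array([0, 3, 9])                     # raw digit labels
one_hot = to_categorical(labels, num_classes=10) # one column per class
print(one_hot.shape)                             # (3, 10)
```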

Now it's time to define our model. Here again, a simple one does the trick for this example
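Since the images were flattened to 784-dimensional vectors, a simple fully connected model fits; this sketch is in the spirit of the mnist_mlp example shipped with Keras, and the exact layer sizes are assumptions:

```python
from keras.models import Sequential
from keras.layers import Dense, Dropout

model = Sequential([
    Dense(512, activation="relu", input_shape=(784,)),  # flattened input
    Dropout(0.2),                                       # regularization
    Dense(512, activation="relu"),
    Dropout(0.2),
    Dense(10, activation="softmax"),                    # one output per digit
])
model.summary()
```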

I won't detail all the layers, but you can easily find the meaning of each one if you're not familiar with them.

We're almost at the end. We need to compile the model and train it!

And finally, the model is evaluated on the test data to look at its performance:
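The compile, fit, and evaluate steps can be sketched as follows; the hyperparameters are assumptions, and tiny random stand-in arrays are used here so the sketch runs quickly (in practice, use the MNIST arrays prepared above):

```python
import numpy as np
from keras.models import Sequential
from keras.layers import Dense
from keras.utils import to_categorical

# Tiny random stand-in data (replace with the real MNIST arrays)
x_train = np.random.rand(256, 784).astype("float32")
y_train = to_categorical(np.random.randint(0, 10, 256), 10)
x_test = np.random.rand(64, 784).astype("float32")
y_test = to_categorical(np.random.randint(0, 10, 64), 10)

model = Sequential([Dense(128, activation="relu", input_shape=(784,)),
                    Dense(10, activation="softmax")])

# Categorical cross-entropy matches the one-hot labels; track accuracy
model.compile(loss="categorical_crossentropy", optimizer="adam",
              metrics=["accuracy"])

history = model.fit(x_train, y_train, batch_size=64, epochs=2,
                    validation_data=(x_test, y_test), verbose=0)

# Evaluate on held-out data: returns [loss, accuracy]
score = model.evaluate(x_test, y_test, verbose=0)
print("Test loss:", score[0], "Test accuracy:", score[1])
```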

We can also plot the training history, i.e. the loss and accuracy for each epoch
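A plotting sketch with matplotlib; the history dictionary here is a fake stand-in for the record returned by model.fit, whose key names and values are assumptions:

```python
import matplotlib
matplotlib.use("Agg")            # headless backend, no display needed
import matplotlib.pyplot as plt

# Stand-in for history.history returned by model.fit (values are made up)
history = {"loss": [0.6, 0.3, 0.2], "acc": [0.80, 0.91, 0.94]}

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))
ax1.plot(history["loss"])
ax1.set_title("loss per epoch")
ax2.plot(history["acc"])
ax2.set_title("accuracy per epoch")
fig.savefig("training_history.png")
```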

You can find the sources on GitHub, and for more fun, you could run an instance on a high-end machine on Paperspace!