You’ve probably heard of Docker but haven’t had the time to learn or implement it - and you probably also want to start automating your builds, because who likes deploying code by hand? In this post, we’ll cover using Docker to deploy code with Bitbucket Pipelines. We’ll stay high level on what Docker is - just enough to use someone else’s Docker image or build your own, without installing Docker on your machine. We’ll then use that image to deploy our project’s source code with Bitbucket Pipelines.
What is Docker
The technical answer is that it’s a containerization technology which isolates environments, resources, and so on. For the rest of us who just want to use it for deploying code, the best way to describe it is as a lightweight virtual machine - it’s not a virtual machine, but that comparison helps get past the first mental block. Where it differs from a virtual machine is that it talks to the host kernel directly and shares its RAM, while keeping everything isolated. Additionally, Docker containers boot very quickly - the ones I’ve used take about a second or two - as opposed to the minutes a virtual machine can take.
Another difference is that Docker images can be scripted in ‘Dockerfiles’ and then added to source control. A Dockerfile can even reference another image to build off of. This means that if there is an image ‘close’ to what you want but missing a few things, you can simply reference that image in your new Dockerfile and inherit everything it has.
Here’s an example of a Dockerfile which is based on Ubuntu 16.04 and has wget installed:
```dockerfile
FROM ubuntu:16.04

# Update and install wget.
RUN apt-get update
RUN apt-get install -y wget
```
That’s it - you don’t need anything more than that to use the image - if that’s all that you need of course.
Building the image
There are a couple of ways to do this. If you have a Windows 10 Pro, Mac, or Linux machine, you could build it there after installing Docker. But if all you want to do is use the Dockerfile to deploy your code, that’s a lot more work - and I use a Chromebook for most of my off hours, so that doesn’t help me. Instead, open up either Bitbucket or GitHub in the browser and create a new repository. Then create a file named ‘Dockerfile’, paste in the code from the example, and commit it.
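If you do happen to have Docker installed locally, the build-it-yourself route mentioned above looks something like this - a quick sketch, assuming your Dockerfile is in the current directory and ‘base’ is the tag you want:

```shell
# Build the image from the Dockerfile in the current directory,
# tagging it "base" (the tag name is just an example).
docker build -t base .

# Start a throwaway container from it to confirm it works;
# --rm deletes the container once the command exits.
docker run --rm base echo "hello from the container"
```

Nothing here is required for the rest of the tutorial - Docker Hub will do the building for us.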
Now head over to Docker Hub and create an account, then a repository - for this tutorial, call it ‘base’. Then select Create Automated Build. It will prompt you to ‘link’ your Bitbucket or GitHub account to Docker Hub. This allows Docker Hub to trigger a build every time you commit. You can adjust the settings, but to keep things easy we’ll run with the defaults and select the Bitbucket or GitHub repository you created earlier.
That’s it - now when you commit to the Bitbucket or GitHub repository Docker Hub will start an automated build for you - all without Docker installed.
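If you ever want to double-check what Docker Hub produced, you can pull the image down on any machine that does have Docker - a quick sketch, where ‘yourUserName’ stands in for your actual Docker Hub username:

```shell
# Pull the automatically built image from Docker Hub.
docker pull yourUserName/base

# Run a container from it and confirm wget was installed.
docker run --rm yourUserName/base wget --version
```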
Extending an image
Let’s pretend you decide that a one-off project needs Hugo. Instead of installing it on your base image, you can simply create another image based on your prior one. This is done with the same steps as above - go ahead and call it ‘baseHugo’ for this tutorial.
```dockerfile
FROM yourUserName/base

# Download and install Hugo.
RUN wget https://github.com/spf13/hugo/releases/download/v0.31/hugo_0.31_Linux-64bit.deb
RUN dpkg -i hugo*.deb
```
The good news is you’re almost there. At this point you’ll probably want something to deploy the code to an FTP server, S3 bucket, or Firebase Hosting - but I’ll leave that as an exercise for the reader, based on your needs. I personally use Firebase, so my Docker image also has Node and the Firebase CLI tool installed.
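For the Firebase route, the image extension follows the same pattern as the Hugo one. Here’s a sketch of what that Dockerfile could look like - the NodeSource setup script and Node version are assumptions, so adjust them to whatever is current for you:

```dockerfile
FROM yourUserName/baseHugo

# Install Node.js via the NodeSource setup script
# (version 8.x is an assumption - pick the one you need).
RUN wget -qO- https://deb.nodesource.com/setup_8.x | bash - \
    && apt-get install -y nodejs

# Install the Firebase CLI globally through npm.
RUN npm install -g firebase-tools
```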
I use Bitbucket’s Pipelines offering for my code deployments - they give you 50 minutes a month, which is typically enough for my needs. One of the things Pipelines allows is using Docker images from Docker Hub. Just create a bitbucket-pipelines.yml file in your project’s Git repository, with your values in place of those below.
```yaml
image: yourUserName/baseHugo
pipelines:
  branches:
    master:
      - step:
          script:
            - hugo --baseURL https://www.YourURL.com/
            - firebase deploy --token [token goes here]
```
That’s it - every time you commit, your code will now deploy.
Keep in mind the firebase deploy step is unique to my needs and I have it as part of my image.
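One tip worth knowing: rather than committing the token into the YAML file, Bitbucket Pipelines lets you define repository variables (which can be marked secured) in the repository settings. A sketch of the same pipeline using a variable - the name FIREBASE_TOKEN is my own choice, not anything Firebase requires:

```yaml
image: yourUserName/baseHugo
pipelines:
  branches:
    master:
      - step:
          script:
            - hugo --baseURL https://www.YourURL.com/
            # FIREBASE_TOKEN is a secured repository variable
            # set in Bitbucket's Pipelines settings.
            - firebase deploy --token "$FIREBASE_TOKEN"
```

This keeps the secret out of source control, which matters once the repository has more than one pair of eyes on it.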
After this you should be able to build Docker images from Bitbucket or GitHub repositories and then use Bitbucket Pipelines to deploy your code. Remember, deploying is not the only thing you can do with Pipelines or Docker - it’s just the scope of this tutorial. Go ahead and experiment with your newfound Docker powers - you’ll be amazed at what you come up with.