Wednesday, October 26, 2016

Orchestrate The Cloud: Part 1 - Docker

With the recent surge of DevOps practices sweeping the tech world, we also ring in new and improved strategies for orchestrating our cloud. If cloud practices were cars, we'd be upgrading from an old diesel BMW to the Tesla D. As with any movement in the tech world, there is a boom of different services offering nearly identical features. In this 3 part series, however, I'm going to talk about the tools that I think best meet the needs of tomorrow's cloud. These articles will contain plenty of personal bias (which I'll do my best to point out in the moment) alongside logical and technical analysis. So buckle up and get ready for this 3 part series, "Orchestrate The Cloud"!

Part 1: Docker

Chances are, if you work in the cloud today, you've been hearing the hot new buzzword `Docker` for a while. Let's take a minute to analyze just WHAT Docker is and how it's changed the way we think about running services in the cloud, or anywhere for that matter.

"Docker containers wrap a piece of software in a complete filesystem that contains everything needed to run: code, runtime, system tools, system libraries – anything that can be installed on a server. This guarantees that the software will always run the same, regardless of its environment." – docker.com
As you can see from the quote above, Docker containers provide a great abstraction that pulls us away from the traditional cloud server construct. We are no longer concerned with installing web servers, interpreters, clients, and other services on the machine that hosts our container, which increases the flexibility with which we can run our service and migrate it between different environments or service providers. Docker containers also guarantee that code will run in an identical environment no matter where you install the container, which means no more errors when you deploy your code to a new server that's running a different version of Node.js/PHP/Python or whatever language you prefer. Everything your code needs to run will be contained in your pre-built Docker image and remain untainted by the host machine.
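To make that guarantee concrete, here's a quick sanity check you can try (assuming Docker is installed and can pull from Docker Hub): the runtime version is baked into the image, not the host, so the same image reports the same version on any machine.

```shell
# Run the official node 0.12 image and ask it for its node version.
# --rm removes the container once the command exits.
docker run --rm node:0.12 node --version
```

Run this on your laptop, a coworker's machine, or a cloud VM and the version printed comes from the image, not from whatever node (if any) is installed on the host.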

How do I even Docker?

Creating a Docker image is actually quite simple once you have Docker installed on your favorite Mac or PC. You simply create a file, generally called Dockerfile, and write the instructions that will be used to build your image into that Dockerfile.


Sample Dockerfile
#The FROM directive pulls in a base image to build on top of
FROM node:0.12

#The COPY directive lets you copy code from your host machine into the docker image
#(we don't need it here, since emvc bootstrap generates the server code for us)

#The RUN directive allows you to run shell commands inside of the image
RUN npm install -g express_mvc
RUN emvc bootstrap myserver

#The EXPOSE directive documents the port our server listens on inside the container
EXPOSE 3000

#The CMD directive sets the command that will run when the docker image is started
CMD node myserver/app

As you can see from the file above, we start from a Node.js base image, install the express_mvc package globally, and bootstrap a new server called myserver inside the image. The CMD instruction won't run until we start a container from the image. To build our Docker image we simply run the following command...

Docker Build Command
docker build -t myimage:latest .

That command will build an image named myimage and give it the tag latest (you can give an image as many tags as you like, which is great for versioning). Now that we have an image built, we can run it!
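Since a tag is just a named pointer to an image ID, adding a version tag next to latest is cheap (the tag names here are examples):

```shell
# Give the image we just built an additional version tag.
docker tag myimage:latest myimage:1.0

# List local images; both tags point at the same image ID.
docker images myimage
```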

Docker Run Command
docker run -d -p 80:3000 myimage:latest

Now that our Docker image is running, we should be able to access our web server on localhost port 80, the host port we mapped to the container's port 3000 in the run command. Having seen the power of Docker on its own, we're open to all the possibilities it creates for us to Orchestrate The Cloud....
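If you want to confirm the container is actually up, a few commands come in handy (these assume the run command above succeeded; substitute your real container ID):

```shell
# List running containers and note the container ID.
docker ps

# Follow the container's stdout/stderr.
docker logs -f <container-id>

# Hit the mapped host port from outside the container.
curl http://localhost:80/
```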

Stay tuned for Part 2 - Kubernetes, where I'll talk about deploying your Docker images to clusters in the cloud that can manage resources, uptime, and scaling, and report metrics and logs.