Wednesday, October 26, 2016

Orchestrate The Cloud: Part 1 - Docker

With the recent surge of DevOps practices sweeping the tech world, we also ring in new and improved strategies for orchestrating our cloud. If cloud practices were cars, we'd be upgrading from an old diesel BMW to the Tesla D. As with any movement in the tech world, there is a boom of different services offering nearly identical features. In this 3 part series I'm going to talk about the tools that I think best meet the needs of tomorrow's cloud. These articles will contain plenty of personal bias (which I'll do my best to point out in the moment) along with logical/technical analysis. So buckle up and get ready for this 3 part series, "Orchestrate The Cloud"!

Part 1: Docker

Chances are if you work in the cloud today you've been hearing the hot new buzzword `Docker` for a while. Let's take a minute to analyze just WHAT Docker is and how it's changed the way we think about running services in the cloud, or anywhere for that matter.

Docker containers wrap a piece of software in a complete filesystem that contains everything needed to run: code, runtime, system tools, system libraries – anything that can be installed on a server. This guarantees that the software will always run the same, regardless of its environment.
As you can see from the quote above, Docker containers provide a great abstraction that pulls us away from the traditional cloud server construct. We are no longer concerned with installing web servers, interpreters, clients and other services on the server that hosts our container, which increases the flexibility with which we can run our service and migrate it between environments or service providers. Docker containers also guarantee that code will run in an identical environment no matter where you install the container, which means no more errors when you deploy your code to a new server that's running a different version of node.js/php/python or whatever language you prefer. All of the services your code needs to run are contained in your pre-built Docker image and remain untainted by the host machine.

How do I even Docker?

Creating a Docker image is actually quite simple once you have Docker installed on your favorite Mac or PC. You simply create a file, generally called Dockerfile, and write the instructions that will be used to build your image into that Dockerfile.


Sample Dockerfile
#The FROM directive pulls in a base image to build on
FROM node:0.12

#The COPY directive allows you to copy code from your host machine into the docker image
COPY myfiles /srv/myfiles

#The RUN directive allows you to run shell commands inside of the image at build time
RUN npm install -g express_mvc
RUN emvc bootstrap myserver

#The CMD directive sets the command that will run when a container is started from the image
CMD node myserver/app

As you can see from the file above, we start from a node base image, copy our local myfiles dir into /srv inside the image, then use RUN to install express_mvc globally and bootstrap a server called myserver at build time. The CMD line isn't executed during the build; it only runs when we start a container from the image. To build our Docker image we simply run the following command...

Docker Build Command
docker build -t myimage:latest .

That command will build an image named myimage and give it the tag latest (you can give an image as many tags as you like, which is great for versioning). The trailing `.` tells Docker to use the current directory, where your Dockerfile lives, as the build context. Now that we have an image built, we can run it!

Docker Run Command
docker run -d -p 80:3000 myimage:latest
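
The -p 80:3000 flag maps port 80 on the host to port 3000 inside the container, so the second number has to match whatever port the app inside the image listens on. I'm assuming the bootstrapped myserver app listens on 3000 here; as a rough stand-in (this is not what emvc generates, just a sketch), a bare-bones node app on that port would look like this:

Minimal Container App (sketch)
// Listens on the container port (3000) that the run command maps to host port 80
var http = require('http');

http.createServer(function (req, res) {
  res.writeHead(200, { 'Content-Type': 'text/plain' });
  res.end('Hello from inside the container\n');
}).listen(3000);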

Now that our Docker image is running, we should be able to access our webserver on localhost port 80, which we mapped in the run command. Now that we've seen the power of Docker on its own, we're open to all the possibilities it creates for us to Orchestrate The Cloud....

Stay tuned for Part 2 - Kubernetes, where I'll talk about deploying your Docker images to clusters in the cloud that can manage resources, uptime, scaling and report metrics and logs.

Saturday, January 24, 2015

The best way to run your code online

I've been a huge enthusiast of the online code sandbox movement that has become so popular. It's a great way to share code with friends, answer questions on your favorite message board and test a quick hypothesis on the fly. However, this space has been dominated by apps that only cater to the front-end experience (HTML/CSS/JS) for too long.

Enter Runnable.

Runnable is the solution to many of your on the fly code testing and code sharing needs, all in the browser! They currently support the following languages/frameworks:

Not only do they provide these stacks in an easy-to-use manner for all you GUI users out there, but they even provide a terminal option that allows a vi junkie like myself to work in a comfortable environment.

Once you've written your code in the language of your choice, you are then free to compile/run your code and see the output instantly in your browser.

I highly suggest creating an account TODAY. You won't regret it. - Run your code online

Wednesday, November 12, 2014

Node.js Request Memory Leak

I have been a Node.js enthusiast for the past few years and have always loved its ability to spawn numerous requests in parallel. This ability has allowed me to write software that far exceeded my expectations of "fast".

As I've moved forward in my career, I've come to a place where I'm working with Big Data. These days, getting my feet wet consists of processing millions of records through these Node.js apps, oftentimes making hundreds of thousands of http requests in the process.

On a recent project I noticed that while looping through a 2.5 Million item XML feed I was seeing consistent memory growth (a memory leak, possibly), until the app would eventually run out of memory and start back at 1... X( This was very frustrating and a bit hard to debug. With the number of 3rd party modules in use, the leak could have been anywhere.

After hours of debugging I narrowed the issue to a new http request being made 1 time per record and decided to focus my energy on that. I then learned what the real problem was, and it's quite simple and easy to fix!

In Node.js the http requests use a connection pool managed by "http.Agent" with the following properties:
  1. agent.maxSockets
  2. agent.sockets
  3. agent.requests
agent.maxSockets sets how many connections per address can be open at once and defaults to 5

agent.sockets is an object that contains the currently used sockets

agent.requests is an object of requests in queue waiting for an open socket

So picture this... You're processing thousands of items per second and sending out http requests, but only 5 of those requests get assigned to a socket at a time; the rest pile up in your agent.requests object. Chances are very good that the agent.requests object is going to keep growing and growing... and growing, until it eventually consumes all the memory in your system. This was the case for me.
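
If you want to confirm this is what's happening in your own app, you can watch the queue grow. Here's a rough sketch that logs the number of queued requests on the default global agent every few seconds (the 5 second interval and the use of the global agent are my assumptions, adjust to match your setup):

Watch The Request Queue Grow (sketch)
var http = require('http');

setInterval(function () {
  // agent.requests is keyed by host; each value is an array of queued requests
  var queued = Object.keys(http.globalAgent.requests).reduce(function (sum, host) {
    return sum + http.globalAgent.requests[host].length;
  }, 0);
  console.log('queued http requests:', queued);
}, 5000);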

Have no fear! The fix is easy:

Simply create your own instance of http.Agent as follows:

New Custom Connection Pool
var http = require('http');

var pool = new http.Agent();
pool.maxSockets = 1000; // play with this number to adjust memory consumption for your app

then in your request options:

Add Pool to Request
var request = require('request');

request({
    url: '',      // your request url goes here
    pool: pool
}, callback);

That should do the trick!! Please share and comment. Let me know if this helped you!

Monday, May 26, 2014

Using Point of Sale Hardware in the Cloud

I recently started a new venture that required the use of a USB Barcode Scanner and Credit Card Reader on a website, with only Javascript to process them, and I found it's much easier to handle than one might think.

The first thing you need to understand is this: these USB devices act exactly like keyboards.

  • They scan your barcode/credit card
  • They transmit the scanned data via `keypress` events
  • Once the data is finished, a keyCode 13 (Enter) is sent

Using those rules above I started a simple, yet effective POS plugin built for jQuery, which you are free to use, upgrade for your own needs or contribute to.
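
To make those rules concrete, here's a rough sketch of the core idea (this is not the plugin itself, and the onScan callback is just a hypothetical placeholder): buffer each keypress until keyCode 13 arrives, then hand off the completed scan.

Keypress Buffering (sketch)
var buffer = '';

$(document).on('keypress', function (e) {
  if (e.which === 13) {                      // the device signals end of data
    if (buffer.length) onScan(buffer);
    buffer = '';
    e.preventDefault();                      // keep Enter from submitting forms
  } else {
    buffer += String.fromCharCode(e.which);  // accumulate the scanned characters
  }
});

function onScan(code) {
  console.log('Scanned:', code);             // swap in your own handling here
}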

Learn more on the GitHub page:

Known Issues:

  • Currently the plugin only supports devices which return a keyCode 13 at the end of data (EOD)
  • Currently the barcode scanning plugin by default only supports digit-only barcodes, but it accepts a Regular Expression option so you can override it to support other formats

Wednesday, April 9, 2014

OpenSSL Heartbleed Bug - Keep Calm Edition

I've been reading lots of articles about the heartbleed bug and even had to address the potential threat in my own systems. So, I'd like to clear up some misconceptions I have found floating around.

I've read articles claiming that this is a bug in all systems built before April 7th, 2014, which is not completely true. The bug only affects systems running one of the following OpenSSL versions:

  • OpenSSL 1.0.1 through 1.0.1f
  • OpenSSL 1.0.2-beta

You can check which version you are running with:
openssl version -a
If you are running one of the affected versions, you will want to update to the latest version of OpenSSL ASAP.

Read more here:

Thank You for reading,

Keep Calm and Carry On

Monday, January 27, 2014

SSL Crisis Aversion Made Easy - For Apache

You know what I like about you...? You're here reading this post with the hopes that whatever problem you are trying to solve can be alleviated with a little extra knowledge.

I wish I would have been as smart as you the first time I tried to install an SSL Cert on our server. Let me rewind real quick...

When my company's site was created the CTO and Senior Developer at the time installed our SSL Cert. Now fast forward to a year later... Both of them are gone and the task now falls to me. Now that you have some context on the situation you can appreciate how much it sucked.

I figured okay, I'll copy this new cert from GeoTrust and paste it into the old file and boom we'll be rocking secure again...right? WRONG!

Apparently there are these things called Intermediate Certificates, and guess what... those aren't your main certs. Basically I installed the Primary Intermediate Certificate as mysite.crt, then restarted apache and got a funky error that looked something like this:

Error: "Unable to configure RSA server private key" 
Eror: "mod_ssl: Init: ( Unable to configure RSA server private key (OpenSSL library error follows)" 
Error: "OpenSSL: error:0B080074:x509 certificate routines:X509_check_private_key:key values mismatch"

So now your website is down, because apache can't recover from that very gracefully. If this happens, make the quick save and run `sudo a2dismod ssl` followed by `sudo /etc/init.d/apache2 restart`.

At least now the site is up again...right? RIGHT! ...hopefully :-/

Then I found this little gem which made me realize how simple the solution was:

Basically, that link lets you paste the contents of your .key file into one textarea and your .crt file into another, and it tells you whether they're a match. It's super easy, even easier than using openssl on your server to check first (the openssl route being to compare the output of `openssl rsa -noout -modulus -in mysite.key | openssl md5` against `openssl x509 -noout -modulus -in mysite.crt | openssl md5`).

I found out that I was using the wrong .crt, and once I replaced it with the proper cert I was up and running in no time. Crisis averted! #downtimeisdeath #fixftw

I hope this little snippet helps all you other SSL newbs out there.

Please comment and share.

Also, stay tuned for my next article on setting up SSL for an nginx server with a reverse proxy to Apache.

Wednesday, October 30, 2013

Simple Bootstrap 3.0 Collapsable Panels (UPDATED)

I recently used a bootstrap panel for some features on a new project and really loved the way they look. However, I thought it would be even cooler as a collapsable feature, but I didn't like the number of DOM changes I needed to make for a panel to use .collapse() each time. So I started a bs_addons library, and here's the first feature for you!
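
To give you the flavor of what I mean, here's a rough sketch of the idea (this is not the bs_addons plugin itself, and the .panel-collapsable class is just a made-up marker): clicking a panel's heading toggles its body.

Collapsable Panel (sketch)
// Clicking the heading of any .panel-collapsable panel toggles its body
$(document).on('click', '.panel-collapsable > .panel-heading', function () {
  $(this).closest('.panel').find('.panel-body').slideToggle();
});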

To follow the project and/or contribute, head over to github:

Otherwise...Keep reading:
(OLD POST DEPRECATED) You can find updated info on using this plugin at: