Using Grunt to Auto-Restart node.js with File Watchers

Preprocessors have become a very important part of the development life cycle. In the past we would just write some HTML, JavaScript, and CSS with a backend and deploy a website. Now, for better speed, a nicer development experience, and more manageable output, we have a multitude of languages that compile into these standards, e.g. CoffeeScript -> JavaScript, LESS -> CSS, Jade -> HTML. Then there is JS, CSS, and HTML compression and minification after that.

SSH Keys and ssh-agent on Windows/Linux/Mac

Using SSH to securely connect to remote servers is well understood by anyone needing to do it. However, when you ask people to use a public/private key pair to connect, it starts to get a little murkier - mainly because the process is not well understood by some folks. This article breaks down the confusion and simplifies the process to a point where you'll wonder why you haven't been doing this all along.
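As a quick taste of the workflow, here is a minimal sketch (the key name, empty passphrase, and scratch directory are illustrative only - in practice, keep keys in ~/.ssh, use a real passphrase, and push the public key with ssh-copy-id user@host):

```shell
# Work in a scratch directory for this demo; real keys belong in ~/.ssh
cd "$(mktemp -d)"

# Generate an ed25519 key pair; -N "" skips the passphrase for demo purposes only
ssh-keygen -t ed25519 -f demo_key -N "" -q

# The .pub half is what gets appended to ~/.ssh/authorized_keys on the server
# (ssh-copy-id user@host does exactly that for you)
cat demo_key.pub

# Load the private key into ssh-agent so a passphrase is typed once per session
eval "$(ssh-agent -s)" > /dev/null
ssh-add demo_key 2> /dev/null
ssh-add -l
```

With the key loaded in the agent, subsequent logins to servers that hold the public key prompt for neither a password nor a repeated passphrase.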

Using Inner-Classed Enums with ColdFusion and Java

I recently had to call a method from Java that required an Enum type. In my case I did not have control over the java code and the Enums were stored inside a Class called Enums. It was not readily apparent to me on how to access these inner-classed Enums and there wasn't any specific documentation that I could find on the topic. I struggled a bit with trying to make a string work (the true end result), using JavaCast and trying to instantiate the class and Enum directly. Thankfully the final solution was actually easier than expected. Learn how I solved this problem!

Front-End Resource Roundup

Note: this is a list of resources I've used for clients in the past to help developers beginning to learn front-end tools get up to speed. Here is a list of resources you can use to become more familiar with HTML, CSS & JavaScript. The resources are categorized by the amount of time or difficulty required for adequate digestion of the information. The easiest/shortest is surfing: resources you can skim through in minutes as you need them. Next is snorkeling: resources better digested over a period of about an hour or more to really go through. Lastly, deep diving: resources that cover topics in depth over several hours or more (this doesn't mean you can't pick and choose chapters or sections as you need them, though!).

General Resources:

HTML (Structure):

CSS (Appearance):

JavaScript (Functionality): 

UX/Usability/Best Practices:

Some Tools:

Don't be intimidated by the vastness of the resources outlined here. We suggest for each topic you want to dive into, take a look, see what appeals to you, and jump in. Then, over time, periodically refer to the deep diving sections to further your skills. I hope this is helpful! Enjoy!

LESS: Resource Roundup

The resources in this post accompany a brief presentation (slides here). This list will grow over time as I add to it.

How to Compile LESS

There are various tools to compile your LESS for use in projects.


  • CodeKit ($25) - Please, just do it. If you're on a Mac, this app is for so much more than LESS.
  • LiveReload ($10) - My understanding is that it's quite similar to CodeKit, but not as full-featured.
  • LESS.app (FREE!) - By the same guy who does CodeKit; it eventually evolved into CodeKit.


  • Simpless (FREE!) - Simple tool for compiling that also allows minification. I used this when I was a Windows user.
  • WinLESS (FREE!) - I've also used this. It doesn't offer the simplicity of other tools, but has more robust error reporting.

Where to Learn More

Adding a Polygon Search to CFMongoDB

CFMongoDB rocks my socks...when I need to use ColdFusion. I noticed that CFMongoDB doesn't natively support a polygon search :( It's OK, though, because I wrote one, and as far as our team's testing is concerned, it works.

I added a method to DBCollection.cfc directly under the find method called findWithin_Poly. It follows.


function findWithin_Poly(string loc, array poly){
    var polyConverted = CreateObject("java","java.util.ArrayList").init(ArrayLen(poly));
    for ( var i = 1 ; i lte ArrayLen(poly) ; i++ ){
        // add each [x, y] point to the Java list
        polyConverted.add(poly[i]);
    }
    var polyObject = variables.mongoConfig.getMongoFactory()
                .getObject("com.mongodb.BasicDBObject").init("$polygon", polyConverted);
    var within = variables.mongoConfig.getMongoFactory()
                .getObject("com.mongodb.BasicDBObject").init("$within", polyObject);
    var query = variables.mongoConfig.getMongoFactory()
                .getObject("com.mongodb.BasicDBObject").init(loc, within);
    var search_results = collection.find(query);
    return createObject("component", "SearchResult").init( search_results, structNew(), mongoUtil );
}

As you can see, the process is very straightforward. I build up a BasicDBObject according to the Mongo Java API and call a slightly different find method on the Mongo collection.


The loc parameter is a string that represents the key of the document that holds the coordinates of the point. I used loc because that's what I saw all over the documentation, but the docs are clear that it doesn't need to be called loc.

The poly parameter is an array of arrays. Each element of the poly array is itself an array of exactly two elements: an x and a y. It represents a list of points which describe the boundaries of the polygon.


The format of the poly parameter is important - as is the data in the collection. CF will automatically make your array elements strings, but they need to be doubles - Java doubles.

To do this, cast them with javacast(). See the example below.


poly = ArrayNew(1);
polyA = ArrayNew(1);
polyA[1] = javacast("double",3);
polyA[2] = javacast("double",3);
polyB = ArrayNew(1);
polyB[1] = javacast("double",8);
polyB[2] = javacast("double",3);
polyC = ArrayNew(1);
polyC[1] = javacast("double",6);
polyC[2] = javacast("double",7);
poly[1] = polyA;
poly[2] = polyB;
poly[3] = polyC;

You would then use this to call the findWithin_Poly function.

hits = people.findWithin_Poly("loc",poly);

Now, I haven't done this yet, but I would recommend it: add a utility method that casts all of the elements to doubles. Simple, and it reduces clutter.


2 Important (Overlooked) Features of ColdFusion 9

Recently I was chatting with someone on our dev team and I realized that not everyone is as heads-down, buried in ColdFusion every day as I am. New features come and go (do they go?) and we latch on to what we can, when we need it. But depending on what you were doing back when CF9 released, you may have missed these. Yes, these are old features already (4 years), but real nice features just seem to slip through without being harnessed. These are (2) big-deal items I see overlooked that you really should use:

Local Variables in Functions

We all know about using the var keyword to declare local variables. If you don't already know, look it up. In CF8, we had to do all these var declarations at the top of the function. Pain in the ass. Someone kick the compiler guy/gal.

Leave it to our CF community to come up with a brilliant solution that is one of the more elegant "standards" I've seen evolve in the CF world (along with init() pseudo-constructors). All ya gotta do is add this code at the top of every function:

var local = {}; // structNew()? get hip, use the squirrelly brackets!

Then, in your function code when you need a local variable, just use the local. prefix.

What sucked? Well, you have to use the local. prefix on all your local variables, which is particularly lame with LCVs (loop control variables) like i, j, k, etc... Sometimes you'd see folks declare these as vars separately just so they could reference them using #i# or the like.

Overall, a nice solution. But that was pre-2009 (ahem, 4 YEARS AGO) - what's the gig with CF9?

First of all, in CF9 we can declare a var anywhere in the function. 'Bout time. The compiler guy/gal got the message. But that's only half the story.

Them Adobe folks also made an implicit local scope in all functions, called... (wait for it, wait for it... drumroll) local. Yea, they scarfed the "standard" and rolled it in. Nice job, Adobe (I don't say those words too often these days, so I had to throw it in). Well, anyway, what are the perks of this new scope? (2) things:

  1. You don't have to declare the local structure at the top of the function. It's already there (but if you do, it's backwards compatible, so no worries).
  2. You don't have to reference the local variables using the local. prefix. Once they are initially set in the local scope, you just reference them directly and they are found in the scope chain.


for (local.i = 1; i < 10; i++) {
    writeOutput(i); // works like a charm
}

Implicit Getters and Setters

Here's one I didn't figure out until I had to teach a class in ColdFusion 9 - ColdFusion allows you to add implicit getters and setters with one easy attribute on the <cfcomponent> tag: accessors="true"

Of course, Ben Nadel crawled behind the dashboard to check the wiring on this - pretty cool experiment he did, too. At the end of the day it's not a huge deal, but here's why I like it: it forces me to do things "the right way" and it makes it easy. That's pretty much what CF is all about.

Mike Pacella, one of our CF/Java ninjas, got me in the habit of using getters() and setters() a lot - it seems a bit uptight sometimes and, quite frankly, you don't have to do it for everything - but if you're talking about instance properties on an object, well, yeah, you probably should. But it's a hassle to write getters and it feels like a waste of time when those methods just get and set. I write enough code, I don't need to do more. Shut up, I know I can use snippets or generators, or whatever. I do. It's still a pain in the ass.

So here's the layup - you add accessors="true" to the <cfcomponent> tag and all your properties have getters and setters available by default. AH BUT WAIT - how the hell does the compiler know what the run-time instance properties will be? This is ColdFusion; we don't declare those things to our compiler. This is the other thing I love about this feature - we actually use <cfproperty>, and now it's a tag that's actually used by the compiler instead of just being a documentation scrap. I'm a bit of a documentation nut, so I like the idea that if I create this tag and thereby write documentation for my doc-engine parser, I'm also writing code that is doing something - in this case, defining properties for the implicit getters and setters. Putting this all together, here's a little CFC code you might see:

<cfcomponent accessors="true">
    <cfproperty name="Globals" type="struct" hint="the global variables, by reference." />

    <cffunction name="init" access="public" returntype="obj">
        <cfargument name="globals" required="no" default="#{}#" hint="reference to global variables" />
        <cfscript>
            setGlobals(globals);
        </cfscript>
        <cfreturn this>
    </cffunction>
</cfcomponent>

So that's it. If you wanna mess around with it, here's some code.

Unsetting Response Headers in an Apache Reverse Proxy Configuration When Serving PDF to Internet Explorer

Yuck. IE. Sometimes, we just need to suck it up and support IE even if it goes against everything we believe in. It's well known that IE has a problem downloading PDFs over HTTPS when certain cache control headers are sent in the response.

See Microsoft's own support site for this one:

The workaround for a developer is to make those headers disappear. There are several ways you can do this.

  1. In your application, don't set them.
  2. In your Web server, unset them.

Usually, the better approach is to configure your Web server to unset the headers site-wide, since the origin of the PDF behind your server is inconsequential. Removing them at the server level makes serving PDFs work for all browsers.

Unset Headers in Standard Apache Configuration

If you are using Apache in a standard way to front your application or site, you can identify PDF requests using the Files directive.

<Files ~ \.pdf$>
   Header unset Cache-Control
   Header unset Pragma
</Files>
You should place this in the <VirtualHost *:443> section since this is a problem related only to serving over HTTPS.

Unset Headers in Reverse Proxy Configuration

If you are using Apache as a reverse proxy and your PDFs are not on the filesystem relative to the site root, then you need to match the PDFs differently. In this case, you need to use a Location directive since the Files directive is used to match unproxied files.

<Location ~ \.pdf$>
   Header unset Cache-Control
   Header unset Pragma
</Location>

Again, this should be placed in the <VirtualHost *:443> section.

Convert MySQL Database Character Encoding to UTF8

Create a Database

To create a database that will default to UTF8 without modifying the server defaults, run the following command:

CREATE DATABASE dbName DEFAULT CHARACTER SET utf8 DEFAULT COLLATE utf8_general_ci;
Broken down, it means:

CREATE DATABASE dbName - create a database named dbName (best name, ever)

DEFAULT CHARACTER SET utf8 - this database will use utf8 as its default character encoding. All tables created will be configured to use utf8 for character columns.

DEFAULT COLLATE utf8_general_ci - when comparing characters, ignore case (ci = case insensitive)

Convert an Existing Database

If you already have a database in some other character set, such as latin1, or ISO 8859-1, you can easily convert it to UTF8 by walking through a few simple steps.

1. Export your database

$> mysqldump -u root -p \
       --complete-insert \
       --extended-insert \
       --default-character-set=utf8 \
       --single-transaction \
       --add-drop-table \
       --skip-set-charset \
       dbName > dump.sql

This will include column names in your insert statements, use efficient extended insert statements that don't repeat column names, export in UTF-8, skip setting the existing charset, and include drop table statements prior to create table statements.

2. Edit the Dump to Use UTF-8

cat dump.sql | \
  sed -e 's/latin1/utf8/g' \
  > dump-utf8.sql

This will just swap out latin1 for utf8. You can accomplish this however you want; I'm a fan of sed and awk and all things command line, but the goal is clear: make latin1 stuff utf8 stuff. Just be careful to qualify your replacements - you don't want to accidentally change data.
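To illustrate what "qualifying your replacements" buys you, here's a small sketch (the table and row below are made up): anchoring the pattern to CHARSET= converts the declaration while leaving row data that happens to contain "latin1" untouched.

```shell
cd "$(mktemp -d)"

# Made-up dump fragment: one schema line plus one data row mentioning "latin1"
cat > dump.sql <<'EOF'
CREATE TABLE t (c varchar(32)) ENGINE=InnoDB DEFAULT CHARSET=latin1;
INSERT INTO t VALUES ('latin1 is my favorite charset');
EOF

# Qualified replacement: only the charset declaration changes, not the data
sed 's/CHARSET=latin1/CHARSET=utf8/g' dump.sql > dump-utf8.sql

cat dump-utf8.sql
```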

3. Drop Your Database

From the mysql prompt, run

> drop database dbName;

4. Import UTF8 Version

mysql -u root -p < dump-utf8.sql


That's it. You could easily make a little shell script to do this, and when you need to convert a database from one encoding to another, just run it. Perhaps the command would look something like:

swap-encoding dbName fromEncoding toEncoding
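A sketch of what that script might look like, written as a shell function. The flag set mirrors the dump command above; the credential handling (a ~/.my.cnf so mysqldump and mysql don't prompt) and the CHARSET= replacement pattern are assumptions to adapt, not a drop-in tool:

```shell
# Hypothetical swap-encoding helper; usage: swap_encoding dbName latin1 utf8
swap_encoding() {
  db="$1"; from="$2"; to="$3"

  # 1. Export the database
  mysqldump --complete-insert --extended-insert \
            --default-character-set="$to" \
            --single-transaction --add-drop-table --skip-set-charset \
            "$db" > "$db.sql"

  # 2. Edit the dump; qualify the replacement so row data stays untouched
  sed "s/CHARSET=$from/CHARSET=$to/g" "$db.sql" > "$db-$to.sql"

  # 3. Drop and recreate the database with the new default encoding
  mysql -e "DROP DATABASE IF EXISTS $db; CREATE DATABASE $db DEFAULT CHARACTER SET $to;"

  # 4. Import the converted dump
  mysql "$db" < "$db-$to.sql"
}
```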

Google Discontinues Free Version of Google Apps for Business

On Thursday (12/06/12), Google announced changes to their Google Apps for Business offering. They will no longer offer a free version of Google Apps for Business. As of 12/06/12, any company that wants to use Google Apps will need to sign up for the paid version, which costs $50 per user, per year. If you are an existing Google Apps customer who is taking advantage of the free version, nothing changes. You are essentially grandfathered in and will not be charged the $50 per user, per year fee. Google is offering existing users an upgrade path to Google Apps for Business at the rate of $5 per user, per month.

I should also note that Google Apps for Education is still available as a free service for schools and universities.

Troy Web is an official Google Apps Reseller. If you need assistance setting up Google Apps for Business, let us know. We can help with everything from simple account setup to full-scale migration from your existing service provider.

How to Swap Out a Time Warner Modem With Your Own Faster One

If you live in an area serviced by Time Warner Cable and you subscribe to high-speed Internet access, you've recently been mailed a notice with the infamous announcement of their plan to begin charging a "leasing" fee of $3.95/month for your cable modem. The mailing includes a way to avoid the fee by buying your own cable modem, but with a caveat that doing so may leave you in the dust when they "update" their network and your modem no longer supports their new network technology. What they fail to mention is that over time their equipment fails to give you the best experience and fastest speeds they have to offer, and that you need to figure this out yourself and swap out your own equipment. When was the last time you ran some speed tests and researched Time Warner's current network offerings [in your area] so you could decide if it was time to swap out your equipment?

There are several points to consider when making this decision, so I'll help lay it out for you.


The cost of doing nothing is known outright: $3.95/month X 12 months = $47.40/year. Assuming they never ever increase that cost, you're looking at about $50/year. I could shovel my driveway or I could have it plowed for about $50 so you really need to ask yourself, what it is that you value more. That being said, once I buy the modem, I never have to buy it again. Once I shovel my driveway, I'll need to shovel it again :) So let's look at this a little differently. I've had Time Warner for 13 years now. If I was paying [an avoidable fee] all along, I would have donated $616.20 to Time Warner Cable by now.

Time Warner supplies you with a list of approved modems, and the prices range from around $50 to a little over $100. However, I found the SB6121 for $53 online even though it retails at $109. A little sweat equity goes a long way sometimes.

How "Tech-Savvy" Are You?

The process of doing all this is remarkably simple. I have to hand it to Time Warner on this one. I went through the process and the hardest part of all was picking out the modem - and even that was pretty simple after I understood what I needed.

Once you've selected your modem (see below), all you have to do is plug it in and move the Ethernet cable from your old one to the new one. Then, call Time Warner (1-866-321-2225) and say you need to activate a new modem. Trust me, they're expecting your call. They know exactly what you need. They'll ask one important question: "What's the MAC address of your modem?" The MAC (Media Access Control) address is the way Time Warner authorizes your device to be on their network. To get this number, turn your modem over. It's plastered right next to the serial number, model number, and all the other FCC stuff. If you run into any problems, Time Warner can and will help you through it.


Unless you subscribed to Time Warner high-speed Internet in the last few months, it will almost certainly increase your bandwidth and, consequently, your download speed - by a lot. Time Warner is in the process of upgrading their network to use the DOCSIS 3.0 protocol, and without going into any detail, DOCSIS 3.0 is faster than DOCSIS 2.0, which is what most Time Warner Cable modems handle. This boils down to you getting a faster download speed simply by using a modem that can handle it, without paying more. Yes, you could bring your current modem back and get one that has the DOCSIS 3.0 firmware on it, but you'd still be paying $3.95/month for it.

There are a lot of variables that determine network speed, such as the day and the time of day, but a DOCSIS 2.0 modem (most Time Warner modems) without turbo should give you about 10-15 Mb/second download and very close to 1 Mb/second upload. With turbo, you're looking at around 2 Mb/second upload.

When I swapped out their 2.0 modem with my own 3.0 modem, I saw my download speed jump from 15 Mb/second to 32 Mb/second! This test has been repeated dozens of times with similar results. Never has my download speed dropped below 29 Mb/second.

Other Considerations

I don't want to get political, but it's worth a mention that the new policy of leasing the modem has the potential to be discontinued down the road. They're already being sued over it. "Send customers confusing notice of the fee in a junk mail postcard they'll throw in the garbage, sock them with a $500 million dollar a year rate hike, then announce on your website that customer satisfaction is your No. 1 priority. That's some way to deliver satisfaction." That was said by a lawyer initiating a class action lawsuit on behalf of New York and New Jersey clients.

For me, I see this as a way to take a little more control over my Internet access, increase my speed, and avoid a little monthly fee. It's not much, but then again, buying a modem and calling Time Warner really doesn't take all that much time.

What Model?

If you haven't done so, check out the list of modems they allow and what features they offer. I don't need wireless access from my modem, but you might. DOCSIS 3.0 was important to me, and it should be to you too. Multiple ports were not important to me because I have a switch that goes to two routers. Spend ~$60 now or spend $600 over the next 13 years. The selection process really isn't all that complicated so long as you're realistic about your own needs.


Scheduling EC2 Backups with Skeddly (automated EBS snapshots)

A recent topic of conversation in the office has been backups. Three of us have experienced catastrophic hardware failures on our local development machines (i.e. our laptops). Thankfully, we are all obsessive about backups so we all got back up and running in no time. If you aren’t backing up your local system, then you need to read Sean’s excellent post on Selecting Cloud Backup Software. But what about your servers? And more specifically, what if you’re running an Amazon EC2 instance?

Like all things AWS, Amazon has many options for creating backups. If you set up your EC2 instance to use Elastic Block Storage (EBS), you can simply create a snapshot of your volume from the AWS Console. These EBS snapshots are incremental backups that persist in Amazon’s S3. Incremental means that only the blocks that have changed since your last snapshot are saved. This is all really slick, but manually creating snapshots from the AWS Console isn’t a good solution if your goal is to have daily, hourly, or whatever snapshots.

You could create your own service using command line tools. For example:

ec2-create-snapshot vol-id --description "Daily Backup"
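To actually run that on a schedule, cron is the obvious vehicle. A sketch - the volume id below is a placeholder, and the EC2 API tools plus AWS credentials would need to be available in cron's environment:

```shell
# A daily 2:00 AM snapshot entry for crontab; vol-1a2b3c4d is a placeholder
line='0 2 * * * ec2-create-snapshot vol-1a2b3c4d --description "Daily Backup"'

# Preview it; install with: ( crontab -l; echo "$line" ) | crontab -
echo "$line"
```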

Or, you could use Amazon’s API. For example:

?Action=CreateSnapshot
&VolumeId=volume-id
&AUTHPARAMS

The above options are really useful, but it takes a bit of time and fiddling to get everything right. The easiest solution I have found for scheduling automated snapshots is a service called Skeddly. Skeddly can do more than automate snapshots, but that’s what we’re going to look at in this post.

Using Skeddly

Sign Up and Create a Skeddly Specific AWS User

As of writing this post, you can get a 30-Day Trial of Skeddly, so go there now and sign up.

Before you sign up, I would suggest creating an access key specifically for Skeddly. You do this by creating a user from the AWS Console under Identity and Access Management (IAM). I created a user called skeddly with the following policy:

{
  "Statement": [
    {
      "Action": [
        "ec2:CreateSnapshot",
        "ec2:DeleteSnapshot",
        "ec2:CreateTags",
        "ec2:DescribeInstances",
        "ec2:DescribeSnapshots",
        "ec2:DescribeTags"
      ],
      "Effect": "Allow",
      "Resource": "*"
    }
  ]
}

If you need assistance creating a policy, you can use the AWS Policy Generator.

Add an Instance

After logging into Skeddly, select the Managed Instances tab and Add Instance. This is the easiest way to create your automated snapshots. You will need to know the following before you can fill out the form:

Instance ID: Get this from your AWS Console under EC2. You will see an instance column that displays your instance IDs. Your instance ID should look something like i-1a2b3c4d.

Elastic IP: If you want Skeddly to Stop/Start your instance to ensure the snapshot is complete you will need to supply the Elastic IP address associated with your instance. This is optional, but recommended.

Access Key: This is the IAM user I suggested creating above.

After supplying the necessary instance information you can jump down to Backup Instance and Delete Backups.

Create your schedule.

Note there are macros you can use to name your snapshot. I use something like


I keep a week’s worth of snapshots. And because my instance has three EBS volumes, I set the Minimum to Keep at 21. That’s 3 volumes x 7 days = 21.


I think the pricing is incredibly reasonable. It costs $0.15 to create or delete a snapshot of each volume. That means I’m only spending $0.90/day to create three snapshots and delete three snapshots.

Scratching the Surface

Skeddly can do so much more. Once you get started, you may find yourself scheduling all sorts of tasks. Need to back up your RDS... Skeddly can do that. Create an AMI... let Skeddly handle it. Make a nice dinner... Skeddly can’t do that, but with all the time you’re saving, why not put on your chef’s hat and prepare some grub.