JavaScript and Unicode


JavaScript works well enough with Unicode. It even supports Unicode characters in identifiers, which is interesting to say the least. I sometimes use things like the look of disapproval when I have to write a horrible hack that needs to be undone at a later time.

var ಠ_ಠ = "Look of disapproval";


There is great potential for evil in doing this, however, like finding characters that look like JavaScript symbols.

var ʔ̣,ǃ;


Okay, so the question mark is a stretch, but the exclamation point? Looks pretty damn good, right? Just don't use them in real code that needs to be maintained 😎

Msngr hit 5.0, gains LTS plus future developments

Anyone who knows me knows I can't shut up about msngr. I love it, but then again I wrote it, so of course I'm going to love it. It's a great eventing library for JavaScript, based on a pubsub development pattern, with little competition (the competition that exists is kinda lackluster in my opinion, but I'm biased so of course I'm going to say that).

So I want to take it further. I want it to kick even more ass than it is kicking today. So here are four announcements about it.

The 5.0 release has dropped!

How did you not notice that 5.0 is out!? The APIs are largely the same except validation, which took a drastic turn from how it worked prior. Now it's far more intuitive: you simply call a type check and it returns a boolean (you can even do .there and .empty, and it works on multiple types!). 5.0 also saw some major refactoring to aid maintenance and future extensibility, including a new middleware capability and far, far more predictable and reliable merge() and copy() methods.
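For a sense of how a fluent, boolean-returning validator like that can be built, here's a minimal sketch in plain JavaScript. This is my own illustration, not msngr's actual source; the property names beyond .there and .empty are assumptions.

```javascript
// Sketch of a fluent type checker: each property is a getter returning a boolean.
function check(value) {
    return {
        get string() { return typeof value === "string"; },
        get number() { return typeof value === "number" && !isNaN(value); },
        // .there: the value exists (is neither undefined nor null)
        get there() { return value !== undefined && value !== null; },
        // .empty: the value exists but has no meaningful content
        get empty() {
            return value === "" ||
                (Array.isArray(value) && value.length === 0) ||
                (value !== null && typeof value === "object" &&
                    !Array.isArray(value) && Object.keys(value).length === 0);
        }
    };
}

console.log(check("hi").string); // true
console.log(check(undefined).there); // false
console.log(check([]).empty); // true
```

Using getters keeps the call site terse: no trailing parentheses, just a readable chain of property accesses.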

Give it a shot!


It has a logo now! It's very simple, but that works considering it's a very simple and straightforward library (gzipped it's only 5.8kb, which should go down further in a future update).

The msngr.js logo

Long Term Support (LTS)

A major shortcoming of msngr prior to 5.0 was long term support. Each major version saw API breakages, almost no bug fixes to prior versions, etc. Starting with 5.0, LTS is baked into the process. Each major version released will include 3 years of LTS (so bug and security fixes will land in 5.0 even in 2019). This also means fewer major version releases, as the API is fairly solidified at this point. The messaging API hasn't changed much in over a year and the rest of the APIs have slowly gravitated to a fairly optimal form.

I'm going to work on mostly fixing issues and extending msngr in non-breaking ways to keep major versions to a minimum (something important in a depended upon project, in my opinion). New major versions will likely still come out once every 12-18 months, a far slower pace than before, so don't worry I won't hold back new and exciting developments!

Msngr will continue to target the same environments: node 0.10 and higher up to the current latest (all tests are run against all major versions of node) as well as modern web browsers. Msngr is written with minimal reliance on the newest browser APIs, so it should run correctly even in IE7, though I have no desire to do any testing on anything below IE11.

Where'd the DOM support go and what's an optional framework?

I decided I couldn't hold up the 5.0 release any further and the browser specific features had to go as they had not yet been re-worked into my ultimate goal: creating an optional framework.

What the heck is an optional framework? Consider Angular, React and Ember. They are highly popular frameworks, but what if you want to write a component that works in all three? What if you decide one framework is more optimal than another? Now you either have to hook directly into the HTML, which can be awkward within the confines of a framework, or re-write the component to work correctly with whatever framework you're transitioning to. This is no good.

I haven't finalized how it will work from a library delivery aspect, but msngr will aim to become an optional framework. Write a component with msngr and, as long as it understands how your current framework works, it can hook into it or simply hook directly into the DOM.

The optional framework is in a very piecemeal, rough prototype stage, but expect it to drop by the end of 2016. It will be able to hook into the existing msngr APIs (so it won't even require a new major version to support!).

Why are you not using the node cluster API?

I'm in love with the simplicity and awesomeness of the node clustering API so I just had to get some words down about it.

What is the cluster API?

JavaScript is single threaded but makes heavy use of asynchronous development patterns to actually get stuff done. This is great, but at some point you're going to run into a road block where you need to handle multiple things at once (like incoming HTTP requests). This is where the cluster API comes in.

It's useful to imagine the cluster API as a way to run multiple instances of your node.js application with a master instance that controls all of them.


I find it easier to explain such things through code, so let's just look at an example of using the cluster API to run a server with a dynamic number of instances.

(function () {
    "use strict";
    var cluster = require("cluster");
    var http = require("http");

    // Amount of workers to use; defaults to 1 but can be specified via the CLI
    var workers = parseInt(process.argv[2], 10) || 1;

    // If we're the master, create as many workers as specified and
    // handle restarting failed workers.
    if (cluster.isMaster) {
        var forks = [];
        for (var i = 0; i < workers; ++i) {
            console.log("Initializing worker " + i);
            forks.push(cluster.fork());
        }

        // When a worker exits this is likely abnormal; restart it and log it
        cluster.on("exit", function (worker, code, signal) {
            for (var j = 0; j < forks.length; ++j) {
                if (forks[j] === worker) {
                    console.log("Worker " + j + " has failed. Attempting to restart.");
                    forks[j] = cluster.fork();
                }
            }
        });
    } else {
        // Start the HTTP server in each worker
        http.createServer(function (req, res) {
            res.end("Handled by worker " + process.pid);
        }).listen(3000);
    }
}());

The above code is pretty simple, isn't it? It essentially replaces the need to bring in a third party library just to run multiple instances and keep your server alive (though it has no real fault tolerance around forking, so it would be best to look at adding that in).

First it looks at the passed-in arguments, so when you run the script with an argument of 5 it starts with 5 workers; otherwise it defaults to 1. Then it checks whether it's the master: the master forks the workers and handles restarts, while each worker starts the web server.

What about a shared state or session?

Each worker created by fork() is a separate process, so there is no sharing of state or really any objects between them. So if possible it's best to make each instance stateless. There is a way to handle messaging between each worker and the master, however. The cluster API provides simplistic sending and receiving of messages, as shown in its documentation:

if (cluster.isMaster) {
  var worker = cluster.fork();
  worker.send('hi there');
} else if (cluster.isWorker) {
  process.on('message', (msg) => {
    process.send(msg); // echo the message back to the master
  });
}

As you can see it's pretty easy to send messages back and forth, but I wouldn't use it to handle any type of shared state. If you need a shared cache or state, use something like redis, memcached, etc.

Start today!

Naturally my explanation and code are highly simplified, but they show you don't need much to get started. Take a look at the node cluster API documentation for a more in-depth look and start using it today!

Mache! Mache!

I love saying "mache"; it just sounds so cool.

Anyway, what the hell is mache? So anyone reading my site probably knows about msngr and it just so happens that I recently released msngr 4.0.0 which includes something called msngr.mache().

The basics

Mache is a caching solution for JavaScript that works on the server or web browser. Unlike typical caching systems this one is a merge cache (merge + cache = mache!). When key value pairs are set, instead of replacing old values, they're merged with any existing pairs. Let's just jump to a demonstration. [Run on JSFiddle]

var mache = msngr.mache();
mache.set("config", {
    host: "localhost",
    port: 3001
});

console.log(mache.get("config")); // prints {host: "localhost", port: 3001}

mache.set("config", {
    host: ""
});

console.log(mache.get("config")); // prints {host: "", port: 3001}

Neat, right? In fact msngr.config() is now backed by mache and uses all of the same APIs.
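If you're curious how a merge cache works under the hood, here's a minimal sketch in plain JavaScript. This is my own illustration, not msngr's actual implementation, and the data is hypothetical.

```javascript
// A tiny merge cache: set() deep-merges the new value into the existing one
// instead of replacing it.
function createMache() {
    var store = {};

    function merge(target, source) {
        for (var key in source) {
            if (source[key] !== null && typeof source[key] === "object" &&
                    !Array.isArray(source[key])) {
                // Recurse into nested objects so partial updates don't clobber siblings
                target[key] = merge(target[key] || {}, source[key]);
            } else {
                target[key] = source[key];
            }
        }
        return target;
    }

    return {
        set: function (key, value) {
            store[key] = merge(store[key] || {}, value);
            return store[key];
        },
        get: function (key) {
            return store[key];
        }
    };
}

var configCache = createMache();
configCache.set("config", { host: "localhost", port: 3001 });
configCache.set("config", { host: "example.com" });
console.log(configCache.get("config")); // { host: "example.com", port: 3001 }
```

The second set() only supplies host, yet port survives — that's the entire appeal over a plain key-value replace.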


Merging is handy but that's not all mache can do. Mache also supports reverting merges. [Run on JSFiddle]

var mache = msngr.mache();
mache.set("UserProfile", {
    name: "Kris",
    age: 30
});

mache.set("UserProfile", {
    age: 29
});

mache.revert("UserProfile"); // revert the most recent merge

console.log(mache.get("UserProfile")); // prints {name: "Kris", age: 30}

Revert too many times and the value goes back into nonexistence. Mache, by default, keeps 3 revisions of any piece of data. This can be changed based on optional parameters passed into the creation of a mache instance (full API reference).
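Here's a rough sketch of how revision-backed reverting can work, assuming a simple bounded history. It's an illustration rather than mache's actual source, and it uses a shallow merge for brevity.

```javascript
// Keep a bounded history of merged snapshots per key; revert() pops the
// latest snapshot, restoring the previous revision.
function createRevisionCache(limit) {
    limit = limit || 3; // default of 3 revisions, as described above
    var revisions = {};

    return {
        set: function (key, value) {
            var history = revisions[key] = revisions[key] || [];
            var latest = history.length > 0 ? history[history.length - 1] : {};
            var next = Object.assign({}, latest, value); // shallow merge for brevity
            history.push(next);
            if (history.length > limit) {
                history.shift(); // drop the oldest revision
            }
            return next;
        },
        get: function (key) {
            var history = revisions[key];
            return (history && history.length > 0) ? history[history.length - 1] : undefined;
        },
        revert: function (key) {
            var history = revisions[key];
            if (history && history.length > 0) {
                history.pop();
            }
            return this.get(key);
        }
    };
}

var profileCache = createRevisionCache();
profileCache.set("UserProfile", { name: "Kris", age: 30 });
profileCache.set("UserProfile", { age: 29 });
console.log(profileCache.revert("UserProfile")); // { name: "Kris", age: 30 }
```

Storing full snapshots (rather than diffs) keeps revert trivial, at the cost of memory — which is why bounding the history matters.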


Okay so we have merging and we have reverting but transactions are even more fun! This is my favorite part! [Run on JSFiddle]

var mache = msngr.mache({ emitChanges: true });
msngr("msngr.mache", "change", "UserProfile").on(function (payload) {
    console.log(payload.oldValue); // prints undefined
    console.log(payload.newValue); // prints {name: "Kris"}
});

mache.begin(); // start a transaction
mache.set("UserProfile", { name: "Jack" });
mache.set("UserProfile", { age: 97 });
mache.rollback(); // undo everything since begin()

mache.begin();
mache.set("UserProfile", { name: "Kris" });
mache.commit(); // commit the change, which emits it

Yeah. That just happened. So we did several things here that I want to break down.

First, we created a new mache instance that specified we wanted to emitChanges. This means all committed changes are emitted via msngr where you are given a copy of the old and new values.

Second, we called .begin() to start a transaction, .rollback() to undo all actions to a mache instance, then later called .begin() and .commit() to start another transaction, make a change and commit it.

Cool, right!? Well I think so.

Digging it

There is one more thing I want to show and that's digging properties out, safely, from cached data. What the hell does that mean? Let me show you. [Run on JSFiddle]

var mache = msngr.mache();
mache.set("config", {
    my: {
        crazy: {
            deep: {
                value: 42
            }
        }
    }
});

console.log(mache.getDeep("config", "my.crazy.deep.value")); // prints 42
console.log(mache.getDeep("config", "my.crazy.deep.other.value", 10)); // prints 10

So .getDeep() provides a way to get at a property's value without constant checks for undefined and / or null. Even handier is that you can specify a default value so if any part of the property's path doesn't exist then the default value will be returned. Handy!
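The core of a safe deep lookup is small enough to sketch in plain JavaScript (again, my own illustration rather than mache's actual source):

```javascript
// Walk a dot-delimited path; bail out with the default as soon as any
// segment is missing.
function getDeep(obj, path, defaultValue) {
    var parts = path.split(".");
    var current = obj;
    for (var i = 0; i < parts.length; ++i) {
        if (current === undefined || current === null) {
            return defaultValue;
        }
        current = current[parts[i]];
    }
    return current === undefined ? defaultValue : current;
}

var config = { my: { crazy: { deep: { value: 42 } } } };
console.log(getDeep(config, "my.crazy.deep.value")); // 42
console.log(getDeep(config, "my.crazy.deep.other.value", 10)); // 10
```

It's the same idea optional chaining (`a?.b?.c`) later standardized, with a built-in fallback value on top.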

Thanks for reading

If you got this far then you are awesome! Thanks for checking out something I did. Think it's cool or horrible? Please let me know! I love praise and criticism (how else does anyone learn?). I hope someone finds this as useful as I do.

An absolutist view on encryption

President Obama at SxSW

It has come to my attention that President Obama took a pit stop at the SxSW festival. He talked about private industry's role, among various other topics, but the most interesting part, to me, was the following:

“We have engaged the tech community aggressively, and what my conclusion is so far is that you cannot take an absolutist view on this,” he said. “If your argument is strong encryption no matter what … that I think does not strike the balance that we have lived with for 200, 300 years, and it’s fetishizing our phones above every other value. That can’t be the right answer.”

To someone without an understanding of encryption this sounds well intended and open to discussion. It's welcoming in an "I don't know the answers but let's come to a compromise" way. This is a great view to have on many topics, especially as a politician. It is, however, an unrealistic view once you understand encryption.

The absolutist view

Unfortunately there is no way to actually come to a balanced solution, technically speaking. You can support encryption in earnest, without compromise or you can compromise it. It's surprisingly binary when compared to most other political topics. There are two views here so let's break them down.

Uncompromising encryption

The side that your typical technology company is on is full encryption with proven algorithms and no alternative ways of decryption. This means an encrypted communications channel between your web browser and, say, Amazon cannot be decrypted mid-stream. Anything your browser tells Amazon, and vice versa, can only be seen by those two parties.

This is how it works today. Sure, with computational power not realistically available in 2016 you could brute force your way into an encrypted stream, one stream at a time, but the amount of power and time required simply isn't feasible.

Granting access to a third party

The other side of the coin is that the government wants to be able to authorize legal requests for information which requires a third party's access to the encrypted data. They've had the ability to do that for over 200 years with letters sent through the mail system and, eventually, phone calls both of which typically require a legal request (aka warrant or, more recently, national security letter).

Granting access to encrypted data to a third party is usually proposed as an escrow key held by only specific government entities. There are suggestions of creating multiple escrows so one agency couldn't access the data alone. But are escrow keys even possible?

Turns out escrow keys are very possible and are even used in many organizations today for handling data that multiple users should have access to. However, I'm honestly not sure how you do this without exposing the third party key. Take GnuPG or PGP for example: you encrypt the data with a randomly generated key, which is then encrypted once for each key that should be able to access the data, sometimes even using a combination of keys (so more than one is required to unlock the key to the data). The trouble with this avenue is that the third party key(s) would be exposed to whoever does the encryption.

There is only one answer here

Unlike the statement President Obama made during SxSW, this is a rare political issue that is very binary. An escrow key allowing third party access opens up multiple avenues for those key(s) to fall into the wrong hands (whether through a rogue government employee, a poor implementation of the encryption algorithm itself or another nation conducting cyber attacks), and once one of these avenues is exploited it's game over. National security would be severely crippled. I can't stress enough how fucked our national security would be if using this type of encryption were required and an enemy state gained access to escrow key(s) or a means of generating them.

There is a very, very good reason the NSA sees encryption as vital to our national security. And the door is already open: even if the United States voted, tomorrow, that all devices must use a form of encryption the government can access through an escrowed key system, it would still ultimately not matter for the "bad guys". Encryption software has become so easy to write and use that any terrorist cell could grab any one of thousands of open source encryption packages and wire up its own communication protocol with no compromises in it.

We still need to be sympathetic

This topic, while binary once understood, is still politicized and has become emotionally charged. It has even diverged into other territories with the San Bernardino shooting, where the FBI obtained a court order to compel Apple to write a custom version of iOS that runs only in memory and bypasses the protection that wipes the device after too many incorrect passcode guesses. I don't want this conversation about encryption to become muddled with the San Bernardino case; they're two very different things, and that one, while important, is outside the scope of this critical topic.

Ultimately a large portion of folks in the justice system want to be able to do their job, to access a phone when court ordered and to catch the bad guys. As responsible citizens of the tech community we shouldn't immediately dismiss the wants and hopes of those who do not understand the nuances of encryption. Instead we need to educate. Explain how encryption works and why it has to work that way. Explain and maybe even sit down and show lawmakers how to write code that can take an off-the-shelf open source encryption library and wire it up to create your own, secure tunnel. It's not important that they understand the syntax that you are writing but that you can demonstrate how easy it would be to ignore any laws created governing encryption.

Many times when I don't understand something my immediate, snap reaction tends to stick. It takes time, patience and education to overcome such inherent biases. It's important not to expect results overnight but to empower our legislature to understand how critical the issue of encryption is.

+ SSL = Done

As a software engineer you'd think I would have started using SSL a long time ago. I mean what software engineer, who works on web applications half the time, doesn't even use SSL on their portfolio site!?

I'm digressing (as usual). This is just a small entry to mention that this site now uses SSL and will redirect you to SSL if you arrive without it. You are now guaranteed to be getting the correct content delivered to your web browser from my website.


msngr 4.0.0 is out! Speed, tweaks and mache!

I just published 4.0. This version brings performance improvements (about 20%! Oh yeah!), a redone ID system (it no longer returns a UUID, but msngr.uuid() now does) and a new cache object called mache.

What's msngr.mache()?
Mache is a merge cache. Data stored in it is merged with existing data instead of simply replacing it. It also supports revisions (allowing you to revert data to previous states) and even transactions! It also now powers the singleton msngr.config() used for configuring msngr and other items.

What else changed?
Repeating a change log here would be boring so read all the details here.

Stupid JavaScript tricks: multiline strings!

Multiline strings in JavaScript are not a thing you say? Impossible you say? Well here's a stupid JavaScript trick :)

var content = function () {
    /*
    This is a multiline comment.
    Can this be turned into a string?
    Surely it can not, right?
    What's wrong with you, JavaScript!?
    */
};

var funcStr = content.toString();
var commentStart = funcStr.indexOf("/*");
var commentEnd = funcStr.lastIndexOf("*/");
alert(funcStr.substring(commentStart + 2, commentEnd));

What shows up in the alert? Well, see for yourself!

Now I wouldn't suggest actually doing it this way, but it's a stupid JavaScript trick, right? So why does it work? Why would comments be included in the content of a function, and why is there even a function-to-string conversion? Well, it's actually part of the ECMAScript standard and it can be quite useful, even if a bit hacky! You'll typically find string representations of functions used as object keys in eventing systems (in fact, msngr does this!), which facilitates any number of handlers per event that can be removed by simply passing in the function to remove.
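A tiny sketch of that pattern (my own illustration, not msngr's actual source): store handlers keyed by their string form so a specific one can later be removed by passing the same function back in.

```javascript
// An emitter that keys handlers by fn.toString(), enabling removal by reference
function createEmitter() {
    var handlers = {};
    return {
        on: function (fn) { handlers[fn.toString()] = fn; },
        off: function (fn) { delete handlers[fn.toString()]; },
        emit: function (data) {
            Object.keys(handlers).forEach(function (key) {
                handlers[key](data);
            });
        }
    };
}

var emitter = createEmitter();
var seen = [];
var record = function (msg) { seen.push(msg); };

emitter.on(record);
emitter.emit("first");
emitter.off(record); // removed by passing the same function back in
emitter.emit("second");
console.log(seen); // ["first"]
```

One caveat: two distinct functions with identical source stringify to the same key, so this trick trades a little correctness for a lot of convenience.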

Using msngr.config for application configuration

Now that msngr.js 3.0 is out, it comes with a bunch of cool new utilities that I'm going to cover in a series of posts. Today let's talk about msngr.config().

So as part of setting up configuration for every node.js project I create, I always establish a set of default configurations followed by a configuration file for each environment. I then take those files and merge them together, with the environment file winning. Seems simple enough, right? Except I've rewritten this logic at least a dozen times.
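That defaults-plus-environment merge boils down to something like this in plain JavaScript (the configuration values are hypothetical):

```javascript
// Hypothetical default and per-environment configurations
var defaults = { host: "localhost", port: 3000, gzip: true };
var production = { host: "0.0.0.0", gzip: false };

// Shallow merge where the environment file wins on conflicts
var config = Object.assign({}, defaults, production);
console.log(config); // { host: "0.0.0.0", port: 3000, gzip: false }
```

Simple, but once configs nest you need a deep merge and a place to stash the result — which is exactly the boilerplate I keep rewriting.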

While msngr.config() is used internally to handle pseudo constants that can be altered by developers when necessary, it can also be used to facilitate the type of configuration setup I describe above. How? Let's dive into some code.

msngr.config("server", {
    host: "localhost",
    port: "3000",
    gzip: true
});
Okay so now we have a simple server configuration. Now what? Well let's consider the above our default configuration. So now let's apply a specific environment's configuration!

msngr.config("server", {
    host: ""
});

Now the new host will overwrite "localhost" but port and gzip will remain the same!

console.log(msngr.config("server").host); // Prints ""
console.log(msngr.config("server").port); // Prints "3000"
console.log(msngr.config("server").gzip); // Prints true

Simple, right? The best part is that the way msngr.js caches data makes this configuration available to everything in the same node.js process after simply requiring the msngr module.

How to make web requests in JavaScript using the same code across environments

When I set out to work on msngr.js 3.0 there were a few projects I wanted to integrate with to make msngr.js really awesome. Unfortunately the integration points for many projects (Bakula being one example) were http / https based. Guess what JavaScript sucks at doing consistently across environments? Yeah, you guessed it.

So how do we make http / https calls, using the same code, regardless of whether we're inside node.js or a web browser? Why, I created an abstraction just for this in msngr.js (you're surprised, right?)! Let's take a look at a simple GET request for some JSON.

var net ="http://localhost:3000");
net.get({
    path: "/"
}, function(err, result) {
    console.log(result);
});

The net object provides a very, very simple way of executing the usual HTTP verbs (GET, POST, PUT, DELETE and OPTIONS). I could have worked to provide a very complete net object but, to be honest, that would have been a pain in the ass. Instead, it provides a handy way to conduct requests for JSON and / or plain text (so no multipart support).

So if we wanted to query a GET endpoint with a series of query strings it might look something like this:

var net ="http://localhost:3000");
net.get({
    path: "/search",
    query: {
        term: "search topic",
        sort: "DESC"
    }
}, function(err, result) {
    console.log(result);
});

What about data creation / sending?

var net ="http://localhost:3000");{
    path: "/users",
    payload: {
        username: "kris",
        email: "[email protected]"
    }
}, function(err, result) {
    console.log(result);
});

That's essentially it! The method will accept protocol, host and port as separate pieces or all together, and you can even omit some parameters and it'll still do its best to figure out what you fucked up.

Now that msngr.js has this capability look for updates in the near future where it integrates with multiple systems plus future support for more protocols (web sockets and unix sockets) and long polling in standard http / https requests.