colorful diffs for javascript objects with difflet [ 2012-02-22 14:30:32 UTC ]

Introducing difflet, a handy node.js module for computing pretty object diffs!

Just plug in some initial options and the objects you want to compare!

var difflet = require('difflet');

var s = difflet({ indent : 2, comment : true }).compare(
    { z : [ 6, 7 ], a : 'abcdefgh', b : [ 31, 'xxt' ] },
    { x : 5, a : 'abdcefg', b : [ 51, 'xxs' ] }
);
console.log(s);

and you'll get a colored and annotated object expressing the differences between the 2 objects you passed in.

Deleted elements between the first and second object are shown in red and commented if comments are turned on. New objects show up in green and updated objects show up in blue. The comments show what the previous value was, if any.

You can set a bunch of options to adjust the formatting, including comma-first output.
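For instance, something like this should give you comma-first formatting (I'm going from memory on the comma option here, so treat it as an assumption and check the difflet readme):

var difflet = require('difflet');

// comma : 'first' is assumed from the difflet readme, not verified here
var s = difflet({ indent : 2, comma : 'first' }).compare(
    { a : [ 1, 2, 3 ] },
    { a : [ 1, 2, 4 ], b : 5 }
);
console.log(s);

You can even generate HTML output: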

var difflet = require('difflet');
var ent = require('ent');

var tags = {
    inserted : '<span>',
    updated : '<span>',
    deleted : '<span>',
};

var diff = difflet({
    start : function (t, s) { s.write(tags[t]); },
    stop : function (t, s) { s.write('</span>'); },
    write : function (buf, s) { s.write(ent.encode(buf)) },
});

var prev = {
    yy : 6,
    zz : 5,
    a : [ 1, 2, 3 ],
    fn : function qq () {}
};
var next = {
    a : [ 1, 2, 3, [4], "z", /beep/, new Buffer([0,1,2]) ],
    fn : 'I <3 robots',
    b : [ 5, 6, 7 ]
};

diff(prev, next).pipe(process.stdout);

which generates some HTML output that you can stuff into a browser:

$ node example/html.js
{&quot;a&quot;:[1,2,3,<span>[4]</span>,<span>&quot;z&quot;</span>,<span>/beep/</span>,<span>&lt;Buffer 00 01 02&gt;</span>],&quot;fn&quot;:<span>&quot;I &lt;3 robots&quot;</span>,<span>&quot;b&quot;:[5,6,7]</span>,<span>&quot;yy&quot;:6,&quot;zz&quot;:5</span>}

Plus, pkrumins and I just rolled out difflet output for testling to make debugging t.deepEqual() statements on big objects easier.

node-tap is now using difflet too in 0.2.1!

Check out the code on github or with npm do:

npm install difflet

semver your services with seaport [ 2012-02-09 11:05:34 UTC ]

Pushing out incremental changes to a service-oriented cluster can be tricky, especially when changes span multiple services. Introducing seaport, a service registry written in node.js based on semvers.

With seaport, services are brought up with a name@version string and other processes can connect to services that match a name@semver pattern.

First spin up a new seaport server on a port:

$ seaport 9090 --secret='beep boop'

Here's an example seaport service that registers a new service web@1.2.3:

var seaport = require('seaport');
var ports = seaport.connect('localhost', 9090, { secret : 'beep boop' });

var http = require('http');
var server = http.createServer(function (req, res) {
    res.end('beep boop\r\n');
});

ports.service('web@1.2.3', function (port, ready) {
    server.listen(port, ready);
});

and here's some code that connects to the web@1.2.3 service:

var seaport = require('seaport');
var ports = seaport.connect('localhost', 9090, { secret : 'beep boop' });

var request = require('request');

ports.get('web@1.2.x', function (ps) {
    var u = 'http://' + ps[0].host + ':' + ps[0].port;
    request(u).pipe(process.stdout);
});

Running the last script prints out beep boop from the web@1.2.3 http server.
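Assuming the two scripts above are saved as server.js and client.js (the file names here are just for illustration), a session looks something like:

$ seaport 9090 --secret='beep boop' &
$ node server.js &
$ node client.js
beep boop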

You can create seaport servers programmatically. This is useful for integrating seaport with an http host router. Here's an example using bouncy:

var seaport = require('seaport');
var ports = seaport.createServer().listen(5001);

var bouncy = require('bouncy');
bouncy(function (req, bounce) {
    var domains = (req.headers.host || '').split('.');
    var service = 'http@' + ({
        unstable : '0.1.x',
        stable : '0.0.x',
    }[domains[0]] || '0.0.x');

    var ps = ports.query(service);
    if (ps.length === 0) {
        var res = bounce.respond();
        res.end('service not available\n');
    }
    else {
        bounce(ps[0].host, ps[0].port);
    }
}).listen(5000);

We can then spin up http services for the semvers 0.1.x and 0.0.x:

var seaport = require('seaport');
var ports = seaport.connect('localhost', 5001);

var http = require('http');
var server = http.createServer(function (req, res) {
    res.end('version 0.0.0\r\n');
});

ports.service('http@0.0.0', function (port, ready) {
    server.listen(port, ready);
});

and the other server...

var seaport = require('seaport');
var ports = seaport.connect('localhost', 5001);

var http = require('http');
var server = http.createServer(function (req, res) {
    res.end('version 0.1.0\r\n');
});

ports.service('http@0.1.0', function (port, ready) {
    server.listen(port, ready);
});

Now we can dynamically route to semvers based on the http host header!

$ curl -H host:stable.localhost localhost:5000
version 0.0.0
$ curl -H host:unstable.localhost localhost:5000
version 0.1.0

These http servers could themselves have dependencies on other services in the cluster. By using seaport, new code can be pushed out that spans multiple servers with very explicit backwards compatibility, tolerating and even encouraging multiple versions of services to satisfy dependencies. The easier it is to push out crazy new experiments, the more often it will happen!

Seaport is one component of a hypothetical set of cluster commands that would make continuous deployment for clusters super simple.

Check out the seaport source on github or npm install seaport!

roll your own test runner for testling [ 2012-01-23 11:23:13 UTC ]

Testling is a browserling product that pkrumins and I put together to make running automated browser tests super easy.

Tests usually look something like this:

var test = require('testling');

test('json parse', function (t) {
    t.deepEqual(
        Object.keys({ a : 1, b : 2 }),
        [ 'a', 'b' ]
    );
    t.end();
});

Then you can run the tests on all the browsers we run using a curl one-liner like this one:

curl -sSNT test.js -u mail@substack.net \
    'testling.com/?browsers=chrome/16.0,firefox/9.0,safari/5.1,ie/9.0,ie/7.0'

and when the code blows up (in IE7 in this case because it doesn't have Object.keys), you get a full stack trace!

$ curl -sSNT test.js -u mail@substack.net \
    'testling.com/?browsers=chrome/16.0,firefox/9.0,safari/5.1,ie/9.0,ie/7.0'
Enter host password for user 'mail@substack.net':
Bundling...  done

chrome/16.0     1/1  100 % ok
firefox/9.0     1/1  100 % ok
safari/5.1      1/1  100 % ok
iexplore/9.0    1/1  100 % ok
iexplore/7.0    0/1    0 % ok
  Error: [object Error]
    at [anonymous]() in /test.js : line: 4, column: 5
    at keys() in /test.js : line: 5, column: 9
    at [anonymous]() in /test.js : line: 3, column: 29
    at test() in /test.js : line: 3, column: 1

  > t.deepEqual(
  >     Object.keys({ a : 1, b : 2 }),
  >     [ 'a', 'b' ]
  > );

total           4/5   80 % ok

Wow super great! Except perhaps you don't like how the standard test API looks and want something more jasmine-esque and bdd-ish.

Just write a little wrapper like this:

var testling = require('testling');

module.exports = function describe (dname, cb) {
    if (typeof dname === 'function') {
        cb = dname;
        dname = undefined;
    }

    var ix = 0;
    cb(function it (iname, cb) {
        var name = (dname ? dname + ' :: ' : '')
            + (iname || 'test #' + ix++);

        testling(name, function (t) {
            var waiting = false;
            t.wait = function () { waiting = true };

            cb(t);
            if (!waiting) t.end();
        });
    });
};

And now you can write your tests like so, require()ing the bdd wrapper node.js-style:

var describe = require('./bdd');

describe('arrays', function (it) {
    it('should map', function (t) {
        t.deepEqual(
            [ 97, 98, 99 ].map(function (c) {
                return String.fromCharCode(c)
            }),
            [ 'a', 'b', 'c' ]
        );
    });

    it('can do indexOf', function (t) {
        t.equal([ 'a', 'b', 'c', 'd' ].indexOf('c'), 2);
    });
});

describe('tests', function (it) {
    it('can wait', function (t) {
        t.wait();

        var start = Date.now();
        setTimeout(function () {
            var elapsed = Date.now() - start;
            t.log(elapsed);
            t.ok(elapsed >= 100);
            t.end();
        }, 100);
    });
});

Now you have 2 files but you can still use a one-liner to stitch everything together:

$ tar -cf- bdd.js test.js | curl -sSNT- -u mail@substack.net \
    'testling.com/?browsers=chrome/16.0,firefox/9.0,iexplore/9.0'
Enter host password for user 'mail@substack.net':
Bundling...  done

chrome/16.0     3/3  100 % ok
  Log: 101
firefox/9.0     3/3  100 % ok
  Log: 115
iexplore/9.0    2/3   66 % ok
  Log: 83
  Assertion Error: not ok
    at [anonymous]() in /test.js : line: 24, column: 13
    at ok() in /test.js : line: 24, column: 13
    at [anonymous]() in /test.js : line: 21, column: 29
    at setTimeout() in /test.js : line: 21, column: 9
    at [anonymous]() in /test.js : line: 17, column: 29
    at it() in /test.js : line: 17, column: 5
    at [anonymous]() in /test.js : line: 16, column: 28

  > t.ok(elapsed >= 100);

total           8/9   88 % ok

...and strangely enough, IE9 only sleeps for 83 milliseconds when you tell it to sleep for 100! TYPICAL.

Check out the testling documentation, create a browserling account to use with testling, and hack up some crazy browser tests and test runners!

the node.js aesthetic [ 2011-11-30 13:51:21 UTC ]

I would like to document an emerging set of programming conventions, philosophies, and values that I see evolving in the node.js community. I call this the node aesthetic.

callback austerity

The very first example of node you are likely to see is on the node.js home page. This snippet is exemplary of the radical simplicity pioneered by projects like sinatra.

var http = require('http');
http.createServer(function (req, res) {
    res.writeHead(200, { 'Content-Type': 'text/plain' });
    res.end('Hello World\n');
}).listen(1337, "127.0.0.1");
console.log('Server running at http://127.0.0.1:1337/');

Instead of the http server being an external service that we configure to run our code, it becomes just another tool in our arsenal. Want to spin up a second http server on a different port? Super easy: just call http.createServer() once more. Want to print out how many requests per second you're averaging every minute? Just plop down a counter and a setInterval(). Node makes what was previously merely configurable into something more programmable, orthogonal, and more powerful.
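For example, here's a rough sketch of that requests-per-second idea (my own illustration, not code from the node docs):

var http = require('http');

var requests = 0;
http.createServer(function (req, res) {
    requests ++; // intermediate state is just a variable in scope
    res.writeHead(200, { 'Content-Type' : 'text/plain' });
    res.end('Hello World\n');
}).listen(1337, '127.0.0.1');

// once a minute, report the average requests per second and reset the counter
setInterval(function () {
    console.log((requests / 60).toFixed(2) + ' requests/sec');
    requests = 0;
}, 60 * 1000);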

A big part of what empowers node to make these kinds of interfaces possible is its asynchronous nature. No longer do you have statelessness imposed from on high by the likes of apache or rails. You can keep intermediate state around in memory just like in any ordinary program. You don't need to reason about multiple instruction pointers or mutexes or re-entrant, interruptible execution because all the javascript you write just lives in a single thread.

Depending on the event system to already just be there means that the end-users of modules don't have to think about what hoops they need to jump through in order to get the event reactor up, running, and plugged into the component they're trying to use. The event machinery is all running quietly under the hood without getting in the way or making a fuss.

limited surface area

Note also that the http snippet from earlier isn't inheriting from an http.Server base class or anything of the sort. The primary focus of most node modules is on using, not extending. In classical object-oriented design, it's customary to define specific custom functionality by extending more abstract classes. This idea flows pretty naturally from thinking about how you might write a compiler to efficiently pack collections of related properties into memory blocks, but it is not such a great approach for writing usable and, perhaps more importantly, re-usable interfaces.

Simple function calls with callback arguments have the superficial but important benefit of having less exposed surface area and more encapsulation than most classical designs. But moreover, when you extend a base class, your class is somewhat stuck with the interface and internals that the base class uses. If you want to make things prettier you'll need to shave some yaks to write wrapper classes just to get around the restrictions of the class system. Sometimes this classical style of reusability is appropriate, but in node it is the exception, not the rule, because the implicit consensus is that the interface is paramount, so usability should trump extensibility.

A big part of what makes node modules so great is how they tend to have really obvious entry points as a consequence of focusing on usability and limited surface area. Many modules on npm export only a single function by assigning onto module.exports, so that require('modulename') returns just that single function of interest.

It also helps that module names have a direct correspondence to the string that you put in your call to require() and that require() just returns the module instead of modifying the environment to include it. This is known in some circles as doing a qualified import. In node you can't even do any other kind of import without getting really hackish about it, and this is a very good thing. This import style means that when you fire up the REPL to play with a new module, require('modulename') shows you exactly what functionality 'modulename' provides. Even more importantly, it means that when you read through some new foreign piece of code that uses many modules, you can tell exactly where all the module exports came from.
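A tiny made-up module shows the single-export pattern:

// uppercaser.js - a hypothetical module exporting a single function
module.exports = function (s) {
    return String(s).toUpperCase();
};

// app.js - the qualified import: the module is whatever require() returns
var uppercase = require('./uppercaser');
console.log(uppercase('beep')); // prints BEEP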

batteries not included

If software ecosystems are like economies then core libraries are at best government bureaucracies and at worst nationalized state corporations. That is, modules in the core distribution get an unfair advantage over the libraries in userspace by virtue of availability and prominence. Worse still, because these modules are all maintained in the core project, contributing, experimenting, and iterating is much harder than with a typical library on npm. This is compounded by the fact that core libraries are pegged to the core project version and don't often have independent versioning.

Core distributions with too many modules result in neglected code that can't make meaningful changes without breaking everything. If a neglected library lives in userspace then at least people can ignore it more easily and the independent versioning makes necessary breaking changes easier to mitigate for legacy code.

The "batteries included" approach may have struck a worthwhile tradeoff back when package managers were primitive and modules were global, but that advantage evaporates in the face of baked-in concurrent library versioning and sophisticated package management.

radical reusability

While the limited surface area approach can hurt extensibility, you can win a great deal of extensibility back by breaking up problems into lots of tiny modules. When you write a module for general enough consumption to put up on npm, you can't contaminate it with too many application-specific details. It's also much easier to test and iterate on code in complete isolation from your application business logic. Node and npm go to great lengths to help you do this.

In node, the only modules that can be said to be "global" are the ones compiled into the binary itself. There is not really such a thing as a global, system-level module anymore. Instead, when you run a script that performs a require(), node searches for the module in "node_modules" directories, starting from the directory that the script is in and walking up through the parent directories toward "/", stopping at the nearest "node_modules" that contains the module. Similarly, when you do an npm install, npm places the modules into the local "./node_modules" directory.

All of this preferred locality in the module loader has a very important practical upshot: you can depend on different versions of modules in different places in your program's dependency graph and node will pick the most local dependency for each module. So if a module "foo" was tested against and depends on "baz@0.1.x" and a module "bar" depends on "baz@0.2.x", then when you go to use both "foo" and "bar" in your own program, "foo" and "bar" will each use the version of "baz" that they were tested against!
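To make that concrete, here's what a hypothetical dependency tree could look like on disk, with each copy of "baz" living next to the module that depends on it:

myapp/
    node_modules/
        foo/
            node_modules/
                baz/        <- a 0.1.x version, satisfying foo's dependency
        bar/
            node_modules/
                baz/        <- a 0.2.x version, satisfying bar's dependency

A require('baz') from inside "foo" resolves to the 0.1.x copy, while the same call from inside "bar" gets 0.2.x, and neither module needs to know the other copy exists.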

This approach to versioning means that nearly all modules should be interoperable with each other. It also means that module authors can be more aggressive with iterating and pushing out backwards-incompatible breaking changes to their APIs, so long as the authors are careful to increment the module version appropriately. Concurrent versioning and locality preference drastically accelerate the rate that we can iterate on new experiments and the levels of reuse that we can attain both in a single large project and among projects.

the future

I am wildly optimistic about where this emerging aesthetic of radical reusability and module-driven development will take node and programming in general.


bounce HTTP requests around with bouncy [ 2011-10-11 09:13:30 UTC ]

Introducing bouncy, a websocket and https-capable http router proxy / load balancer in node.js!

It's super simple to use, just call bounce()!

var bouncy = require('bouncy');

bouncy(function (req, bounce) {
    if (req.headers.host === 'bouncy.example.com') {
        bounce(8000);
    }
    else if (req.headers.host === 'trampoline.example.com') {
        bounce(8001);
    }
}).listen(80);

You can bounce() to a port, a host and port, or a stream you open yourself.
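For instance, opening the upstream connection yourself might look something like this (a sketch based on my reading of the bouncy readme's stream form, so double-check it there):

var bouncy = require('bouncy');
var net = require('net');

bouncy(function (req, bounce) {
    // open the tcp connection ourselves and hand the stream to bounce()
    var stream = net.connect(8000, 'upstream.example.com');
    bounce(stream);
}).listen(80);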

Since bouncy is just parsing the http headers and sending along the raw tcp stream, you can use websockets on whatever you bounce() to without writing any special code!

Bouncy uses node's delicious http parsing innards, so the req object you get is a bona-fide http.ServerRequest with all the fixins.

Plus, bouncy comes with a simple command-line tool if you have a static routing table kicking around in a json file. Just throw a routes.json like this one:

{ "beep.example.com" : 8000, "boop.example.com" : 8001 }

at the bouncy command and give it a port to listen on:

bouncy routes.json 80

Super easy! Check out the code on github or with npm do:

npm install bouncy

to install the library or

npm install -g bouncy

to install the command-line tool.
