Node.js and the Javascript Age


Three months ago, we decided to tear down the framework we were using for our dashboard, Python’s Django, and rebuild it entirely in server-side Javascript, using node.js. (If there is ever a time in a start-up’s life to remodel parts of your infrastructure, it’s early on, when your range of motion is highest.)

This decision was driven by a realization: the LAMP stack is dead. In the two decades since the web’s birth, there have been fundamental shifts in its make-up of content, protocols, servers, and clients. Together, these mark three ages of the web:

**I. 1991-1999: The HTML Age.**

The HTML Age was about documents, true to Tim Berners-Lee’s original vision of a “big, virtual documentation system in the sky.” The web was dominated by static, hand-coded files, which web clients crudely formatted (with defaults that offend even the mildest of typophiles). Static documents were served to static clients.

**II. 2000-2009: The LAMP Age.**

The LAMP Age was about databases. Rather than documents, the dominant web stacks were LAMP or LAMP-like. Whether CGI, PHP, Ruby on Rails, or Django, the dominant pattern was populating an HTML template with database values. Content was dynamic server-side, but still static client-side.

**III. 2010-??: The Javascript Age.**

The Javascript age is about event streams. Modern web pages are not pages, they are event-driven applications through which information moves. The core content vessel of the web — the document object model — still exists, but not as HTML markup. The DOM is an in-memory, efficiently-encoded data structure generated by Javascript.
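
To make the shift concrete, here is a minimal sketch (the ticker markup and ids are invented for illustration, not taken from our dashboard): the same content that an HTML Age page would ship as markup can be built in memory as DOM nodes by client-side Javascript.

```javascript
// HTML Age: the content arrives as markup.
// <ul id="tickers"><li>AAPL: 350.13</li></ul>

// Javascript Age: the client builds the same structure in memory.
var list = document.createElement('ul');
list.id = 'tickers';

var item = document.createElement('li');
item.appendChild(document.createTextNode('AAPL: 350.13'));
list.appendChild(item);

document.body.appendChild(list);
```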

LAMP architectures are dead because few web applications want to ship full payloads of markup to the client in response to a small event; they want to update just a fragment of the DOM, using Javascript. AJAX achieved this, but when your server-side LAMP templates are 10% HTML and 90% Javascript, it’s clear that you’re doing it wrong.
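
A rough sketch of the fragment-update pattern (the /quote endpoint and the element id are hypothetical): a small JSON payload crosses the wire, and only one node in the DOM changes.

```javascript
// Ask the server for a small JSON payload instead of a whole page.
var xhr = new XMLHttpRequest();
xhr.open('GET', '/quote?symbol=AAPL', true);

xhr.onreadystatechange = function () {
  if (xhr.readyState === 4 && xhr.status === 200) {
    var quote = JSON.parse(xhr.responseText);
    // Update a single fragment of the DOM, not the whole document.
    document.getElementById('price').innerHTML = quote.price;
  }
};

xhr.send();
```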

To recognize this means shifting our view of the server from a document courier (HTML Age), or a template renderer (LAMP Age), to a function and data shipper. The principal role of the server is to ship an application to the client (Javascript), along with data (JSON), and let the client weave those into a DOM.
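
As a minimal node.js sketch of that “function and data shipper” role (the routes and file names here are illustrative): one handler ships the application script, another ships JSON, and a third ships only a skeletal document that bootstraps the client.

```javascript
var http = require('http');
var fs = require('fs');

http.createServer(function (req, res) {
  if (req.url === '/app.js') {
    // Ship the application: Javascript the client will execute.
    res.writeHead(200, { 'Content-Type': 'application/javascript' });
    fs.createReadStream(__dirname + '/app.js').pipe(res);
  } else if (req.url === '/data') {
    // Ship the data: JSON the client will weave into the DOM.
    res.writeHead(200, { 'Content-Type': 'application/json' });
    res.end(JSON.stringify({ symbol: 'AAPL', price: 350.13 }));
  } else {
    // Ship a skeletal document whose only job is to load the application.
    res.writeHead(200, { 'Content-Type': 'text/html' });
    res.end('<!doctype html><body><script src="/app.js"></script></body>');
  }
}).listen(8080);
```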

The secondary role of the server is to listen in on a stream for events (a new edit, a message, or ticker change) and efficiently push responses back to clients.
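
One way to sketch this second role is long-polling over plain HTTP (the /events route and the simulated ticker below are invented for illustration): the server parks each client’s connection and responds only when an event arrives.

```javascript
var http = require('http');

var waitingClients = [];

http.createServer(function (req, res) {
  if (req.url === '/events') {
    // Park the response; it will be answered when an event arrives.
    waitingClients.push(res);
  } else {
    res.writeHead(404);
    res.end();
  }
}).listen(8081);

// Push an event (a new edit, a message, a ticker change) to every
// client that is currently waiting, then start collecting again.
function broadcast(event) {
  var payload = JSON.stringify(event);
  waitingClients.forEach(function (res) {
    res.writeHead(200, { 'Content-Type': 'application/json' });
    res.end(payload);
  });
  waitingClients = [];
}

// Simulate a ticker change every few seconds.
setInterval(function () {
  broadcast({ type: 'ticker', symbol: 'AAPL', price: 350 + Math.random() });
}, 5000);
```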

For both of these roles, node.js is an ideal serving architecture. Since we’re currying Javascript functions on the server side, we ought to write the server in Javascript. We can then shift computation from server to client with little impedance (for example, we no longer need to maintain two copies of our custom string-formatting libraries).
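
A hedged example of that “little impedance” point, using a made-up formatting helper rather than our actual library: the same file can be loaded by node.js through exports and by the browser through a global, so there is only one copy to maintain.

```javascript
// format.js -- one formatting library, usable on both sides of the wire.
(function (exports) {

  // Format a number as a currency string, e.g. 1234.5 -> "$1,234.50".
  exports.currency = function (n) {
    return '$' + n.toFixed(2).replace(/\B(?=(\d{3})+(?!\d))/g, ',');
  };

})(typeof exports !== 'undefined' ? exports : (this.format = {}));

// On the server:  var format = require('./format'); format.currency(1234.5);
// In the browser: <script src="/format.js"></script> then format.currency(1234.5);
```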

With regard to responding to event streams, node.js shines. Its asynchronous, non-blocking architecture makes it incredibly fast. It speaks HTTP 1.1 with persistent (keep-alive) connections, and a single server can hold thousands of connections open at once.
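
A small sketch of why the non-blocking model matters (the routes and the delay are invented): a slow operation does not tie up the process, so other requests keep being answered by the same single thread while it is pending.

```javascript
var http = require('http');

http.createServer(function (req, res) {
  if (req.url === '/slow') {
    // Simulate a slow upstream call; the event loop stays free meanwhile.
    setTimeout(function () {
      res.end('slow response\n');
    }, 5000);
  } else {
    // Fast requests are answered immediately, even while /slow is pending.
    res.end('fast response\n');
  }
}).listen(8082);
```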

Finally, it’s worth considering that events are simply packets of data, and the emerging lingua franca of data on the web is JSON. This is what client-side applications receive when a ticker moves, or a message arrives. This is, again, a native format for node.js.
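
Because the payload is already JSON, moving an event between the wire and a live Javascript object takes one call in each direction (the event shape below is invented for illustration):

```javascript
// An event arrives off the wire as a JSON string...
var raw = '{"type":"ticker","symbol":"AAPL","price":350.13}';

// ...and becomes a native Javascript object, with no mapping layer.
var event = JSON.parse(raw);
console.log(event.symbol, event.price);

// Going the other way is just as direct.
var reply = JSON.stringify({ type: 'ack', received: event.type });
```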

The Javascript age brings us closer to a web that is not a global digital library, but a global digital nervous system, whose implications we are only beginning to apprehend.


Posted on Friday, April 8th, 2011 at 5:39 pm. Filed under technology.

27 Responses to “Node.js and the Javascript Age”

  1. jorjun Says:
    April 8th, 2011 at 10:48 pm

    I for one welcome our javascript overlords. But seriously, at the current rate of change, I think JS will be obsoleted within 2 years by a more productive means of encoding valuable IP. Watch your back, the young ones are coming thru, and Java means nothing to them – they weren’t around in the late 1990s. Javascript is a silly name for a silly language. With its curly brackets, geek fudgery, and intensely annoying artefacts, to old-skoolers like me Javascript looks hasty, nasty, and far too easy to obfuscate. You can’t read the thing. It’s not so bad if you are iterating through a design idea which has a short shelf-life, but for encoding corporate assets? Nah. JS is far too spongy.

  2. Johnny Fuchs Says:
    April 8th, 2011 at 11:32 pm

    “a web that is not a global digital library, but a global digital nervous system” – sexiest line ever.

    There’s a lot of excitement for node.js, and a lot of developers looking to build it into (or replace) their current backends. What advice do you have for other developers looking to build on node.js, but without the resources to make a javascript version of drupal/wordpress/django? Are you aware of any open source javascript CMS projects in the works?

  3. Laurence Dawson Says:
    April 8th, 2011 at 11:43 pm

    @johnny What js features would you want to see in an open source CMS?

  4. Bruce Atherton Says:
    April 8th, 2011 at 11:53 pm

    I agree with you that applications are becoming more and more event-driven, but I disagree that the future involves doing that through HTTP. Look for Websockets and SPDY to take over the world in that regard, as they are infinitely better suited to the task than HTTP is.

    Javascript, OTOH, shows no sign of going away as long as browsers include it as the only common default language. Not that it can’t be changed, but I don’t see it happening any time soon.

  5. Chase Sechrist Says:
    April 9th, 2011 at 3:54 am

    The one thing that worries me about Node.JS is that even though you’re locked in this jail with no easily available blocking operations, you still need (arguably advanced) knowledge of how to debug race conditions and how an event loop works (and even how a call stack works, due to recursive callbacks smashing the stack). Because of that, the control flow is very strange and mind-bending to people that have been writing C for 20 years, and even to junior engineers that are just getting into programming (unless they are web-focused engineers who are used to jQuery-esque continuation passing style). This is pretty much proven by all of the arguments and libraries popping out of the woodwork to deal with control flow in some elegant way (fibers, chaining, etc).

    That being said, I use Node.JS for all projects that I can, and I constantly bombard people with why asynchronous programming is the way to go for performance-critical pieces of architecture that deal with a ton of concurrent connections or event streams, but that doesn’t mean that everything can be done in Javascript.

  6. Josh Nursing Says:
    April 9th, 2011 at 4:22 am

    Very astute observations throughout the article and also from Jorjun. The future will belong to JS, but also to coffee-script and tools built around these. The advantages of a single mental context for developers cannot be overstated. I do not, however, love JS’s syntax.

  7. Avi Bryant’s Presentations « Redtexture Says:
    April 9th, 2011 at 7:02 am

    […] applications in The Truthiness Is Out There (March 20, 2011). (September 2010) Mike Driscoll Node.js and the Javascript Age (April 8, […]

  8. Johnny Fuchs Says:
    April 9th, 2011 at 7:53 am

    @Laurence Something simple to get the mainstream comfortable doing their backend in javascript. Like a basic blog or photo album. I can see serving single entries instead of a whole rss feed being a neat way to share content.

  9. Oleg Says:
    April 9th, 2011 at 1:01 pm

    @Chase This is exactly the problem we’re trying to address with Akshell (http://www.akshell.com), which unlike Node uses synchronous I/O with an Apache MPM-like setup, also running on V8.

    For real time communications and server push using WebSockets, we’re looking to add a Node pubsub component to our platform, which will communicate with the synchronous server. Unless you’re streaming binary media, most interactions are actually not based around streams, but rather discrete events, which can be easily processed in a synchronous manner within the scope of a single HTTP request.

  10. Horia Dragomir Says:
    April 9th, 2011 at 1:31 pm

I’ll be back to comment after I’ve read and chewed the full article, but I must say this:

I don’t know what you were doing in 2007, but I was AJAX-ing my ass off, and at a job interview in late 2007, when Flash came up, the consensus at the small firm I was applying at was that what you could do with Flash you could also do with JS. And that was a small (but forward-thinking, I guess) web design shop in the puny town of Timișoara, Romania.

The Age of Javascript began a long time ago (DHTML, anyone?) but it’s just turned *cool*.

Kinda like a new Smalltalk, or Rails.

And don’t get me wrong, I’m a UI Developer, so I do like Javascript.
  11. Jim Wooley Says:
    April 10th, 2011 at 1:42 am
One of the issues with Javascript is that the developer tooling and debugging experience is still lacking. Also the dynamic type system and runtime evaluation lead to greater chances of bugs creeping into your codebase. Case in point: I find it ironic that this page which touts the supremacy of the JS platform actually has a JS error in it. Sure, we’ve come a long way, but let’s try to use the right tool for the right job at the right time.
  12. NodeJS and the Javascript Age _ CSSer Says:
    April 10th, 2011 at 2:21 pm
[...] What the Javascript age brings us is more like a web that is not merely a global digital library, but a global digital nervous system, and this influence is only just beginning. Do you agree? Original (English): http://metamarketsgroup.com/blog/node-js-and-the-javascript-age/ [...]
  13. Asher Snyder Says:
    April 10th, 2011 at 8:45 pm
[Reposted from Hacker News]

While I agree with the premise that the web should be and is moving towards events, I think the author mistakenly anchors event-based development to JavaScript and NodeJS.

Since 2005 (available to the public in 2008), we’ve (along with numerous other platforms) been offering event-based websites and WebApps, via PHP with NOLOH ([http://www.noloh.com](http://www.noloh.com)). In NOLOH, you the developer worry about writing your application, what it should do, how it should look, etc. and specify events on things, such as click, or your own custom events, and NOLOH takes care of the rest. In the case of a JavaScript enabled client, it’ll render only the necessary JavaScript and DOM elements for that device at that given point in time and handle all the AJAX (and Comet where applicable), for search engines it will generate a semantically rich and standards compliant HTML version along with links to normally “non-crawlable” content. Similarly for text-based browsers, or non-JavaScript clients it’ll output HTML. Furthermore, in the case of mobile devices, or slate devices, it’ll output the correct code so that your app along with its clicks and drags still work. In all the above cases, the developer writes no JavaScript, nor does he even need to know that the client exists, he simply writes his app and it works. He/She can of course still use JavaScript if they please, but in most cases it’s unnecessary, unless you’re dealing with legacy code, or existing JavaScript widgets, and even in those cases, we abstract it out so that you can bind and sync without getting down into the nitty gritty.

So I don’t believe the premise that we’re in the JavaScript age is correct, as in many cases you don’t even render JavaScript. I do, however, believe that the premise that we’re in an event-based age is correct. I would even go so far as to suggest we’re heading towards a platform or unified-language age, as that’s really the only way the craziness of the web is manageable for rapid development, whether it’s JavaScript, NOLOH in PHP, or something else, and I thought so in 2005.

Disclaimer: I’m a co-founder of NOLOH
  14. CM Says:
    April 10th, 2011 at 11:44 pm
Agree and, to whoever disagrees with the language itself, don’t forget that a big part of the JS-stack success could be CoffeeScript.
  15. tiffany Says:
    April 11th, 2011 at 3:25 pm
I’m still trying to grok Node.js, but I do agree that we’re in the JavaScript Age. We will still need data stores, but with CouchDB, MongoDB, and the other NoSQL DBs, that’s coming in the form of JSON objects. It’s a fascinating time to be a JS developer.
  16. foo Says:
    April 11th, 2011 at 6:45 pm
“the emerging lingua franca of data on the web is JSON”

Ten years ago we heard the same about XML. Unimpressed.
  17. Richard Clayton Says:
    April 12th, 2011 at 2:18 am
Interesting premise. I agree that the architectures in general are becoming more event-based, but I wouldn’t say the back-end infrastructures dominated by Java, .NET and LAMP are going to vaporize. It’s probably more accurate to say that a lot of the presentation layer is going to be (rightfully) offloaded onto the client, and JavaScript (plus the capabilities promised with HTML5) will make browser-based applications much more powerful. The trend is toward exposing data services on the back-end, rather than manipulating the view. We will see the true JavaScript age when we see classic Enterprise Components (messaging, ESBs, etc.) becoming “dynamicized” by scripting languages.
  18. JB Says:
    April 12th, 2011 at 5:08 am
I would argue that coffeescript is not nearly as significant, in the rise of the javascript stack, as the performance gains realized through the development work being done by Google with V8 and by other browser makers with their related products. Additionally, PHP is not now, nor will it soon become, a long-term leader in any space, as it continues to show its age and significantly limited performance. Javascript in general has been rising thanks to the likes of Douglas Crockford, and anyone willing to teach others more about the functional and prototypal aspects, and less about the classical baggage, of the language.

Functional programming is becoming the new paradigm, and while it does not work efficiently in all problem domains, it is becoming more relevant in this multi-core cpu era.

With regard to XML, why would anyone want to have to serialize and deserialize data with reams of tag soup when JSON makes this far less tedious? XML is a poor man’s RPC, and RPC is still the model people are trying to emulate. JSON just makes more sense when Javascript is the language being used to put data on the wire.

Javascript may be in its heyday, but until something better comes along, it will be the go-to language of web development. Now if only we can get those people writing spec docs to stop writing LAMP everywhere. Except for Linux, the rest of that stack is becoming really stale.
  19. Ant Kutschera Says:
    April 12th, 2011 at 7:34 am
Like a few of you have already said, there is nothing wrong with an event-based programming model. In many cases it is better than the multithreaded programming model typically used in web servers today. But that point has little to do with JavaScript.

I cannot take anyone who wants to write enterprise applications in JavaScript seriously. The tools are useless. The language is not suitable: for example, type safety is lacking, and you spend most of your time in a debugger which doesn’t have simple functionality like inspection or modification. Who has seriously developed complex libraries involving several teams, using Javascript, and had a good experience?

Why are people so afraid to stick with existing proven technologies, be they .NET, C++ or Java? Yes, they are hard to learn and need investment. But the result is professional and well-supported systems. All of those languages can do what Node.JS does. And they all support better tools.
  20. johans Says:
    April 12th, 2011 at 8:54 am
@Johnny Fuchs – there are some Node blog engines on GitHub. Also, Express is the de facto web framework for Node and includes a blog example.
  21. Chris Says:
    April 12th, 2011 at 9:38 am
“Watch your back, the young ones are coming thru…”

And in the ’80s and ’90s I was one of the then-young ones trying to tackle the first-wave establishment with our radical managed-language stacks, crazy haircuts, USR 14.4s, cappuccinos, Eddie Vedder wardrobes and open source heresy. Crash and Burn, RISC is good.

Both sides emerged battle scarred, and stronger, from those early encounters and dot-com bubbles. But despite the Second Wave establishing a foothold we all seem to still be living in a world that is populated with mainframes and blue chips.

So if any readers are planning to be part of a Third Wave Revolution attempting to overthrow both the Mainframers and the Stackers combined, then heed some former-young-one style veteran advice:

a) Don’t ever underestimate your competitors’ ability to innovate with their technology;

b) Don’t ever rely solely on a new piece of technology to carry you through.

To reference Samuel Clemens: History may not repeat itself, but it sure does rhyme a lot.
  22. Patrick Says:
    April 12th, 2011 at 7:50 pm
This is old news!

If you were designing page-at-a-time webapps using LAMP even back in 2008, then you were behind the times. We’re way beyond AJAX, DHTML, and JSON. Now is the time of client-side persistent application logic and local storage, with asynchronous/synchronous (as appropriate) data sync over HTTP.
  23. Chris Says:
    April 13th, 2011 at 9:03 am
“Now is the time of client side persistent application logic and local storage with asynchronous/synchronous…data sync over HTTP”

Didn’t we use to call those Desktop Apps using Web Services?

I’m sure we bought Desktop Apps from a Generic Desktop App Store back then too. Or shouldn’t I have said that in case Cupertino are listening?
  24. Marc Fasel Says:
    April 15th, 2011 at 1:20 am
JavaScript is a powerful language, and there is no reason JavaScript cannot be successful on the back-end. It just needs to mature to allow building large projects. JavaScript is not (yet) suited for “programming in the large”: it doesn’t have modules, there are no visibility modifiers, and because of its dynamic nature there is no proper IDE support (code completion, refactoring, etc.). All that is out there right now are workarounds for these problems.
  25. Oleg Says:
    April 16th, 2011 at 2:10 am
@Marc What’s wrong with CommonJS modules ([http://www.commonjs.org/specs/modules/1.0/](http://www.commonjs.org/specs/modules/1.0/))? All major server-side JavaScript platforms have standardized on them, making JS-only code that doesn’t rely on core libraries portable across the different engines.

On your point about IDE support, IntelliJ IDEA does a pretty good job of completion and refactoring. We’re also working on that within our browser-based IDE at [http://www.akshell.com/ide/](http://www.akshell.com/ide/)
  26. Paul Says:
    April 19th, 2011 at 6:53 am
So, the author makes some valid points, but he suffers from the zeal of the recently converted and a tendency to massively overstate his case.

LAMP, and other server-side programming architectures (yes, there’s more than one), are not ‘dead’ overnight just because someone has decided they might be. There are millions of existing applications out there, still being developed and maintained in languages such as Java, PHP, Perl, and Python; they’re not going away any time soon, and developers will continue to be employed to work on them.

Secondly, whilst there is a lot of potential power in the server-ships-documents/client-renders-page model, especially for relatively trivial document-oriented sites, it is NOT fit for every problem domain, especially high-volume sites that require predictable caching behaviours, and sites which require higher levels of privacy and security. You need a lot more plumbing and workflow in the back-end for these kinds of things.

Thirdly, and I think most importantly, testing. The state of the client-side testing/CI art is *AWFUL*. Yes, there is stuff out there, but it’s immature and clunky. A lot more work needs to be done in this area before it’s ready for prime time. Sure, amongst a group of trendy nerds, as long as your crazy new web app works in the latest Chrome and Safari you’re done, but JS frameworks have not completely done away with browser compatibility issues.

So – yes, the client-side world has come on a lot in the last four or five years. It’s great, and finally standards (whether de-facto or not) are emerging. But the real world is a long way from betting the farm on client-side composition.
  27. Vincent Thorn Says:
    April 22nd, 2011 at 7:29 pm
The most idiotic tendency of the last 15 years is the belief that the browser suits “interactive applications”. Never ever. Hyper TEXT markup language – enough said. Any application more complex than showing currency rates requires normal, rich-client capabilities. For many years I have felt sorry for people doing Sisyphean HTML pages – a hellishly stupid waste of time. But the root of the evil is the W3C – a gang of old shoes who cannot realize that either you can have HQ publishing (like static PDF) or you have to write a normal client application in a generic programming language. They try to crossbreed two perpendicular worlds, throwing you decades back.

Forget about that Sh*tyScript, Java, CGI, forms – if you need an application, just write it! For Windows. In Delphi/VB/C#, whatever – IT WILL WORK, and you’ll never waste your time on browser compatibility, rendering issues, ugly PHP, etc.
