This turns out to be a perfect fit for a web server handling a large number of users doing lots of little things in real time: Twitter tweets, Facebook updates, or even comments on properties, favorites, and x-outs. The problem these programs face isn't using your computer's whole processor to solve an advanced physics problem. No, their problem is mostly storing a fire-hose of tiny bits of data in a database, and zipping information between all the other computers that keep a site like Twitter or Facebook up and running.
Ryan's central insight is that in a world of networked computers, it's rarely your computer's processor that slows you down anymore; it's the inputs and outputs that connect the computer to file servers and other machines. So rather than just trying to be computationally efficient, Node.js focuses on being efficient in how a program deals with streams of information.
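That non-blocking style is easy to see in a few lines of Node.js. In this sketch, `slowQuery` is a hypothetical stand-in for a slow database or network call (simulated with `setTimeout`); the point is that the program keeps working instead of waiting for the I/O to finish:

```javascript
// Minimal sketch of Node.js's non-blocking I/O style.
// slowQuery is a hypothetical stand-in for a slow database or
// network call; setTimeout simulates the I/O delay.
const order = [];

function slowQuery(name, ms, callback) {
  // Schedules `callback` to run when the simulated I/O completes.
  setTimeout(() => callback(name), ms);
}

slowQuery('users', 50, (name) => order.push(name));
slowQuery('tweets', 10, (name) => order.push(name));
order.push('kept working'); // runs immediately, before either "query" returns

setTimeout(() => {
  console.log(order.join(',')); // kept working,tweets,users
}, 100);
```

Notice the order of the output: the synchronous work finishes first, and the two "queries" report back whenever their I/O happens to complete, fastest first.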
Even if a big website sticks with more established web servers for their security and institutional support, there may still be a place alongside them for a Node.js server acting as a hub that routes requests to other databases and servers. Suffice it to say, Ryan Dahl's talk was a fascinating exploration of how technology may change to support real-time streaming.
And Ryan was a dazzling speaker, somehow both opinionated and scrupulous. Among the highlights:
On Node.js's performance compared to other web servers: "This should be shocking to you. You should be urinating right now. Or getting angry. It shocks me."
On the performance problems created because the popular web framework Ruby on Rails doesn't make it easy to talk to a database while doing other things: "It's the year 2010, we're using Rails, and when you access a database, it stops. The world stops for who knows how long. The database might be in LA, and it takes 2 seconds to respond."
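The scenario Dahl describes can be sketched in a few lines. Here `queryRemoteDatabase` is a hypothetical stand-in for that far-away database (the round-trip is simulated with `setTimeout`); in the non-blocking style, a quick request gets served immediately instead of queuing behind the slow query:

```javascript
// Sketch of why a non-blocking server keeps responding while a slow
// database call is pending. queryRemoteDatabase is a hypothetical
// stand-in for Dahl's database in LA, simulated with setTimeout.
const served = [];

function queryRemoteDatabase(sql, callback) {
  setTimeout(() => callback([{ user: 'alice' }]), 200); // simulated round-trip
}

function handleDbRequest() {
  // Kicks off the slow query and returns immediately.
  queryRemoteDatabase('SELECT * FROM users', (rows) => {
    served.push('db request (' + rows.length + ' row)');
  });
}

function handleQuickRequest() {
  served.push('quick request');
}

handleDbRequest();    // slow query starts, nothing blocks
handleQuickRequest(); // served right away, not stuck behind the query

setTimeout(() => {
  console.log(served.join(' | ')); // quick request | db request (1 row)
}, 300);
```

In the blocking world Dahl complains about, the quick request would have waited the full round-trip time behind the database call; here it is answered while the query is still in flight.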