Note: This text is part of the upcoming ebook JavaScript and Node FUNdamentals: A Collection of Essential Basics.
Node.js is a highly efficient and scalable non-blocking I/O platform that was built on top of the Google Chrome V8 engine and its ECMAScript. This means that most front-end JavaScript (another implementation of ECMAScript) objects, functions and methods are available in Node.js. Please refer to JavaScript FUNdamentals if you need a refresher on JS-specific basics.
Developers can install Node.js from its website and follow this overview of the main Node.js concepts. For more meticulous Node.js instructions, take a look at Rapid Prototyping with JS: Agile JavaScript Development and Node School.
Read-Eval-Print Loop (a.k.a. Console) in Node.js
Like many other programming languages and platforms, Node.js has a read-eval-print loop (REPL) tool, which is opened by the $ node command. The prompt changes to > and we can execute JavaScript akin to the Chrome Developer Tools console. There are slight deviations between the ECMAScript implementations in Node.js and browsers (e.g., {}+{}), but for the most part the results are similar.
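For example, a quick REPL session might look like this (a sketch; the exact output and values, such as the platform name, will vary by system and Node.js version):

$ node
> 1 + 1
2
> var os = require('os')
undefined
> os.platform()
'darwin'
> .exit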
So as you can see, we can write JavaScript in the console all day long, but sometimes we want to save our scripts so that we can run them later.
Launching Node.js Scripts
To start a Node.js script from a file, simply run $ node filename, e.g., $ node program.js. If all we need is a quick set of statements, there's a -e option that allows us to run inline JavaScript/Node.js, e.g., $ node -e "console.log(new Date());".
Node.js Process Information
Each Node.js script that is running is, in essence, a process. For example, ps aux | grep 'node' will output all Node.js programs running on a machine. Conveniently, developers can access useful process information in code with the process object, e.g., node -e "console.log(process.pid)".
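Below is a minimal sketch of such a script (the file name process-info.js is hypothetical); all of the properties used are part of the standard process object, but the output will depend on your system:

// process-info.js: print a few useful properties of the current process
console.log('Process id:', process.pid);
console.log('Node.js version:', process.versions.node);
console.log('Command-line arguments:', process.argv);
console.log('Current working directory:', process.cwd());

Run it with $ node process-info.js.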
Accessing Global Scope in Node.js
As you know from JS FUNdamentals, browser JavaScript by default puts everything into its global scope. This was coined as one of the bad parts of JavaScript in Douglas Crockford's famous JavaScript: The Good Parts. Node.js was designed to behave differently, with everything being local by default. In case we need to access globals, there is a global object. Likewise, when we need to export something, we should do so explicitly.
In a sense, the window object from front-end/browser JavaScript metamorphosed into a combination of the global and process objects. Needless to say, the document object, which represents the DOM of the web page, is nonexistent in Node.js.
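To illustrate, here is a small hypothetical example (scope-demo.js): a variable declared with var at the top of a Node.js file stays local to that module, whereas a property attached to global is visible everywhere (which is usually best avoided):

// scope-demo.js
var local = 'I am local to this file';   // not added to the global object
global.answer = 42;                      // explicitly global (generally discouraged)

console.log(typeof local);   // 'string'
console.log(global.local);   // undefined
console.log(answer);         // 42, resolved through the global object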
Exporting and Importing Modules
Another bad part in browser JavaScript is that there's no way to include modules. Scripts are supposed to be linked together using a different language (HTML), which lacks dependency management. CommonJS and RequireJS solve this problem with an AJAX-y approach. Node.js borrowed many things from the CommonJS concept.
To export an object in Node.js, use exports.name = object, e.g.:
var messages = {
  find: function(req, res, next) {
    ...
  },
  add: function(req, res, next) {
    ...
  },
  format: 'title | date | author'
}

exports.messages = messages;
Then, in the file where we import the aforementioned script (assuming the path and the file name is routes/messages.js):
var messages = require('./routes/messages.js');
However, sometimes it's more fitting to invoke a constructor, e.g., when we attach properties to an Express.js app (more on Express.js in Express.js FUNdamentals: An Essential Overview of Express.js). In this case, module.exports is needed:
module.exports = function(app) {
  app.set('port', process.env.PORT || 3000);
  app.set('views', __dirname + '/views');
  app.set('view engine', 'jade');
  return app;
}
In the file that includes the example module above:
...
var app = express();
var config = require('./config/index.js');
app = config(app);
...
The more succinct version: var app = express(); require('./config/index.js')(app);.
The most common mistake when including modules is a wrong path to the file. For core Node.js modules, just use the name without any path, e.g., require('name'). The same goes for modules in the node_modules folder. More on that later in the NPM section.
For all other files, use a relative path starting with ., with or without a file extension, e.g.,
var keys = require('./keys.js'),
messages = require('./routes/messages.js');
In addition, for the latter category it's possible to use longer-looking statements with __dirname and path.join(), e.g., require(path.join(__dirname, 'routes', 'messages'));.
If require() points to a folder, Node.js will attempt to read the index.js file in that folder.
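For example, assuming a hypothetical routes folder that contains an index.js file, these two statements load the same module:

var routes = require('./routes');             // resolves to ./routes/index.js
var sameRoutes = require('./routes/index.js');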
Buffer is a Node.js Super Data Type
Buffer is a Node.js addition to the familiar front-end JavaScript data types: the primitives (boolean, string, number) and the all-encompassing objects (arrays, functions and RegExp are objects too). We can think of buffers as extremely efficient stores for binary data. In fact, Node.js will try to use buffers any time it can, e.g., when reading from the file system or receiving packets over the network.
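As a quick sketch, here is how a buffer can be created from a string and converted back (Buffer is available globally, so no require is needed; newer Node.js versions prefer Buffer.from over the constructor):

var buf = new Buffer('Hello Buffer', 'utf-8');
console.log(buf);            // <Buffer 48 65 6c 6c 6f 20 42 75 66 66 65 72>
console.log(buf.toString()); // 'Hello Buffer'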
__dirname vs. process.cwd
__dirname is an absolute path to the folder of the file in which this global variable is used, while process.cwd() is an absolute path to the folder from which the process running this script was started. The latter might not be the same as the former if we started the program from a different folder, e.g., $ node ./code/program.js.
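A minimal sketch (with a hypothetical file code/program.js and made-up paths in the comments) shows the difference when the script is launched from the project root:

// code/program.js
console.log(__dirname);      // e.g., /projects/app/code
console.log(process.cwd());  // e.g., /projects/app when started as $ node ./code/program.js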
Handy Utilities in Node.js
Although the core of the Node.js platform was intentionally kept small, it comes with some essential utility modules, such as path. The path.join method, which we use in this tutorial, concatenates paths using the appropriate folder separator (/ or \\).
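For example (the resulting separator and absolute paths depend on the operating system):

var path = require('path');

console.log(path.join('data', 'customers.csv'));            // data/customers.csv on POSIX systems
console.log(path.join(__dirname, 'data', 'customers.csv')); // an absolute path such as /projects/app/data/customers.csv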
Reading and Writing from/to The File System in Node.js
Reading from files is done via the core fs
module. There are two sets of methods: async and sync. In most cases developers should use async methods, e.g., fs.readFile:
var fs = require('fs');
var path = require('path');

fs.readFile(path.join(__dirname, '/data/customers.csv'), {encoding: 'utf-8'}, function (err, data) {
  if (err) throw err;
  console.log(data);
});
And here is writing to a file:
var fs = require('fs');

fs.writeFile('message.txt', 'Hello World!', function (err) {
  if (err) throw err;
  console.log('Writing is done.');
});
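For completeness, here is a sketch of the synchronous counterparts, fs.readFileSync and fs.writeFileSync; they block the event loop, so they're best reserved for start-up code and simple scripts:

var fs = require('fs');
var path = require('path');

// Blocking versions: the result is returned instead of being passed to a callback
var data = fs.readFileSync(path.join(__dirname, '/data/customers.csv'), {encoding: 'utf-8'});
console.log(data);
fs.writeFileSync('message.txt', 'Hello World!');
console.log('Writing is done.');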
Streaming Data in Node.js
Streaming data is a term that means an application processes the data while it's still receiving it. This is useful for extra large datasets, such as video or database migrations.
Here's a basic example of using streams that outputs the file content back to the standard output:
var fs = require('fs');
fs.createReadStream('./data/customers.csv').pipe(process.stdout);
By default, Node.js uses buffers for streams.
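A readable stream can also be piped into a writable stream. For example, here is a sketch of copying a file without loading it into memory all at once (the destination file name is made up):

var fs = require('fs');

fs.createReadStream('./data/customers.csv')
  .pipe(fs.createWriteStream('./data/customers-copy.csv'))
  .on('finish', function () {
    console.log('Copy is done.');
  });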
For more immersive training, take a look at stream-adventure and the Stream Handbook.
Installing Node.js Modules with NPM
NPM comes with the Node.js platform and allows for seamless Node.js package management. The way npm install works is similar to Git in the way it traverses the working tree to find the current project. For starters, keep in mind that we need either the package.json file or the node_modules folder in order to install modules locally with $ npm install name, for example $ npm install superagent; then, in program.js, we write var superagent = require('superagent');.
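If there is no node_modules folder yet, a minimal package.json is enough to make the folder a project; here is a hypothetical sketch (name and version are the only required fields):

{
  "name": "node-fundamentals-demo",
  "version": "0.1.0"
}

Running $ npm install superagent --save will then download the module into node_modules and record it under dependencies in package.json.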
The best thing about NPM is that it keeps all the dependencies local, so if module A uses module B v1.3 and module C uses module B v2.0 (with breaking changes compared to v1.3), both A and C will have their own localized copies of different versions of B. This proves to be a superior strategy compared to Ruby and other platforms that use global installations by default.
The best practice is not to include the node_modules folder in the Git repository when the project is a module that is supposed to be used in other applications. However, it's recommended to include node_modules for deployable applications. This prevents breakage caused by an unfortunate dependency update.
Note: The NPM creator likes to call it npm (lowercase).
Hello World Server with HTTP Node.js Module
Although Node.js can be used for a wide variety of tasks, it's mostly known for building web applications. Node.js thrives on the network due to its asynchronous nature and built-in modules such as net and http.
Here's a quintessential Hello World example in which we create a server object, define a request handler (a function with req and res arguments), pass some data back to the recipient, and start up the whole thing.
var http = require('http');

http.createServer(function (req, res) {
  res.writeHead(200, {'Content-Type': 'text/plain'});
  res.end('Hello World\n');
}).listen(1337, '127.0.0.1');

console.log('Server running at http://127.0.0.1:1337/');
The req and res parameters have all the information about a given HTTP request and response, respectively. In addition, req and res can be used as streams (see the previous section).
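For instance, here is a hypothetical echo server sketch that treats the request and response as streams and pipes one straight into the other:

var http = require('http');

http.createServer(function (req, res) {
  res.writeHead(200, {'Content-Type': 'text/plain'});
  // req is a readable stream and res is a writable stream,
  // so the request body can be piped back to the client as-is
  req.pipe(res);
}).listen(1337, '127.0.0.1');

Testing it with $ curl -d 'ping' http://127.0.0.1:1337/ returns ping.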
Debugging Node.js Programs
The best debugger is console.log(), but sometimes we need to see the call stack and orient ourselves in async code a bit more. To do that, put debugger statements in your code and use $ node debug program.js to start the debugging process. For a more developer-friendly interface, download node inspector.
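A minimal sketch of what that looks like (program.js is a hypothetical script):

// program.js
var total = 0;
for (var i = 1; i <= 3; i++) {
  debugger; // the built-in debugger pauses here on every iteration
  total += i;
}
console.log(total);

At the debug prompt, commands such as cont, next and repl step through the code and inspect values.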
Taming Callbacks in Node.js
Callbacks are what make Node.js code asynchronous, yet programmers unfamiliar with JavaScript who come from Java or PHP might be surprised when they see Node.js code like this example described on Callback Hell:
fs.readdir(source, function(err, files) {
  if (err) {
    console.log('Error finding files: ' + err)
  } else {
    files.forEach(function(filename, fileIndex) {
      console.log(filename)
      gm(source + filename).size(function(err, values) {
        if (err) {
          console.log('Error identifying file size: ' + err)
        } else {
          console.log(filename + ' : ' + values)
          aspect = (values.width / values.height)
          widths.forEach(function(width, widthIndex) {
            height = Math.round(width / aspect)
            console.log('resizing ' + filename + 'to ' + height + 'x' + height)
            this.resize(width, height).write(destination + 'w' + width + '_' + filename, function(err) {
              if (err) console.log('Error writing file: ' + err)
            })
          }.bind(this))
        }
      })
    })
  }
})
There's nothing to be afraid of here as long as two-space indentation is used. ;-) However, callback code can be rewritten with the use of event emitters, promises, or by utilizing the async library.
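For example, here is a sketch of how the async library could flatten a sequence of dependent steps (assuming $ npm install async; the steps themselves are made up):

var fs = require('fs');
var async = require('async');

async.waterfall([
  function (callback) {
    fs.readdir(source, callback);  // 1. list the files (source is assumed to be defined)
  },
  function (files, callback) {
    // 2. do something with the files, then pass a result to the next step
    callback(null, files.length);
  }
], function (err, count) {
  if (err) return console.error(err);
  console.log('Processed ' + count + ' files.');
});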
Introduction to Node.js with Ryan Dahl
Last but not least, here is the introduction to Node.js by its creator, Ryan Dahl:
Moving Forward with Express.js
After you've mastered the Node.js basics in this article, you might want to read Express.js FUNdamentals: An Essential Overview of Express.js and consider working through an interactive class about the Express.js framework, which as of today is the most popular module on NPM.