node-pg-migration is a useful library. You can add a foreign key constraint with a simple references property on the column. For example, say you have a client_session table with a publishable_key column, and you want it to reference the publishable_key column in the workspace table, where it is not a primary key (just a unique key). If you write such a migration for Postgres and Node.js with node-pg-migration and try to create the foreign key reference/constraint (via createConstraint or the references option in createTable) against a non-primary-key column of the other table, then referencing it with an object containing the schema and table name does not work. This is probably a bug in node-pg-migration: it tries to match primary keys only, but our publishable_key is not a PK in workspace. The following usage with an object ({ schema, name }) won't work:
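Here is a sketch of the kind of migration that fails. The seam schema and the column type are assumptions for illustration; only the references value matters:

exports.up = (pgm) => {
  pgm.createTable({ schema: 'seam', name: 'client_session' }, {
    publishable_key: {
      type: 'text',
      notNull: true,
      // The object form fails here: only the referenced table's primary key
      // is matched, and publishable_key is not a PK in workspace.
      references: { schema: 'seam', name: 'workspace' },
    },
  });
};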
It looks like node-pg-migration is not smart enough to add the column/field name the way it does for primary keys. You can use a string (seam.table2(col)) as the references value instead of an object, or just use raw SQL to bypass node-pg-migration's behavior, e.g.:
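A minimal sketch of both workarounds (again, the seam schema and column names are just for illustration):

exports.up = (pgm) => {
  pgm.createTable({ schema: 'seam', name: 'client_session' }, {
    publishable_key: {
      type: 'text',
      notNull: true,
      // Passing a string that includes the column name works:
      references: 'seam.workspace(publishable_key)',
    },
  });

  // Or skip the references option and add the constraint with raw SQL:
  // pgm.sql(`
  //   ALTER TABLE seam.client_session
  //   ADD CONSTRAINT client_session_publishable_key_fkey
  //   FOREIGN KEY (publishable_key) REFERENCES seam.workspace (publishable_key)
  // `);
};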
I recently needed to update the check constraint on a Postgres table column. This check does format validation, similar to a regex match, and it was part of the CREATE TABLE statement, meaning it was created together with the table.
However, ALTER COLUMN does not support updating checks. What we need to do instead is drop the constraint and create a new one. dropConstraint requires the name of the constraint, but the name is not obvious, because the check was created with createTable. To find the name, there's a SQL query we can run:
SELECT con.*
FROM pg_catalog.pg_constraint con
INNER JOIN pg_catalog.pg_class rel
ON rel.oid = con.conrelid
INNER JOIN pg_catalog.pg_namespace nsp
ON nsp.oid = con.connamespace
WHERE nsp.nspname = '<schema name>'
AND rel.relname = '<table name>';
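Once you have the name, the migration itself is straightforward: drop the old check and add a new one. Here is a sketch using dropConstraint and addConstraint; the table name, constraint name, and check expression are made up, and the check option shape is an assumption:

exports.up = (pgm) => {
  // Drop the check that CREATE TABLE generated implicitly...
  pgm.dropConstraint('client_session', 'client_session_publishable_key_check');
  // ...and recreate it with the updated validation rule.
  pgm.addConstraint('client_session', 'client_session_publishable_key_check', {
    check: "publishable_key ~ '^[a-z0-9_]+$'",
  });
};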
A few years ago, I managed a team at DocuSign that was tasked with rewriting the main DocuSign web app, which was used by tens of millions of users. The APIs to support our shiny new front-end app didn't exist yet, because from the beginning the web app had been a .NET monolith. The API team in Seattle was taking the monolith apart and slowly exposing RESTful APIs. That team consisted of just two engineers and had a release cycle of one month, while our front-end team in San Francisco released every week. The API team's release cycle was so long because almost all of the functionality had to be tested manually. That's understandable: it was a monolith without proper automated test coverage, after all. When they modified one part, they never knew what could go wrong in other parts of the application.
The Practical Node.js, 2nd Edition print book is finally ready. It turned out to be the biggest, thickest book I've ever written (500+ pages). Practical Node, 2nd Ed. is even thicker than React Quickly.
The Node Foundation published an enterprise case study about Node.js usage at Capital One. The title is “After Call For Innovation from C-Suite, Node.js Pops Up All Over Capital One”, and you can download the full case study from nodejs.org.
I get this question very often: “What tools would you recommend for Node development?” Software engineers love to optimize and increase productivity instead of wasting their time. I bet you are one of them! Read on to find out the best Node tools for development.
IDEs/code editors
Libraries
GUI tools
CLI tools
IDEs/code editors
When it comes to your primary tool, the code editor, I recommend sticking with lighter and simpler editors like Atom or VS Code instead of full-blown IDEs like WebStorm. Of course, an IDE will do more for you, but that comes with a learning curve and the need to configure it. Node is interpreted, so there's no need to compile it; the files are just plain text files with the .js extension.
Here’s my list of the best Node editors:
Atom: created and maintained by GitHub; uses Electron, HTML, JS and CSS under the hood, which makes it very easy to customize or add functionality; adds Git and terminal support via packages. Price: free.
VS Code: a newer addition; uses web-based tech similar to Atom's; grew out of Azure's Monaco editor; comes with debugging, smart type-based autocomplete, Git and terminal support. Price: free.
WebStorm: more of an IDE than an editor, developed by JetBrains and based on IntelliJ platform; has code assistance, debugging, testing, Git. Price: starts at $59/yr for individuals.
There are more options, like Brackets and Sublime Text 3, and of course IDEs like Eclipse, Aptana Studio, NetBeans and Komodo IDE, as well as cloud-based editors like Cloud9 and Codenvy.
What to pick? Any of the three in the list is a good choice. I have heard good things about VS Code, and its smart autocomplete is nice, but I didn't find that a good enough reason to switch from Atom. So try VS Code and Atom and see which one you like more. Both offer a wide variety of packages and themes.
The most popular and useful libraries and project dependencies
Here’s the list of the most used and most popular modules which you would install as dependencies of your projects. Node developers use most of these modules (or alternatives) in almost all of their projects.
The libraries are listed with the npm names, so you can execute npm i {name} substituting {name} with the name of the package/module:
webpack: Builds static assets like browser JavaScript, CSS and even images. It lets you use Node modules in the browser.
babel: Lets you code in the latest versions of JavaScript/ECMAScript without worrying about your runtime by converting the new code into code compatible with older ECMAScript versions.
async: Controls flow by running functions concurrently, sequentially, or in any other order you want (see the sketch after this list).
concurrently: Runs local CLI tools as multiple processes, all at the same time, e.g., webpack and node-static.
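For example, here is a quick sketch of the kind of flow control async gives you; the tasks and delays are made up for illustration:

const async = require('async');

// Run two tasks concurrently; the final callback fires when both are done,
// with results in the same order the tasks were declared.
async.parallel([
  (done) => setTimeout(() => done(null, 'first task'), 200),
  (done) => setTimeout(() => done(null, 'second task'), 100),
], (error, results) => {
  if (error) return console.error(error);
  console.log(results); // ['first task', 'second task']
});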
Note: Some of the libraries/tools listed above, like webpack or mocha, can be installed globally instead of locally in your project folder. However, installing them globally is an old practice and is now considered an anti-pattern, because local installation lets developers use different versions of a tool for different projects and keeps these tools specified in package.json.
Of course, there are a lot of different options in each category. For example, request and superagent are also extremely popular HTTP agent libraries. However, I don't want to give too many options and confuse you with the differences, so I listed only one tool (typically the one I currently use the most).
CLI tools (global)
Unlike the previous section, these tools are okay to install globally since most likely their version won’t affect or break your project.
node-dev: Monitor and restart your Node app automatically on any file change within the current folder
node-inspector: Debug Node code in the familiar DevTools interface (this functionality is now part of Node, starting with v7)
docker: Build and run Docker containers to isolate app environment, speed up deployment and eliminate conflicts between dev and prod (or any other) environments
curl: Make HTTP(S) requests to test your web apps (installed by default on POSIX systems, but you can get it for Windows too)
nvm: Change Node versions without having to install and re-install them each time
wintersmith: Build static website using Node templates and Markdown
pm2: Process manager to vertically scale Node processes and ensure fault tolerance and zero-downtime reloads (see the sketch below)
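For instance, pm2 can scale an app across all CPU cores from a small config file. A minimal sketch of a pm2 ecosystem file; the app name and entry point are made up:

// ecosystem.config.js — start with `pm2 start ecosystem.config.js`,
// reload with zero downtime via `pm2 reload web`.
module.exports = {
  apps: [
    {
      name: 'web',            // made-up app name
      script: './server.js',  // made-up entry point
      instances: 'max',       // one process per CPU core
      exec_mode: 'cluster',   // cluster mode enables zero-downtime reloads
    },
  ],
};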
GUI tools
A good share of Node developers prefer GUI (graphical user interface) tools, at least for some tasks, because these tools require less typing and have features that make them more productive and development easier and simpler.
Postman: HTTP client with ability to save requests and history, change formats (JSON, form, etc.) and do other things
MongoUI: Modify and inspect your MongoDB data in a web interface. You can host this web app on your server to enable the database management.
Chrome: DevTools is a great way to inspect your requests, network, traffic, CPU profiles and other developer related data which is very useful for debugging
iTerm, itermocil and zsh: A better alternative to the native macOS Terminal app; together with itermocil and zsh, it greatly increases productivity
I went to Node Interactive Europe, which happened in September in Amsterdam. It's the official Node conference, the real deal. The organizers invited me to present on React, so I taught a workshop on Universal Web, and I also participated in a panel discussion about containers and Node with folks from nearForm, IBM, Zeit and Netflix. You should watch the panel recording on YouTube. It was a good one.
In the previous post, we learned how to perform HTTP/2 server push in a Node server. We also covered the benefits of server push there, so to avoid duplication we won't list them here. We used spdy for server push and H2. But most of the time Node developers don't work with the core HTTP server; they use a framework like Express. So let's see how we can implement server push in Express.
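Here is a minimal sketch of server push with spdy and Express; the self-signed key/cert paths (server.key, server.crt) and the pushed asset are assumptions for illustration:

const express = require('express');
const spdy = require('spdy');
const fs = require('fs');

const app = express();

app.get('/', (req, res) => {
  // Push /main.js to the client before it even requests it.
  const stream = res.push('/main.js', {
    status: 200,
    method: 'GET',
    request: { accept: '*/*' },
    response: { 'content-type': 'application/javascript' },
  });
  stream.on('error', console.error);
  stream.end('console.log("pushed by the server");');

  // Respond with HTML that references the pushed asset.
  res.end('<script src="/main.js"></script>');
});

// Browsers require TLS for HTTP/2, so we need a key and a certificate
// (self-signed is fine for local development).
spdy
  .createServer({
    key: fs.readFileSync('./server.key'),
    cert: fs.readFileSync('./server.crt'),
  }, app)
  .listen(3000);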
The modern Internet with its TCP/IP protocol started around 1975, which is an astonishing 41 years ago. For most of its existence, we used HTTP and its successor HTTP/1.1 (version 1.1) to communicate between clients and servers. It served the web well, but the way developers build websites has dramatically changed. There are myriads of external resources: images, CSS files, JavaScript assets. And the number of resources is only increasing.
HTTP/2 is the first major upgrade to the good old HTTP protocol in over 15 years (the first HTTP is circa 1991)! It is optimized for modern websites. The performance is better without complicated hacks like domain sharding (having multiple domains) or file concatenation (having one large file instead of many small ones).
There's a better alternative to the ubiquitous JSON as the communication protocol of the web: Protocol Buffers (protobuf). In a nutshell, protobuf offers a denser format (faster processing) and provides data schemas (enforcement of structure and better compatibility with old code).
The purpose of this article is not to highlight why protobufs are better or to sell you on the concept. There are many articles online that'll do that for you. The purpose of this article is to show you how you can get started with this format in the Node.js environment.
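One way to get a feel for it in Node is with the protobufjs package (npm i protobufjs); the message.proto file and the demo.Greeting type below are made up for illustration:

// message.proto (for reference):
//   syntax = "proto3";
//   package demo;
//   message Greeting { string text = 1; }
const protobuf = require('protobufjs');

protobuf.load('message.proto').then((root) => {
  const Greeting = root.lookupType('demo.Greeting');

  const payload = { text: 'Hello protobuf' };
  const invalid = Greeting.verify(payload);
  if (invalid) throw Error(invalid);

  // Encode to a compact binary Buffer instead of a JSON string...
  const buffer = Greeting.encode(Greeting.create(payload)).finish();

  // ...and decode it back into a plain object.
  const decoded = Greeting.decode(buffer);
  console.log(buffer.length, decoded.text);
});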
This essay was inspired by Kyle Simpson's series of books, You Don't Know JavaScript. They are a good start on JavaScript fundamentals. Node is mostly JavaScript, except for a few differences which I'll highlight in this essay. The code is in the You Don't Know Node GitHub repository, under the code folder.
Why care about Node? Node is JavaScript, and JavaScript is almost everywhere! What if the world could be a better place if more developers mastered Node? Better apps mean a better life!
This is a kitchen sink of subjectively the most interesting core features. The key takeaways of this essay are:
Event loop: Brush-up on the core concept which enables non-blocking I/O
Global and process: How to access more info
Event emitters: Crash course in the event-based pattern
Streams and buffers: Effective way to work with data
Clusters: Fork processes like a pro
Handling async errors: AsyncWrap, Domain and uncaughtException
C++ addons: Contributing to the core and writing your own C++ addons
Event Loop
We can start with the event loop, which is at the core of Node.
It allows processing of other tasks while I/O calls are in progress. Think Nginx vs. Apache. It allows Node to be very fast and efficient, because blocking I/O is expensive!
Take a look at this basic example of a delayed println function in Java:
Before we can get started with Node patterns, let’s touch on some of the main advantages and features of using Node. They’ll help us later to understand why we need to deal with certain problems.
Node Advantages and Features
Here are some of the main reasons people use Node:
JavaScript: Node runs on JavaScript so you can re-use your browser code, libraries and files.
Asynchronous + Event Driven: Node executes tasks concurrently with the use of asynchronous code and patterns, thanks to the event loop.
Non-Blocking I/O: Node is extremely fast due to its non-blocking input/output architecture and Google Chrome V8 engine.
That's all neat, but async code is hard. Human brains just didn't evolve to process things in an asynchronous manner where the event loop schedules different pieces of logic to run in the future. The order in which they run is often not the order in which they were written, as the sketch below illustrates.
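A tiny example of what that looks like in practice:

// The order in the code is not the order of execution.
console.log('1: scheduled first');

setTimeout(() => {
  console.log('3: runs last, even though it appears second in the code');
}, 0);

console.log('2: runs before the timeout callback');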
My new book Full Stack JavaScript (my 4th traditionally published book) comes with a series of screencast videos for better immersion in the wonderful and mesmerizing world of Node.js, Backbone and MongoDB. It's one thing to read through the text and another to follow along with dynamic videos that walk you through the book's projects.
The videos and the source code are open source, meaning they are publicly available. Therefore, you don’t have to buy a book—you can just watch the 14 videos on YouTube (playlist) and go through the code on GitHub (repository).
Last week, I presented my talk at the inaugural Node Interactive '15 in Portland, Oregon. It's probably the largest Node.js conference in the world! My talk was on Node.js at Capital One. You might wonder: a bank and Node.js? What do they have in common? The best-kept secret, which is not a secret at all, is that Capital One is becoming a technology company with a focus on finance… not just a bank. It's worth watching my talk if you're interested in hearing about the challenges of bringing innovation to a large company in a heavily regulated field.
Have you ever wanted to learn the basics of Node.js and the most popular Node.js web framework, Express.js? If you are an experienced web developer or software engineer who wants to learn Node.js and build some servers along the way, then this self-study workshop is for you.
What is ExpressWorks? It's an automated workshop that teaches you the basics of Express.js and of building Node.js web apps (a.k.a. servers), created by the author of one of the best books on Express.js, Pro Express.js.
You will walk through adventures via a command-line interface. Each adventure has a problem, hints, and a solution.
Last week, I attended the HTML5Dev conference in San Francisco, which was just across from the Capital One SF office at 201 3rd St. The conference was split across a few buildings, which made it hard to navigate and find talks.
The whole conference was along the lines of React is amazing, ES6 is the future and Node.js is everywhere. There were a few talks on the Internet of Things, design, UX and HTTP/2 as well. Here's a recap of the talks I went to.
Node.js/Io.js is a non-blocking I/O platform based on the JavaScript language. It has been capturing the hearts and minds of software developers for the past couple of years, and not only developers in startups and small companies but, more interestingly, enterprise developers as well. For example, PayPal, LinkedIn, Walmart, Netflix, eBay, Uber and other tech giants switched from Java and the like to Node. Its popularity is attributed to better performance and to having one language (JavaScript) for all layers: back-end, database and front-end.
As with any new platform, there are a lot of Node.js/Io.js frameworks to choose from. However, before we proceed, we need to define what enterprise means. For the sake of simplicity, an enterprise project is one where you have teams of more than 10 developers working on it, where you have huge traffic to handle and high stakes, meaning the services must be running 24x7x365.
Judging frameworks is highly subjective. When it comes to building enterprise-level applications, we need to consider some of the following things:
Best practices and patterns: Whether the framework is DIY or provides clear patterns to use.
Configuration: How easy it is to configure the framework.
Convention: Is there a convention to follow if that’s the preferred route?
Horizontal scaling: How easy it is to scale apps built with this framework.
Testing: How to test the application.
Scaffolding: How much developers have to code manually vs. using built-in code generators.
Monitoring: How to monitor the application.
Track record: How proven a framework is, i.e., who supports it and how well it is maintained.
Integration: How rich the ecosystem of plugins/connectors is.
ORM/ODM: Is there an object-relational/document mapper?
While performance is important, it depends on the requirements and business logic of a particular project. Running meaningful benchmark tests is non-trivial.
The main focus of this post is to compare the four Node.js/Io.js frameworks: Hapi, Kraken, Sails.js and Loopback.
I hated Jade, as many other Node.js developers do. But I did a 180 after I realized that it has tons of features.
At Storify and DocuSign we used Jade for EVERYTHING. We used Jade even in the browser. There is a little trick called jade-browser. It was developed by folks at Storify. I maintained it for a bit.
The funny thing is that DocuSign team used jade-browser long before they met me. They swear they hired me without knowing that I was involved in that library. :-)
Anyway, after covering Jade and Handlebars in previous posts, it’s time to apply them to do some real work. In this post, I’ll cover:
I've written about how I struggled with Jade, but I had no choice except to master it. However, even before beginning to understand Jade, I admired Handlebars GREATLY, mostly for its simplicity and similarity to plain HTML.
If you want to write templates for Node.js apps, then consider Handlebars. This short tutorial will get you started on the path to becoming a pro. And if you haven't even heard about Handlebars, then you're missing out big time!
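To give you a quick taste before the tutorial proper, here is a minimal sketch; the template string and data are made up:

const Handlebars = require('handlebars');

// Compile a template that looks almost exactly like plain HTML.
const template = Handlebars.compile('<h1>{{title}}</h1><p>{{body}}</p>');

// Render it with a context object.
console.log(template({
  title: 'Hello Handlebars',
  body: 'Templates for Node.js apps.',
}));
// <h1>Hello Handlebars</h1><p>Templates for Node.js apps.</p>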