Node is also huge on streams and streaming.
Streams can be thought of as data distributed over time. By bringing data in chunk by chunk, you can handle that data as it comes in instead of waiting for it all to arrive before acting.
Here's an example that reads a file as a stream:
var fs = require('fs');

var stream = fs.createReadStream('./resource.json');
stream.on('data', function (chunk) {
  console.log(chunk); // each chunk is a Buffer holding part of the file
});
stream.on('end', function () {
  console.log('finished');
});
A data event is fired whenever a new chunk of data is ready, and an end event is fired when all the chunks have been loaded. A chunk can vary in size, depending on the type of data. This low-level access to the read stream allows you to efficiently deal with data as it's read instead of waiting for it all to buffer in memory.
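Because the chunks arrive one at a time, you can also process them incrementally rather than just printing them. As a minimal sketch (assuming the same './resource.json' file from the example above), here's how you might count the total bytes read without ever holding the whole file in memory:

var fs = require('fs');

var stream = fs.createReadStream('./resource.json');
var bytes = 0;

stream.on('data', function (chunk) {
  bytes += chunk.length; // each chunk is a Buffer, so length is its size in bytes
});

stream.on('end', function () {
  console.log('finished reading ' + bytes + ' bytes');
});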
Readable and writable streams can be connected to make pipes, much like we can do with the | (pipe) operator in shell scripting. This provides an efficient way to write out data as soon as it’s ready, without waiting for the complete resource to be read and then written out.
var http = require('http');
var fs = require('fs');

http.createServer(function (req, res) {
  res.writeHead(200, {'Content-Type': 'image/png'});
  fs.createReadStream('./image.png').pipe(res);
}).listen(3000);

console.log('Server running at http://localhost:3000/');
In the one-liner fs.createReadStream('./image.png').pipe(res), the data is read in from the file (fs.createReadStream) and sent out (.pipe) to the client (res) as it comes in. The event loop can handle other events while the data is being streamed.
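Pipes aren't limited to HTTP responses. As a minimal sketch (the './image.png' path is carried over from the example above), you could chain the same readable stream through Node's built-in zlib module to gzip the file on the fly and write the result back to disk:

var fs = require('fs');
var zlib = require('zlib');

// read image.png, compress it chunk by chunk, and write out image.png.gz,
// without buffering the whole file in memory
fs.createReadStream('./image.png')
  .pipe(zlib.createGzip())
  .pipe(fs.createWriteStream('./image.png.gz'));

Each call to .pipe returns the destination stream, which is what makes this kind of chaining possible.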