NodeJS Streams

What are Streams? Streams work like Unix pipes: they let you read data from a source and transfer it to a destination. Put another way, a stream is simply an EventEmitter that implements certain special methods. Depending on which methods are implemented, a stream becomes Readable, Writable or Duplex (both readable and writable).

In Node.js, a Readable stream lets you read data from a source, whereas a Writable stream lets you write data to a destination.

For example, in a Node.js HTTP server the request is a readable stream and the response is a writable stream, while the "fs" module provides both readable and writable streams.



Let’s discuss Readable and Writable streams in detail.

Readable Stream

A readable stream lets you read data from a source. The source can be anything: a simple file, a buffer in memory, or even another stream. Streams are EventEmitters that emit events at various points, and you use those events to read from and work with the stream.

Reading From Streams

The simplest way to read data from a stream is to listen to its data event and attach a callback. When a chunk of data becomes available, the readable stream emits the data event and runs your callback.

Reading From Streams Example

var fs = require('fs');
var readableStream = fs.createReadStream('file.txt');
var data = '';

readableStream.on('data', function(chunk) {
  data += chunk;
});

readableStream.on('end', function() {
  console.log(data);
});

In the above example, fs.createReadStream() gives you a readable stream. Initially the stream is in a paused state; listening for the data event and attaching a callback switches it into flowing mode. Chunks of data are then read and passed to the callback. How often the data event is emitted is decided by the stream implementor. When there is no more data to read, the stream emits an end event.

read() Function

The following example shows another way to read from a stream: call the read() method on it repeatedly until the last chunk of data has been read.

read() Function Example

var fs = require('fs');
var readableStream = fs.createReadStream('file.txt');
var data = '';
var chunk;

readableStream.on('readable', function() {
  while ((chunk = readableStream.read()) != null) {
    data += chunk;
  }
});

readableStream.on('end', function() {
  console.log(data);
});

The read() function reads data from the internal buffer and returns it. When there is nothing left to read, it returns null, which terminates the loop. Note that the readable event is emitted when a chunk of data can be read from the stream.

Encoding Setup

By default, the data you read from a stream is a Buffer object. Node.js lets you set an encoding on the stream by calling the readable.setEncoding() function.

Encoding Set up Example

var fs = require('fs');
var readableStream = fs.createReadStream('file.txt');
var data = '';

readableStream.setEncoding('utf8');

readableStream.on('data', function(chunk) {
  data += chunk;
});

readableStream.on('end', function() {
  console.log(data);
});

Here, we have set the encoding to utf8, so each chunk is passed to your callback as a string rather than a Buffer.

Piping

Piping is a great mechanism for reading data from a source and writing it to a destination without managing the flow yourself.

pipe() Example

var fs = require('fs');
var readableStream = fs.createReadStream('file1.txt');
var writableStream = fs.createWriteStream('file2.txt');
readableStream.pipe(writableStream);

Here, the pipe() function writes the content of file1 to file2. pipe() also manages the data flow for you, so you don't need to worry about the destination being slower or faster than the source. pipe() returns the destination stream, so you can chain multiple streams together.

Chaining

Suppose you have an archive and want to decompress it. There are many ways to do this, but the best and easiest way is piping and chaining. Let's see an example.

Chaining Example

var fs = require('fs');
var zlib = require('zlib');
fs.createReadStream('input.txt.gz')
.pipe(zlib.createGunzip())
.pipe(fs.createWriteStream('output.txt'));

First, we create a simple readable stream from the file input.txt.gz. Then we pipe this stream into another stream, zlib.createGunzip(), to un-gzip the content. Finally, since the streams are chained, we add a writable stream to write the un-gzipped data to a file.

Some Additional Methods

Let’s see some additional methods in the readable streams.

  • readable.resume() - Resumes a paused stream so that it starts (or continues) emitting data events.
  • readable.pause() - Pauses a flowing stream. It stops emitting data events and keeps the data in its internal buffers instead.
  • readable.unpipe() - Removes destination streams from the pipe. If a destination stream is passed as an argument, the readable stream stops piping into that particular destination; if no argument is passed, all destination streams are removed.

Writable Streams

Like readable streams, writable streams let you write data to a destination. Writable streams are also EventEmitters that emit events at various points. Let's take a look at the methods and events available on writable streams.

Writing to Stream

The write() function is used to write data to a writable stream. Let's see an example of how to write to a stream.

Writing to Stream Example

var fs = require('fs');
var readableStream = fs.createReadStream('file1.txt');
var writableStream = fs.createWriteStream('file2.txt');

readableStream.setEncoding('utf8');

readableStream.on('data', function(chunk) {
  writableStream.write(chunk);
});

Here, each chunk of data read from the input stream is written to the destination with the write() function. write() returns a Boolean to tell you whether you can keep going: if it returns true, you can continue writing; if it returns false, the internal buffer is full and you should stop writing for the moment. Once the buffer empties, the stream emits a drain event to signal that it is safe to write more data.

End of Data

When there is no more data to write, we can call the end() function to tell the stream that writing is finished. Let's assume res is an HTTP response object, the kind you often use to send data to the browser.

res.write('Some Data!!');
res.end('Ended');

After end() is called and all the data has been flushed, the stream emits a finish event. Note that you can't write data to the stream after calling end(). Let's see an example.

res.write('Some Data!!');
res.end('Ended');
res.write('Try to write again'); // Error!

Some Additional Writeable Stream Events

  • pipe - Emitted by the writable stream when a readable stream is piped into it.
  • unpipe - Emitted when you call unpipe() on the readable stream, stopping it from piping into the destination stream.
  • error - Emitted if an error occurs while writing or piping.

That’s all about streams in Node.js. Streams, pipes and chaining are among the core features of Node.js.