Streams are objects that allow you to read data from a source and write data to a destination. Like strings and arrays, streams are collections of data. Node.js offers various stream objects.
For example, in a Node.js-based HTTP server, the request is a readable stream and the response is a writable stream.
var http = require('http');
http.createServer(function(req, res) {
  req.pipe(res); // req is a readable stream, res is a writable stream
}).listen(3000);
Topics Covered
- Types of Streams
- Stream Events
- Readable Streams
- Writable Streams
- Piping the Streams
- Chaining Streams
Types of Streams in Node.js
Readable: A readable stream is used to read data from a source.
var fs = require('fs');
//Create a Readable Stream
var readfile = fs.createReadStream('read-data.txt');
Writable: A writable stream is used to write data to a destination.
var fs = require("fs");
//Create a Writable Stream
var writefile = fs.createWriteStream('write-data.txt');
Duplex: A duplex stream can be used for both read and write operations.
var net = require('net');
//Create direct connection to a server
var client = new net.Socket();
Transform: A transform stream is a type of duplex stream where the output is computed from the input.
var zlib = require('zlib');
// e.g. readable.pipe(zlib.createGzip()) compresses a .txt file to .txt.gz
Stream Events
Streams can be readable, writable, or both. Each type of stream is an instance of EventEmitter and emits different events at different points in time. Some of the commonly used events are listed below:
data: The 'data' event is fired when there is data available to read.
Syntax: readfile.on('data', function(chunk) { });
end: The 'end' event is fired when there is no more data available to read.
Syntax: readfile.on('end', function() { });
error: The 'error' event is fired when an error occurs while writing or receiving data.
Syntax: readfile.on('error', function(err) { });
finish: The 'finish' event is fired by a writable stream when every chunk of data has been flushed.
Syntax: writefile.on('finish', function() { });
Readable Streams
A readable stream allows you to read the data from the source. A source can be anything like a file in a file system, memory buffer, another stream, etc. These streams are event emitters and can emit many events at various points.
Reading from Stream
The best way to read data from a stream is to listen for 'data' events and attach a callback. Whenever a chunk of data is available, the readable stream emits a 'data' event and your callback executes. Once all the data has been read, the stream emits an 'end' event.
First, create a text file named 'read-data.txt' and put the following content in it:
Hello !!! Welcome to pabbly.com
The tutorials for the beginners.
Then create a js file named read.js and write the following code:
var fs = require('fs');
var readfile = fs.createReadStream('read-data.txt');
var data = '';
readfile.on('data', function(chunk) {
data += chunk;
});
readfile.on('end', function() {
console.log(data);
});
readfile.on('error', function(err) {
console.log(err.stack);
});
console.log("Reading Done");
- First, we include the 'fs' module, which contains the functionality required to create a stream.
- Next, we use the createReadStream() method to create a readable stream, passing the location of our 'read-data.txt' file.
- We attach a callback to the 'data' event with readfile.on('data', ...). Whenever a chunk of data arrives on the stream, this callback is executed.
- The callback appends each chunk (read from the data stream) to the data variable, converting it to a string.
- Finally, when the 'end' event fires, the accumulated string is printed to the console.
Run the read.js code to view the result:
C:\Users\Magnet Brains\Desktop\node.js streams>node read.js
Reading Done
Hello !!! Welcome to pabbly.com
The tutorials for the beginners.
Notice that "Reading Done" appears first: console.log() runs immediately, while the stream's 'data' and 'end' callbacks fire asynchronously afterwards.
Another way to read from a stream is to repeatedly call the read() function on it until every chunk of data has been read.
Create a js file named read1.js and write the following code:
read1.js – readfile.read()
var fs = require('fs');
var readfile = fs.createReadStream('read-data.txt', 'utf8');
var data = '';
var chunk;
readfile.on('readable', function() {
while ((chunk = readfile.read()) != null) {
data += chunk;
}
});
readfile.on('end', function() {
console.log(data)
});
The read() function reads data from the internal buffer and returns it. When no data is available, it returns null. The while loop therefore checks for null to know when to stop reading.
Writable Streams
A writable stream allows you to write data to a destination. Like readable streams, writable streams are also event emitters. write() and end() are the two main functions implemented by a writable stream; write() returns true if the data was handled immediately, or false if it was buffered internally, in which case you should wait for the 'drain' event before writing more.
Writing to Stream
We need to call the write() function to write data to a writable stream.
Create a js file named write.js and write the following code:
write.js – fs.createWriteStream()
var fs = require("fs");
var writefile = fs.createWriteStream('write-data.txt');
var data = 'Welcome to pabbly.com';
writefile.write(data, 'utf8');
writefile.end();
writefile.on('finish', function() {
console.log("Write completed.");
});
writefile.on('error', function(err) {
console.log(err.stack);
});
- First, we include the 'fs' module, which contains the functionality required to create a stream.
- Next, we use the createWriteStream() method to create a writable stream. This also creates an empty file 'write-data.txt' in the current directory.
- Then we assign a string to the data variable; this is what will be written to the 'write-data.txt' file.
- We pass data to writefile.write(), setting utf8 encoding on the stream. The stream takes care of writing the data to the 'write-data.txt' file.
Run the write.js code to view the result:
C:\Users\Magnet Brains\Desktop\node.js streams>node write.js
Write completed.
When you open the write-data.txt file, you will see the following data:
Welcome to pabbly.com
Piping the Streams
In a Node.js application, streams can be joined together using the pipe() method. Piping is a mechanism that lets us read data from a source and write it to a destination without managing the flow ourselves.
Create a js file named pipe-stream.js and write the following code:
pipe-stream.js – readfile.pipe(writefile)
var fs = require("fs");
var readfile = fs.createReadStream('read-data.txt');
var writefile = fs.createWriteStream('write-data.txt');
readfile.pipe(writefile);
console.log("Piping done");
- First, we create 'readfile' on the 'read-data.txt' file, which contains the data we want to transfer to a new file, 'write-data.txt'.
- Next, we create 'writefile' on the empty file 'write-data.txt', the destination the data will be transferred to.
- Then we use pipe() to connect the two streams and transfer the data from 'readfile' to 'writefile'.
pipe() pushes every chunk that comes from the read stream into the write stream.
Run the pipe-stream.js code to view the result:
C:\Users\Magnet Brains\Desktop\node.js streams>node pipe-stream.js
Piping done
When you open the write-data.txt file, you will see the following data:
Hello !!! Welcome to pabbly.com
The tutorials for the beginners.
Chaining Streams
Chaining is a mechanism for creating a chain of multiple stream operations by connecting the output of one stream to the input of another. It is normally used together with piping.
Assume you have a text document 'read-data.txt' and you want to compress it. There are various ways, but the simplest is to combine chaining and piping.
Create a js file named chain-zip.js and write the following code:
chain-zip.js – zlib.createGzip()
var fs = require("fs");
var zlib = require('zlib');
fs.createReadStream('read-data.txt')
.pipe(zlib.createGzip())
.pipe(fs.createWriteStream('read-data.txt.gz'));
console.log("File is Compressed.");
The zlib.createGzip() method is used to compress the 'read-data.txt' file into 'read-data.txt.gz'.
First, we create a simple readable stream from 'read-data.txt'. Then we pipe this stream through zlib.createGzip() to compress the content, and finally pipe the result into a writable stream for 'read-data.txt.gz'.
Run the chain-zip.js code to view the result:
C:\Users\Magnet Brains\Desktop\node.js streams>node chain-zip.js
File is Compressed.
You will find that the 'read-data.txt' file has been compressed and a new file 'read-data.txt.gz' has been created in the current working directory.
Now, to decompress the same file, create a js file named chain-unzip.js and write the following code:
chain-unzip.js – zlib.createGunzip()
var fs = require("fs");
var zlib = require('zlib');
// Decompress the file read-data.txt.gz to read-data.txt
fs.createReadStream('read-data.txt.gz')
.pipe(zlib.createGunzip())
.pipe(fs.createWriteStream('read-data.txt'));
console.log("File is Decompressed.");
The zlib.createGunzip() method is used to decompress the 'read-data.txt.gz' file back into 'read-data.txt'.
First, we create a simple readable stream from 'read-data.txt.gz'. Then we pipe this stream through zlib.createGunzip() to unzip the content, and finally pipe the result into a writable stream for 'read-data.txt'.
Run the chain-unzip.js code to view the result:
C:\Users\Magnet Brains\Desktop\node.js streams>node chain-unzip.js
File is Decompressed.