Streams in Node.js provide an efficient way to handle data flow, especially when dealing with large amounts of data.
Streams are objects that let you read or write data chunk by chunk, instead of loading the entire data into memory.
Node.js provides a built-in stream module to work with streams.
Concepts and Examples related to Node.js streams:
There are four types of streams in Node.js:
Readable Streams: Used for reading data (e.g., reading from a file, HTTP request, or other sources).
Writable Streams: Used for writing data (e.g., writing to a file or sending an HTTP response).
Duplex Streams: Both readable and writable (e.g., TCP sockets).
Transform Streams: A type of duplex stream that allows data to be modified as it is read or written.
Event-Based API: Streams emit events such as 'data', 'end', and 'error' to notify consumers of data availability, finish, or encountered errors.
Readable Stream Methods: Methods like `read()`, `pause()`, `resume()`, and `pipe()` allow us to control the flow of data from readable streams.
Writable Stream Methods: Methods like `write()`, `end()`, and `destroy()` enable us to write data to writable streams and manage their lifecycle.
Piping: The `pipe()` method allows us to connect readable streams to writable streams, enabling seamless data transfer between them.
We can create a readable stream to read data from a file using the `fs.createReadStream()` method from the Node.js `fs` module.
Data can then be consumed by attaching listeners for the `data`, `end`, and `error` events.
```js
const fs = require('fs');

const readableStream = fs.createReadStream('readfile.txt');

// Using event listeners for all tasks in this section.
// We could also use methods like read(), pause(), resume(), and pipe().
readableStream.on('data', (chunk) => {
  console.log(`Received ${chunk.length} bytes of data.`);
});

readableStream.on('end', () => {
  console.log('No more data.');
});

readableStream.on('error', (err) => {
  console.error('Error Message:', err);
});
```
Creating a writable stream to write data to a file.
We can write data to a writable stream using methods like `write()` or piping data from a readable stream.
```js
const fs = require('fs');

const writableStream = fs.createWriteStream('createfile.txt');

writableStream.write('Hello World, Node.js!');
writableStream.end(); // signal that no more data will be written

writableStream.on('finish', () => {
  console.log('Write operation finished.');
});
```
"Duplex stream" reads data from one stream and sends it to another stream also known as a piping stream.
Reading data from a readable stream and transferring it to a writable stream.
We can use the `fs.createReadStream()` and `fs.createWriteStream()` methods together to read and write simultaneously, and bind the two streams with the `pipe()` method.
```js
const fs = require('fs');

const readableStream = fs.createReadStream('input.txt');
const writableStream = fs.createWriteStream('output.txt');

// Piping data from the readable stream to the writable stream
readableStream.pipe(writableStream);
```
A transform stream is a special duplex stream that modifies data as it passes through, for example converting text to uppercase.
```js
const { Transform } = require('stream');

const upperCaseTransform = new Transform({
  transform(chunk, encoding, callback) {
    const uppercased = chunk.toString().toUpperCase();
    callback(null, uppercased);
  },
});

// Pipe stdin through the transform to stdout.
process.stdin.pipe(upperCaseTransform).pipe(process.stdout);
```