The Node.js Stream API lets you work with data as a sequence of chunks, enabling efficient processing of large datasets. Because data is handled incrementally rather than all at once, streams are particularly useful for large files and network operations. Node.js provides four types of streams: Readable, Writable, Duplex, and Transform.
Here's a basic overview of how to use the Node.js Stream API for efficient data processing:
```javascript
const fs = require('fs');
const stream = require('stream');
```
Use `fs.createReadStream` to create a readable stream for reading data from a file. You can also create a custom Readable stream if your data source is different.

```javascript
const readableStream = fs.createReadStream('input.txt');
```
Use `fs.createWriteStream` to create a writable stream for writing data to a file. You can also create a custom Writable stream if you want to process the data differently.

```javascript
const writableStream = fs.createWriteStream('output.txt');
```
Use the `pipe` method to connect the readable stream to the writable stream. This allows data to flow from the readable stream to the writable stream efficiently in chunks.

```javascript
readableStream.pipe(writableStream);
```
Handle events to respond to different stages of the stream: `'data'` fires for each chunk, `'end'` when there is no more data to read, and `'error'` when something goes wrong. Note that attaching a `'data'` listener switches the stream into flowing mode.
```javascript
readableStream.on('data', (chunk) => {
  // Process each chunk of data
  console.log('Received chunk:', chunk);
});

readableStream.on('end', () => {
  // All data has been read
  console.log('Read operation complete');
});

readableStream.on('error', (err) => {
  // Handle errors
  console.error('Error:', err);
});
```
If you need to modify the data as it passes through the stream, you can use a Transform stream. Create a custom Transform stream and implement the `_transform` method.
```javascript
const transformStream = new stream.Transform({
  transform(chunk, encoding, callback) {
    // Modify the data as needed
    const modifiedChunk = chunk.toString().toUpperCase();
    this.push(modifiedChunk);
    callback();
  }
});

readableStream.pipe(transformStream).pipe(writableStream);
```
This is a basic example; the Node.js Stream API offers many more features, such as backpressure control, object mode, and async iteration. Streams are particularly useful for processing large datasets without loading the entire content into memory, which reduces memory consumption and improves performance.