Welcome back to Node.js Streaming 101! 🚀
If you haven't read Part 1 (Intro to Streams) or Part 2 (Duplex Streams), I highly recommend checking them out first. Today, we're talking about one of the most powerful and versatile types of streams in Node.js: Transform Streams.
🔄 What Are Transform Streams?
A Transform Stream is a special kind of Duplex Stream that modifies the data as it passes through. Think of it like a real-time filter:
- Input: A stream of text
- Processing: Convert it to uppercase
- Output: The modified text
Unlike Duplex Streams (whose readable and writable sides carry separate, unrelated data), Transform Streams take in data, modify it, and send it forward, all in one flow.
🔥 Where Are Transform Streams Used?
- Compression & Decompression 📦 (e.g., Gzip, Brotli)
- Encryption & Decryption 🔐 (e.g., AES, hashing)
- Data Processing 📊 (e.g., converting text formats, JSON transformation)
- Real-time Modifications ✏️ (e.g., replacing words in a text stream)
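To make the last use case concrete, here is a minimal sketch of a word-replacing Transform Stream. The `ReplaceTransform` name and the cat/dog example are illustrative, not from the series; note also that a naive per-chunk replace can miss a word split across two chunks, which real code would handle by buffering a partial tail.

```javascript
import { Transform } from "stream";

// Illustrative example: replace every occurrence of one word with another.
// Caveat: this replaces per chunk, so a word split across chunk boundaries
// would be missed; production code would buffer a partial word between chunks.
class ReplaceTransform extends Transform {
  constructor(from, to) {
    super();
    this.from = from;
    this.to = to;
  }

  _transform(chunk, encoding, callback) {
    this.push(chunk.toString().replaceAll(this.from, this.to));
    callback();
  }
}
```

You could drop this into a pipe chain the same way as any other Transform, e.g. `process.stdin.pipe(new ReplaceTransform("cat", "dog")).pipe(process.stdout)`.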
🚀 Creating a Simple Transform Stream
Let's start by building a basic Transform Stream that converts text to uppercase.
```javascript
import { Transform } from "stream";

class UppercaseTransform extends Transform {
  _transform(chunk, encoding, callback) {
    this.push(chunk.toString().toUpperCase());
    callback();
  }
}

const transformStream = new UppercaseTransform();
process.stdin.pipe(transformStream).pipe(process.stdout);
```
🔍 What's Happening?
- We create a custom Transform Stream (`UppercaseTransform`).
- `_transform(chunk, encoding, callback)` converts the text to uppercase and pushes the transformed data forward.
- We pipe `process.stdin` → `transformStream` → `process.stdout`.
- Anything typed into the terminal gets converted to uppercase in real time.
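Subclassing isn't the only option, by the way. For quick one-off transforms, the `stream` module also lets you pass a `transform` function directly to the `Transform` constructor; a minimal sketch of the same uppercase logic:

```javascript
import { Transform } from "stream";

// Same uppercase behavior as UppercaseTransform, but built with the
// constructor's `transform` option instead of a subclass.
// Passing data as the callback's second argument pushes it downstream.
const uppercase = new Transform({
  transform(chunk, encoding, callback) {
    callback(null, chunk.toString().toUpperCase());
  },
});
```

This `uppercase` instance can replace `transformStream` in the pipe chain above unchanged.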
🧪 Try It Out!
- Run the script (`node transform-stream.mjs`).
- Type something and hit Enter.
- The text will be converted to uppercase!

```
hello world
HELLO WORLD
```
Boom! 🎉 You've just created your first Transform Stream.
🔗 Using Transform Streams with pipeline()
As we saw in Part 1, `pipeline()` makes handling streams much cleaner and manages error handling automatically.
Let's refactor our uppercase stream using `pipeline()`:
```javascript
import { Transform } from "stream";
import { pipeline } from "stream/promises";

class UppercaseTransform extends Transform {
  _transform(chunk, encoding, callback) {
    this.push(chunk.toString().toUpperCase());
    callback();
  }
}

async function processStream() {
  const transformStream = new UppercaseTransform();
  try {
    await pipeline(process.stdin, transformStream, process.stdout);
  } catch (err) {
    console.error("Pipeline failed:", err);
  }
}

processStream();
```

Note that `Transform` comes from `stream`, while the promise-based `pipeline` comes from `stream/promises`.
✅ Why use `pipeline()`?
- Handles errors automatically.
- Prevents memory leaks.
- Supports async/await, making it modern and readable.
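There's a bonus: `pipeline()` also accepts async generator functions as transform steps, which can be even lighter than a `Transform` subclass for simple cases. A small sketch (the `toUpper` generator name is illustrative):

```javascript
import { pipeline } from "stream/promises";
import { Readable, Writable } from "stream";

// An async generator works as a transform step in pipeline():
// it consumes the upstream chunks and yields transformed ones.
async function* toUpper(source) {
  for await (const chunk of source) {
    yield chunk.toString().toUpperCase();
  }
}

// Illustrative sink that collects output into a string.
let result = "";
const sink = new Writable({
  write(chunk, encoding, callback) {
    result += chunk;
    callback();
  },
});

await pipeline(Readable.from(["hello ", "world"]), toUpper, sink);
```

Same backpressure and error handling as before, with no class boilerplate.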
⚡ Real-World Example: Compressing Files with Gzip
A practical use of Transform Streams is file compression using Gzip.
```javascript
import { createReadStream, createWriteStream } from "fs";
import { pipeline } from "stream/promises";
import { createGzip } from "zlib";

async function compressFile() {
  try {
    await pipeline(
      createReadStream("largefile.txt"),
      createGzip(), // Transform stream compresses the data
      createWriteStream("largefile.txt.gz")
    );
    console.log("File compressed successfully!");
  } catch (err) {
    console.error("Compression failed:", err);
  }
}

compressFile();
```
🔍 What's Happening?
- Read the file → compress with Gzip → write to a new file.
- Uses `createGzip()`, which is a built-in Transform Stream.
- `pipeline()` automatically handles backpressure & errors.
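Decompression is just the mirror image with `createGunzip()`. A quick way to convince yourself the compression step is lossless is an in-memory round trip; the `roundTrip` helper here is illustrative, not part of the file example above:

```javascript
import { pipeline } from "stream/promises";
import { createGzip, createGunzip } from "zlib";
import { Readable, Writable } from "stream";

// Pipe text through gzip and then gunzip; both zlib streams are
// Transform streams, and the output should equal the input.
async function roundTrip(text) {
  let out = "";
  const sink = new Writable({
    write(chunk, encoding, callback) {
      out += chunk;
      callback();
    },
  });
  await pipeline(Readable.from([text]), createGzip(), createGunzip(), sink);
  return out;
}
```

For files, you would swap `Readable.from(...)` and the sink for `createReadStream("largefile.txt.gz")` and a `createWriteStream(...)`, exactly as in the compression example.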
🎯 Final Thoughts
Transform Streams are incredibly powerful for modifying data on the fly. Whether you're compressing files, encrypting data, or processing real-time input, they give you full control over the data flow.
Missed the earlier parts? Check out:
Let me know if you have any questions, and happy coding! 🚀