
Transform Streams in Node.js

Published: Jul 12, 2024

Last updated: Jul 12, 2024

Overview

At this point in our Node.js Streams blog series, we have covered the core fundamentals you should know about Node.js streams and dived into Readable streams, Writable streams, and Duplex streams in their own posts.

In this shorter blog post, we will cover the fourth and final fundamental stream type: Transform streams.

How to think about Transform streams

Transform streams are a special type of Duplex stream (which we covered in the last blog post). We use Transform streams for transforming data (very apt naming) as it passes through the stream.

In the Node.js streams fundamentals blog post, the mental model we used for Transform streams was the water treatment plant.

The water treatment plant is our mental model for Transform streams in Node.js

Some parallels with the analogy to make it memorable:

  1. Dual Nature: Transform streams are both readable and writable. A water treatment plant receives untreated water (input) and outputs clean water (output).
  2. Data Transformation: Modifies input data before passing it as output. Similarly, various processes in the plant (filtration, chemical treatment) change the water's composition.
  3. Streaming Processing: Handles data in chunks, not all at once. We can think of this like how water flows continuously through the plant, not in single large batches.
  4. Backpressure Handling: Manages flow when output is slower than input. Reservoirs or holding tanks regulate water flow when treatment is slower than intake.
  5. Piping: Can be easily connected to other streams. Treatment plants can likewise be connected to various water sources and distribution systems.
  6. Error Handling: Manages issues during transformation. This is like the safety protocols and backup systems in the plant that handle treatment failures.

These analogies help tie together the core concepts of Transform streams, as well as a few of the fundamentals that we covered at the start of this series.

With that out of the way, let's see how we can create Transform streams in Node.js.

How to create Transform streams

Similar to all the previous posts, we can create Transform streams using the node:stream module in a few ways:

  1. Using the stream.Transform class: We can extend the stream.Transform class to create our Transform stream.
  2. Using the stream.Transform constructor: We can use the stream.Transform constructor to create a Transform stream.

Extending the stream.Transform class

If we extend the stream.Transform class, we can provide a _transform method that will be called for each chunk of data that passes through the stream.

const { Transform } = require("stream");

class UpperCaseTransform extends Transform {
  constructor() {
    super();
  }

  _transform(chunk, encoding, callback) {
    try {
      this.push(chunk.toString().toUpperCase());
      callback();
    } catch (error) {
      callback(error);
    }
  }
}

const upperCaseTransform = new UpperCaseTransform();

process.stdin.pipe(upperCaseTransform).pipe(process.stdout);

In the above example, we pipe the process.stdin readable stream to our UpperCaseTransform stream, which converts the input to uppercase. We then pipe the output to process.stdout.

Using the stream.Transform constructor

We could also have implemented the stream by calling new Transform directly:

const { Transform } = require("stream");

const upperCaseTransform = new Transform({
  transform(chunk, encoding, callback) {
    this.push(chunk.toString().toUpperCase());
    callback();
  },
});

process.stdin.pipe(upperCaseTransform).pipe(process.stdout);

When creating an instance of a Transform stream, we supply a transform function that will be called for each chunk of data that passes through the stream.

The above code works the same as the previous example, but it's a bit more concise.

Use cases for Transform streams

The use cases for Transform streams are a subset of those we covered for Duplex streams in the prior post.

I won't recap the entire list here, but here are a few examples of where you might use Transform streams:

Compression streams

With zlib, we can create a gzip compression Transform stream. We've covered this example before in the Writable streams blog post.

const zlib = require("node:zlib");

const gzip = zlib.createGzip();
// gzip is a Transform stream (both readable and writable)

Here is the more complete example where we compress a file using gzip and write it out to a new file:

const fs = require("node:fs");
const zlib = require("node:zlib");

const readStream = fs.createReadStream("input.txt");
const writeStream = fs.createWriteStream("input.txt.gz");
const gzip = zlib.createGzip();

readStream.pipe(gzip).pipe(writeStream);

writeStream.on("finish", () => console.log("File successfully compressed"));

Crypto streams

We've also covered this example before in the Writable streams blog post.

const crypto = require("crypto");

// Note: crypto.createCipher is deprecated; prefer crypto.createCipheriv,
// as used in the complete example below.
const cipher = crypto.createCipher("aes192", "secret");
// cipher is a Transform stream (both readable and writable)

The Transform stream here is the cipher stream. We can use it to encrypt data as it passes through the stream.

const crypto = require("node:crypto");
const fs = require("node:fs");

const algorithm = "aes-192-cbc";
const password = "Password used to generate key";
const key = crypto.scryptSync(password, "salt", 24);
const iv = Buffer.alloc(16, 0);

const cipher = crypto.createCipheriv(algorithm, key, iv);

const input = fs.createReadStream("input.txt");
const output = fs.createWriteStream("encrypted.txt");

input.pipe(cipher).pipe(output);

output.on("finish", () => console.log("File encrypted successfully"));

Conclusion

Today's post is a short one that covers the special type of Duplex stream: Transform streams.

We've covered the water treatment plant analogy to help you remember the core concepts of Transform streams, how to create them, and a few use cases where you might use them. Now that your understanding of Node.js streams is becoming more complete, it's worth reviewing the previous posts in this series to see how Transform streams fit into the broader picture.

In the next post, we will cover the pipeline API in Node.js, which is a powerful way to connect streams together.

Resources and further reading

Disclaimer: This blog post used AI to generate the images used for the analogy.

Photo credit: blendertimer


Dennis O'Keeffe

@dennisokeeffe92
  • Melbourne, Australia

Hi, I am a professional Software Engineer. Formerly of Culture Amp, UsabilityHub, Present Company and NightGuru.
I am currently working on Visibuild.

