Streams are a commonly used class in Node.js. They allow you, the developer, to send data in chunks rather than all at once to a file or some other output. Today, we'll talk about the types of streams, important stream methods, and the benefits of streams. Let's start with stream types.
There are four types of streams: readable, writable, duplex, and transform. Readable streams read data into the stream's internal buffer, and writable streams write data out to a destination. Duplex streams can both read and write. Finally, transform streams are duplex streams that can transform data as it passes through. For most developers, readable and writable streams will be the most used. Next, let's talk about methods.
Streams are based on the EventEmitter class in Node.js. As a result, streams are event-driven: when a certain action is taken on a stream, such as data arriving, the stream emits an event. A developer can listen for these events with event handlers to perform other operations when they are emitted. For most developers, that means you will be working with event handlers alongside the stream methods for each stream type. Let's start with writable streams.
Writable streams represent output; their important methods are write, setDefaultEncoding, end, and destroy. The write method writes data to the stream; this is similar to writing a message to your console output. Next is setDefaultEncoding, which changes the encoding used when writing data; in most cases, this is "utf8". The end method writes any remaining data to the stream and prevents further calls to write. Finally, there is destroy, which destroys the stream entirely.
Now, writable streams are only one part of the equation; readable streams are the next important one.
Readable streams act as input; they read data into an internal buffer like any other stream. As such, they have these important methods: read, setEncoding, pause, resume, pipe, unpipe, and destroy. The read method pulls data out of the internal buffer. setEncoding works like the writable stream's setDefaultEncoding, but for data being read. pause and resume stop and restart the flow of data, respectively. pipe and unpipe attach or detach writable streams that data will be passed to. Finally, destroy destroys the stream.
These are the important methods you'll need with streams in most situations. Both duplex and transform streams have all of these methods too, but you'll use those stream types far less often. Now, let's get down to the benefits of streams.
Streams are beneficial when working with large amounts of data. By using streams, you can open channels to send data through: piping data to multiple files at the same time, live-streaming music to keep load times low, or processing video while streaming it through your application or game. In essence, streams reduce load times and overhead for your projects, and they let you manipulate data in memory-constrained environments. Here's a fun example of using streams to write to the console and multiple files:
In this example, we use the "fs" (file system) module to read and write files. We create writable streams in the form of two test files, use process.stdin (standard input) as our readable stream, and pipe the data in its internal buffer directly into our three writable streams (filestream1, filestream2, and process.stdout). Then, to end our input while the program is running, we signal end-of-input with Ctrl+D (Ctrl+Z then Enter on Windows), or simply press Ctrl+C to terminate the program.