Handling Large Files in Node.js
In the previous exercise, we went over how to use the fs module to read file contents in Node.js. This works well for small files, roughly anything under 100 MB. With large files (larger than about 2 GB), it may not even be possible to read the entire file using fs.readFile. Consider the following scenario.
You are given a 20 GB text file and you need to process the data in the file line by line and write the output into an output file. Your computer only has 8 GB of memory.
When you use fs.readFile, Node.js attempts to read the entire content of the file into the computer's memory. In this case, that is not possible: the machine does not have enough memory installed to hold the whole 20 GB file at once. We need a different approach. To process large files, we use streams.
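Before looking at streams, it helps to see the approach that does not scale. The sketch below (the file name input.txt is just a placeholder for illustration) buffers the whole file in memory, so for a 20 GB file it would fail with an out-of-memory or buffer-size error:

```js
const fs = require('fs');

// Naive approach: fs.readFile buffers the ENTIRE file in memory.
// For a 20 GB file this exceeds both the available RAM and Node's
// maximum buffer size, so the callback receives an error instead of data.
fs.readFile('input.txt', 'utf8', (err, data) => {
  if (err) {
    console.error('Could not read the file:', err.message);
    return;
  }
  console.log(`Read ${data.length} characters`);
});
```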
A stream is an interesting concept in programming. It treats data as a sequence of chunks that flow through your program over time, so only a small piece of the file needs to be held in memory at any given moment.
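As a minimal sketch of the scenario above, the example below reads the input as a stream with fs.createReadStream, splits it into lines with Node's built-in readline module, and writes the result to an output stream. The file names and the uppercase transformation are placeholders; substitute your own processing step:

```js
const fs = require('fs');
const readline = require('readline');

async function processFile(inputPath, outputPath) {
  // Read the input as a stream so only small chunks live in memory at once.
  const input = fs.createReadStream(inputPath);
  const output = fs.createWriteStream(outputPath);

  // readline emits one line at a time from the underlying stream.
  const rl = readline.createInterface({ input, crlfDelay: Infinity });

  for await (const line of rl) {
    // Placeholder transformation: replace with your real line-by-line logic.
    output.write(line.toUpperCase() + '\n');
  }

  output.end();
}

processFile('input.txt', 'output.txt').catch(console.error);
```

Note that this sketch does not handle backpressure on the output stream; a production version would wait for the 'drain' event when output.write returns false, or use stream.pipeline with a Transform stream.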