Prerequisite
Before jumping straight into reading and writing files, it's essential to have a basic understanding of asynchronous JavaScript.
If you're new to NodeJS, consider exploring introductory tutorials or resources to build a strong foundation before proceeding.
You may want to start from the beginning of the series for a smoother reading experience.
As discussed in the previous series parts, NodeJS provides a powerful built-in module called `fs` (short for file system) for interacting with the file system. This module allows us to perform various file-related operations, including reading from and writing to files.
The main focus of this part is to explore the basics of reading from and writing to files in NodeJS.
Quick Overview:
- Asynchronous I/O: Node is designed around asynchronous I/O operations, which means that file operations such as reading and writing are non-blocking. This allows Node.js to handle multiple file operations concurrently without waiting for each operation to complete before moving on to the next one.
- Event-Driven Architecture: Node uses an event-driven architecture, where file I/O operations trigger events that can be handled asynchronously. This allows for efficient and scalable handling of file operations, especially in scenarios with high concurrency and large numbers of file requests.
- Buffers and Streams: Node uses buffers and streams to efficiently read and write data from and to files. Buffers are temporary storage areas used to hold data during file operations, while streams provide a continuous flow of data between a source and a destination, allowing for efficient handling of large files.
Reading Files:
Reading files is a common task in Node.js applications, whether it's loading configuration files, parsing data from external sources, or serving content to clients.
Node.js offers several methods for reading files, each with its own strengths and use cases.
Methods for reading files and their use cases
There are many ways to read files in Node, but here we'll discuss only the most basic and widely used ones.
1. readFile(): readFile(path[, options], callback)
Reads the contents of a file asynchronously, providing the file contents in a callback function.
Use Cases: Ideal for reading small to medium-sized files where asynchronous operation is preferred, such as loading configuration files or processing user input.
// index.js
const fs = require('fs');
const file = './file.txt'; // place the target file in the same directory for simplicity
fs.readFile(file, 'utf8', (err, data) => {
if (err) {
console.error('Error reading file:', err);
return;
}
console.log('File Contents (fs.readFile()):', data);
});
- path: the path to the target file file.txt
- options:
  - encoding: Specifies the encoding of the file contents (default is `utf8`).
  - flag: Specifies the file system flag used when opening the file (default is `r`).
If you're interested in reading about the various file system flags, then have a look at the official doc.
2. readFileSync(): readFileSync(path[, options])
Synchronously reads the contents of a file, blocking the event loop until file contents are read.
Use Cases: Suitable for reading small files where synchronous operation is acceptable, such as during initialization or setup tasks.
// index.js
const fs = require('fs');
const file = './file.txt'; // place the target file in the same directory for simplicity
try {
const fileContent = fs.readFileSync(file, 'utf8');
console.log('File Contents (fs.readFileSync()):', fileContent);
} catch (err) {
console.error('Error reading file synchronously:', err);
}
3. promises.readFile(): fs.promises.readFile(path[, options])
As with most async APIs in Node, since v10 we also have the promise-based fs.promises.readFile() method.
It returns a promise that resolves with the file contents, allowing cleaner asynchronous syntax with async/await.
Use Cases: Preferred for modern Node.js applications where promise-based syntax is desired, enabling better error handling and code readability.
// index.js
const fs = require('fs');
const file = './file.txt'; // place the target file in the same directory for simplicity
fs.promises.readFile(file, 'utf8')
.then(data => {
console.log('File Contents (fs.promises.readFile()):', data);
})
.catch(err => {
console.error('Error reading file with promises:', err);
});
4. createReadStream(): createReadStream(path[, options])
This method creates a readable stream for reading large files incrementally, emitting data events as chunks are read.
Use Cases: Ideal for handling large files efficiently, such as log files or media files, where reading the entire file into memory at once is not practical.
// index.js
const fs = require('fs');
const file = './file.txt'; // place the target file in the same directory for simplicity
const readStream = fs.createReadStream(file, { encoding: 'utf8' });
readStream.on('data', chunk => {
console.log('Chunk Read (fs.createReadStream()):', chunk);
});
readStream.on('error', err => {
console.error('Error reading file with stream:', err);
});
Writing Files
Writing files is essential for generating reports, storing user-generated content, and persisting data to disk.
Node provides various methods for writing data to files, each with its own advantages and considerations.
Let's start with the simplest way to write to a file.
1. writeFile(): writeFile(file, data[, options], callback)
This method writes data to a file asynchronously, overwriting existing content if the file already exists.
Use Cases: Suitable for writing small to medium-sized files where asynchronous operation is preferred, such as saving user preferences or logging application events.
// index.js
const fs = require('fs');
const file = './file.txt'; // place the target file in the same directory for simplicity
const contentToWrite = 'This is a sample text about to be written to file.txt!';
fs.writeFile(file, contentToWrite, 'utf8', err => {
if (err) {
console.error('Error writing file:', err);
return;
}
console.log('File written successfully (fs.writeFile())');
});
- Options: Here are some options to use
  - encoding: Specifies the encoding of the data to be written (default is `utf8`).
  - mode: Specifies the file mode (permissions) of the file (default is `0o666`).
  - flag: Specifies the file system flag used when opening the file (default is `'w'`). Again, you may find the complete list here.
Run the above code and check the newly created file.txt. Then update contentToWrite and run the code again. This confirms that the write operation overwrites the existing file's content.
2. writeFileSync(): writeFileSync(file, data[, options])
This method synchronously writes data to a file, and just like its readFileSync counterpart, it blocks the event loop until the data is written to the file.
Use Cases: Suitable for writing small files where synchronous operation is acceptable, such as during initialization or setup tasks.
// index.js
const fs = require('fs');
const file = './play.txt'; // place the target file in the same directory for simplicity
const contentToWriteSync = 'This is a sample text about to be written to play.txt synchronously!';
try {
fs.writeFileSync(file, contentToWriteSync, 'utf8');
console.log('File written successfully synchronously (fs.writeFileSync())');
} catch (err) {
console.error('Error writing file synchronously:', err);
}
NB: You can also use the promise-based fsPromises.writeFile() method offered by the fs/promises module:
// index.js
const fs = require('fs');
const file = './play.txt'; // place the target file in the same directory for simplicity
const contentToWrite = 'This is a sample text about to be written to play.txt asynchronously with promise pattern!';
fs.promises.writeFile(file, contentToWrite, 'utf8')
.then(() => {
console.log('File written successfully (fs.promises.writeFile())');
})
.catch(err => {
console.error('Error writing file:', err);
});
NB: For all write operations, using any of the flags (`w`, `w+`, `a`, `a+`) will create a new file if it doesn't already exist. However, using `r+` will throw an exception if the file doesn't exist.
3. createWriteStream(): createWriteStream(path[, options])
This method creates a writable stream for writing data to a file incrementally, allowing for efficient handling of large files.
Use Cases: Ideal for writing large files efficiently, such as log files or media files, where writing the entire file at once is not practical.
// index.js
const fs = require('fs');
const file = './file.txt'; // place the target file in the same directory for simplicity
const dataToWrite = 'Data written to file.txt using stream\n';
const writeStream = fs.createWriteStream(file, { encoding: 'utf8' });
writeStream.on('finish', () => {
console.log('Writing to file.txt completed!');
});
writeStream.on('error', err => {
console.error('Error writing file with stream:', err);
});
writeStream.write(dataToWrite);
writeStream.end();
Appending content to a file
Sometimes, we really don't want to overwrite the existing content, but rather add to it. Let's see what should be done in such cases.
appendFile(): appendFile(path, data[, options], callback)
This method appends data to a file asynchronously, creating the file if it doesn't exist.
Use Cases: Ideal for scenarios where data needs to be appended to an existing file, such as logging application events or writing to a shared log file.
// index.js
const fs = require('fs');
const file = './file.txt'; // place the target file in the same directory for simplicity
const contentToAppend = '\nAppending some extra content to what we already have in file.txt!';
fs.appendFile(file, contentToAppend, 'utf8', err => {
if (err) {
console.error('Error appending to file:', err);
return;
}
console.log('Data appended successfully (fs.appendFile())');
});
We also have a synchronous version of the above, appendFileSync(), which synchronously appends data to a file, blocking the event loop until the data is appended.
// index.js
const fs = require('fs');
const file = './file.txt'; // place the target file in the same directory for simplicity
const contentToAppendSync = '\nAppending some extra content to what we already have in file.txt synchronously!';
try {
fs.appendFileSync(file, contentToAppendSync, 'utf8');
console.log('Data appended successfully synchronously (fs.appendFileSync())');
} catch (err) {
console.error('Error appending to file synchronously:', err);
}
Conclusion
We've covered the basics of reading and writing files in NodeJS. Understanding how file I/O works in Node and the various methods available for reading from and writing to files is essential for building robust and efficient applications.
In the next part, we'll conclude this chapter by working with directories in Node.