Node.js Built-in Modules
Here's a list of 20 key Node.js built-in modules that are important for Node.js development. Each module is briefly described to provide an understanding of its use case.
1. fs:
The fs (File System) module provides an API for interacting with the file system in a manner closely modeled around standard POSIX functions. It allows you to read from and write to files, watch for file changes, and perform other file-based operations.
Basic fs Example:
Here's a simple example of using the fs module to read a file asynchronously and log its contents to the console.
const fs = require('fs');
// Asynchronously read the contents of a file
fs.readFile('example.txt', 'utf8', (err, data) => {
if (err) {
console.error('Error reading file:', err);
return;
}
console.log(data);
});
In this basic example, fs.readFile is used to read the contents of example.txt asynchronously. If the operation is successful, the contents of the file are logged to the console.
Advanced fs Example: For a more advanced use case, you might want to watch for changes in a directory and perform actions when a file is added, changed, or removed.
const fs = require('fs');
// Watch the directory for changes
fs.watch('path/to/directory', (eventType, filename) => {
console.log(`Event type is: ${eventType}`);
if (filename) {
console.log(`Filename provided: ${filename}`);
// Perform actions depending on the event type and filename
if (eventType === 'rename') {
// A file has been added or deleted
console.log('A file has been added or deleted');
} else if (eventType === 'change') {
// A file has been modified
console.log('A file has been modified');
}
} else {
console.log('Filename not provided');
}
});
In the advanced example, fs.watch is used to monitor a directory for changes. When a file within the directory is added, removed, or changed, the callback function is called with the type of event and the name of the file that was affected. This allows for more dynamic and responsive file system operations, such as auto-updating services or live-reload features.
2. path:
The path module provides utilities for working with file and directory paths. It can perform a variety of operations, such as joining multiple path segments, resolving relative paths to absolute paths, and normalizing paths to remove redundant segments.
Basic Example:
For basic file path operations like joining and normalizing, here’s how you could use the path module:
- Use path.join() to concatenate path segments into a single path.
- Use path.normalize() to resolve any ".." or "." segments.
import path from 'path';
// Joining path segments
const directory = 'users';
const fileName = 'john.txt';
const fullPath = path.join(directory, 'subdir', '..', fileName);
console.log(fullPath); // Outputs: 'users/john.txt'
// Normalizing a path
const weirdPath = 'users/john/../.././jane.txt';
const normalizedPath = path.normalize(weirdPath);
console.log(normalizedPath); // Outputs: 'jane.txt'
In this example, path.join() is used to create a path string, intelligently managing separators, and path.normalize() is used to clean up a path that contains redundant parts.
Advanced Example:
In more advanced scenarios, you might use the path module to resolve relative paths to absolute paths, extract parts of the path, or handle cross-platform path issues.
- Use path.resolve() to convert a relative path to an absolute path.
- Use path.dirname(), path.basename(), and path.extname() to extract different parts of a path.
- Use path.sep to get the platform-specific path segment separator.
import path from 'path';
// Resolving a relative path to an absolute path
const relativePath = 'files/john.txt';
const absolutePath = path.resolve(relativePath);
console.log(absolutePath); // Outputs an absolute path based on the current working directory
// Extracting parts of a path
const filePath = '/users/john/docs/readme.txt';
const dirName = path.dirname(filePath);
const baseName = path.basename(filePath);
const extName = path.extname(filePath);
const baseNameWithoutExt = path.basename(filePath, extName);
console.log(`Directory: ${dirName}`); // Outputs: '/users/john/docs'
console.log(`Full file name: ${baseName}`); // Outputs: 'readme.txt'
console.log(`Extension: ${extName}`); // Outputs: '.txt'
console.log(`File name without extension: ${baseNameWithoutExt}`); // Outputs: 'readme'
// Handling cross-platform path segment separator
const segments = ['users', 'john', 'docs', 'readme.txt'];
const crossPlatformPath = segments.join(path.sep);
console.log(crossPlatformPath); // Outputs a path with the correct separators for the current OS
In the advanced example:
- path.resolve() is used to convert a relative path into an absolute path by resolving it against the current working directory.
- path.dirname(), path.basename(), and path.extname() are used to extract the directory name, base file name, and file extension, respectively.
- path.sep is used to construct paths with the appropriate separator for the underlying operating system, ensuring cross-platform compatibility.
3. http:
The http module is a low-level API for network communications. It enables you to create an HTTP server that listens for requests and returns responses, and it can also be used to make HTTP client requests.
Basic HTTP Server Example Code Explanation:
import http from 'http';
// Create an HTTP server
const server = http.createServer((req, res) => {
res.writeHead(200, { 'Content-Type': 'text/plain' });
res.end('Hello, World!');
});
// The server listens on port 3000
server.listen(3000, () => {
console.log('Server running on port 3000');
});
In the basic example:
- We import the http module, which is included in Node.js by default.
- We create a server that handles HTTP requests and sends a simple text response 'Hello, World!'.
- The server is set to listen on port 3000.
Advanced HTTP Client Request Example Code Explanation:
import http from 'http';
// The options for the HTTP request
const options = {
hostname: 'example.com',
port: 80,
path: '/some/path',
method: 'GET',
};
// The HTTP request itself
const req = http.request(options, (res) => {
console.log(`STATUS: ${res.statusCode}`);
console.log(`HEADERS: ${JSON.stringify(res.headers)}`);
res.setEncoding('utf8');
res.on('data', (chunk) => {
console.log(`BODY: ${chunk}`);
});
res.on('end', () => {
console.log('No more data in response.');
});
});
req.on('error', (e) => {
console.error(`problem with request: ${e.message}`);
});
// End the request
req.end();
In the advanced example:
- We define the options object that contains the details of the HTTP request.
- We use the http.request method to make a GET request to example.com on the path /some/path.
- The response is logged to the console, including status code, headers, and body chunks as they're received.
- Any errors during the request are caught and logged.
- The req.end() call finalizes the request, signaling to the server that all of the request headers and body have been sent.
4. https:
The https module is similar to the http module, but it supports HTTP over TLS/SSL, so you can create secure servers and clients that encrypt their communication.
Basic HTTPS Server: To create a basic HTTPS server, you'll need SSL certificates. For local development, you can generate self-signed certificates. In a production environment, you would use certificates from a certificate authority (CA).
import https from 'https';
import fs from 'fs';
const options = {
key: fs.readFileSync('path/to/key.pem'),
cert: fs.readFileSync('path/to/cert.pem')
};
https.createServer(options, (req, res) => {
res.writeHead(200);
res.end('hello world\n');
}).listen(8000);
In this code, we are creating a secure server that listens on port 8000. The options object includes the SSL key and certificate, which are read from the file system.
Advanced HTTPS Server with Express:
When integrating with Express, you'll pass your app to https.createServer along with the SSL options.
import express from 'express';
import https from 'https';
import fs from 'fs';
const app = express();
// Define routes and middlewares for your express app
app.get('/', (req, res) => {
res.send('Secure Site Home Page');
});
const options = {
key: fs.readFileSync('path/to/key.pem'),
cert: fs.readFileSync('path/to/cert.pem'),
// You can include additional options like `ca` for intermediate certificates
// and `passphrase` if your private key is encrypted
};
// Create an HTTPS server with the express app and the SSL options
https.createServer(options, app).listen(8443, () => {
console.log('HTTPS server running on port 8443');
});
In the advanced example, we are integrating the HTTPS server with an Express application. This allows you to use all the features of Express while still maintaining the security benefits provided by HTTPS. The server listens on port 8443, which is commonly used for HTTPS traffic.
Redirect HTTP to HTTPS: It's a common practice to redirect all incoming HTTP traffic to HTTPS to ensure secure communication.
import http from 'http';
import express from 'express';
import https from 'https';
import fs from 'fs';
const app = express();
// Define routes and middlewares for your express app
app.get('/', (req, res) => {
res.send('Secure Site Home Page');
});
const httpsOptions = {
key: fs.readFileSync('path/to/key.pem'),
cert: fs.readFileSync('path/to/cert.pem')
};
// Create an HTTPS server
https.createServer(httpsOptions, app).listen(8443, () => {
console.log('HTTPS server running on port 8443');
});
// Create an HTTP server that redirects to HTTPS
http.createServer((req, res) => {
res.writeHead(301, { 'Location': `https://${req.headers.host}${req.url}` });
res.end();
}).listen(8080, () => {
console.log('HTTP server running on port 8080 and redirecting to HTTPS');
});
In this redirect example, an HTTP server listens on port 8080. When it receives a request, it sends a 301 Moved Permanently response, redirecting the client to the same host and URL on HTTPS.
5. os:
The os module provides utilities for operating system-related tasks, such as fetching information about the system's CPU, memory, and network interfaces.
Basic os Example:
Here's a simple example that uses the os module to log the operating system platform and the total memory available.
const os = require('os');
// Log the OS platform (e.g., 'darwin', 'win32', 'linux')
console.log('Operating System Platform:', os.platform());
// Log the total system memory
console.log('Total System Memory:', os.totalmem(), 'bytes');
In this basic example, os.platform() returns a string identifying the operating system platform, and os.totalmem() returns the total amount of system memory in bytes.
Advanced os Example: For a more advanced scenario, you might gather more detailed system information, such as the load averages, CPU details, and network interface information.
const os = require('os');
// Log the system load averages (1, 5, and 15 minutes)
console.log('Load Averages:', os.loadavg());
// Log information about each CPU core
os.cpus().forEach((cpu, index) => {
console.log(`CPU ${index}:`, cpu.model);
});
// Log the free memory available in bytes
console.log('Free Memory:', os.freemem(), 'bytes');
// Log details about each network interface
const networkInterfaces = os.networkInterfaces();
for (const name in networkInterfaces) {
console.log(`Interface ${name}:`);
networkInterfaces[name].forEach(detail => {
console.log(` Address: ${detail.address}, Family: ${detail.family}, Internal: ${detail.internal}`);
});
}
In the advanced example, os.loadavg() provides an array with the load averages, os.cpus() gives an array of objects containing information about each logical CPU core, os.freemem() returns the free system memory in bytes, and os.networkInterfaces() returns an object describing the network interfaces that have been assigned a network address. This kind of information can be very useful for creating system monitoring tools or for logging hardware details for debugging purposes.
6. querystring:
The querystring module provides utilities to parse URL query strings into readable JavaScript objects and to stringify JavaScript objects into query strings. This is handy when you need to work with data passed in URLs.
Basic Example:
For basic operations, you can parse a query string into an object and then stringify a JavaScript object back into a query string.
- Use querystring.parse() to convert a query string to an object.
- Use querystring.stringify() to convert an object to a query string.
import querystring from 'querystring';
// Parsing a query string into an object
const qs = 'name=John+Doe&age=30';
const parsedQs = querystring.parse(qs);
console.log(parsedQs); // Outputs: { name: 'John Doe', age: '30' }
// Stringifying an object into a query string
const obj = { name: 'Jane Doe', age: 25 };
const stringifiedObj = querystring.stringify(obj);
console.log(stringifiedObj); // Outputs: 'name=Jane%20Doe&age=25'
In this basic example, querystring.parse() turns a query string into an easily accessible object, and querystring.stringify() takes an object and encodes it into a URL-friendly query string format.
Advanced Example:
In more complex scenarios, you might need to handle arrays and nested objects, or customize how query strings are parsed and stringified with different encodings or delimiters.
- Use custom delimiters and encodings when parsing and stringifying.
import querystring from 'querystring';
// Parsing a query string with custom delimiters
const qsComplex = 'name=John%20Doe;age=30';
const parsedQsComplex = querystring.parse(qsComplex, ';');
console.log(parsedQsComplex); // Outputs: { name: 'John Doe', age: '30' }
// Stringifying an object into a query string with custom encoding
const objComplex = { name: 'John & Jane', age: 30 };
const stringifiedObjComplex = querystring.stringify(objComplex, null, null, {
encodeURIComponent: querystring.escape
});
console.log(stringifiedObjComplex); // Outputs: 'name=John%20%26%20Jane&age=30'
// Handling arrays during stringification
const objWithArray = { color: ['red', 'green', 'blue'] };
const stringifiedObjWithArray = querystring.stringify(objWithArray);
console.log(stringifiedObjWithArray); // Outputs: 'color=red&color=green&color=blue'
In the advanced example:
- querystring.parse() is used with a custom delimiter (;) for parsing a query string with a non-standard format.
- querystring.stringify() is customized with an encodeURIComponent function to handle special characters in the query string.
- The querystring.stringify() function is also demonstrated with an object containing an array, showing how it handles multiple values for the same key.
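Worth knowing: the Node.js documentation classifies querystring as a legacy API and points new code toward the WHATWG URLSearchParams class, which is globally available and covers the same parse/stringify ground:

```javascript
// Parse a query string ('+' is decoded as a space, as with querystring)
const params = new URLSearchParams('name=John+Doe&age=30');
console.log(params.get('name')); // 'John Doe'

// Repeated keys represent multiple values for the same name
params.append('color', 'red');
params.append('color', 'green');
console.log(params.getAll('color')); // [ 'red', 'green' ]

// Serialize back to a query string
console.log(params.toString()); // 'name=John+Doe&age=30&color=red&color=green'
```

Unlike querystring, URLSearchParams also works in browsers, so the same parsing code can be shared between client and server.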
7. stream:
The stream module provides an API for implementing the stream interface. Streams are collections of data that might not be available all at once and don't have to fit in memory, which makes them a powerful way of handling large amounts of data, or data coming from an external source one chunk at a time, without reading it all into memory first.
Basic Readable Stream Example Code Explanation:
import fs from 'fs';
// Create a readable stream from a file
const readableStream = fs.createReadStream('largefile.txt', 'utf8');
readableStream.on('data', (chunk) => {
console.log(`Received ${chunk.length} bytes of data.`);
});
readableStream.on('end', () => {
console.log('There is no more data to read.');
});
In the basic example:
- We use the fs module to create a readable stream from a file named largefile.txt.
- The data event is emitted each time a chunk of data is ready to be processed.
- The end event is emitted when there is no more data to read from the stream.
Advanced Transform Stream Example Code Explanation:
import { Transform } from 'stream';
// Create a transform stream that converts input to uppercase
const upperCaseTr = new Transform({
transform(chunk, encoding, callback) {
// Convert the chunk to a string and then to uppercase
this.push(chunk.toString().toUpperCase());
callback();
}
});
// Process.stdin is a readable stream representing the standard input
// Process.stdout is a writable stream representing the standard output
process.stdin.pipe(upperCaseTr).pipe(process.stdout);
// Now, any text input into the standard input will be converted to uppercase and output to the standard output
In the advanced example:
- We create a Transform stream, which is a type of Duplex stream that can be used to modify data as it is read and written.
- The transform method is overridden to take the input data chunk, convert it to a string, change it to uppercase, and push it back onto the stream.
- We use pipe to connect process.stdin to our transform stream, and then pipe the output of the transform stream to process.stdout.
- This effectively creates a simple command-line utility that converts any input text to uppercase in real time.
8. url:
The url module provides utilities for URL resolution and parsing. It comes in handy when you need to extract different parts of a URL or create a URL string from its constituent parts.
Basic URL Parsing: Here's how you can parse a URL to access its different components:
// The WHATWG URL class used below is available globally in Node.js
const myUrl = new URL('https://www.example.com:8000/path/name?query=string#hash');
console.log(myUrl.hostname); // "www.example.com"
console.log(myUrl.pathname); // "/path/name"
console.log(myUrl.search); // "?query=string"
console.log(myUrl.hash); // "#hash"
console.log(myUrl.port); // "8000"
console.log(myUrl.protocol); // "https:"
In this example, we create a new URL object by passing the URL string to the URL constructor. Then we can access the different parts of the URL, such as the hostname, pathname, search query, hash fragment, port, and protocol.
Advanced URL Operations:
You can also resolve URLs or create a URL object from parts, which is useful when constructing URLs dynamically:
// The WHATWG URL class used below is available globally in Node.js
// Constructing a URL from parts
const myUrl = new URL('/path/name', 'https://www.example.com:8000');
myUrl.searchParams.append('query', 'string');
myUrl.hash = 'hash';
console.log(myUrl.href); // "https://www.example.com:8000/path/name?query=string#hash"
// Resolving a relative URL against a base
const base = new URL('https://www.example.com/base/path');
const resolvedUrl = new URL('../new/path', base);
console.log(resolvedUrl.href); // "https://www.example.com/new/path"
The first part of the advanced example shows how to create a new URL by providing the path and base URL separately, then adding a query string and hash dynamically. The searchParams property is used to manipulate the query string of the URL.
The second part demonstrates how to resolve a relative URL against a base URL using the URL constructor. This is particularly handy when you're dealing with relative paths and need to find the absolute URL.
9. util:
The util module includes utility functions that are helpful for programming tasks such as formatting strings, debugging, and inspecting objects. Many of these functions were primarily designed to support Node.js's internal APIs, but they are useful for application code as well.
Basic util Example:
Here's an example of using util.format to concatenate strings and values in a printf-like manner.
const util = require('util');
// Use util.format to format a string
const message = util.format('My %s has %d years', 'cat', 2);
console.log(message);
In this basic example, util.format acts similarly to printf in other languages, where %s is used as a placeholder for a string and %d for a number.
Advanced util Example:
For more complex use cases, you might use util.promisify to convert callback-based functions (which conform to the error-first callback pattern) into functions that return a Promise.
const util = require('util');
const fs = require('fs');
// Convert fs.readFile into a Promise-based function
const readFile = util.promisify(fs.readFile);
async function readConfigFile() {
try {
const data = await readFile('config.json', 'utf8');
const config = JSON.parse(data);
console.log(config);
} catch (err) {
console.error('Error reading file:', err);
}
}
readConfigFile();
In the advanced example, util.promisify is used to transform fs.readFile, which normally uses callbacks, into a function that returns a Promise. This allows for the use of async/await syntax for better readability and error handling, which is particularly useful when modernizing legacy Node.js codebases or when you want to use async/await with modules that don't natively return Promises.
10. events:
The events module is a core piece of Node.js's event-driven architecture. It provides the EventEmitter class, whose instances can emit named events that any number of listener functions can subscribe to.
Basic Example:
Here's how you can create an event emitter, listen for events, and emit them:
- Import the events module and create a new instance of EventEmitter.
- Use the .on() method to listen for events.
- Use the .emit() method to trigger an event.
import { EventEmitter } from 'events';
// Create an instance of the EventEmitter class
const emitter = new EventEmitter();
// Listen for a 'greet' event
emitter.on('greet', () => {
console.log('Hello world!');
});
// Emit a 'greet' event
emitter.emit('greet');
In this basic example, when the 'greet' event is emitted, the listener function is invoked and logs 'Hello world!' to the console.
Advanced Example:
For more advanced usage, you may have parameters passed with events, handle multiple listeners, and work with asynchronous events.
- Use the .on() method to listen for events with parameters.
- Use async/await with event handlers.
- Use the .once() method to listen for an event only once.
import { EventEmitter } from 'events';
interface User {
id: number;
username: string;
}
// Create an instance of the EventEmitter class
const userEmitter = new EventEmitter();
// Listen for a 'userAdded' event with a parameter
userEmitter.on('userAdded', (user: User) => {
console.log(`User added with username: ${user.username}`);
});
// Listen for an event only once
userEmitter.once('userRemoved', (userId: number) => {
console.log(`User with ID ${userId} removed`);
});
// Asynchronously emit events
const addUser = async (user: User) => {
// Simulate some async operation like a database insert
await new Promise(resolve => setTimeout(resolve, 100));
userEmitter.emit('userAdded', user);
};
const removeUser = async (userId: number) => {
// Simulate some async operation like a database delete
await new Promise(resolve => setTimeout(resolve, 100));
userEmitter.emit('userRemoved', userId);
};
// Call the async functions to emit the events
addUser({ id: 1, username: 'john_doe' });
removeUser(1);
In the advanced example:
- The userEmitter listens for userAdded and userRemoved events.
- The userAdded event is emitted after simulating an asynchronous operation, passing a User object to the listener.
- The userRemoved event is emitted similarly, but its listener is registered with .once(), so it will not react to subsequent userRemoved events.
11. buffer:
The buffer module helps in dealing with binary data directly. Buffers are instances of the Buffer class, which is designed to handle raw binary data similar to an array of integers. The Buffer class is globally available in Node.js, so you don't need to import it using require or import.
Basic Buffer Usage Example Code Explanation:
// Create a buffer from a given string
const bufFromString = Buffer.from('Hello, World!', 'utf-8');
// Log the buffer and string
console.log(bufFromString); // <Buffer 48 65 6c 6c 6f 2c 20 57 6f 72 6c 64 21>
console.log(bufFromString.toString()); // "Hello, World!"
// Create a buffer with allocated size of 16 bytes and fill it with zeros
const bufAllocated = Buffer.alloc(16);
// Fill the buffer with a string
bufAllocated.write('Hello, World!', 0, 'utf-8');
// Log the allocated buffer and its string content
console.log(bufAllocated); // <Buffer 48 65 6c 6c 6f 2c 20 57 6f 72 6c 64 21 00 00 00>
console.log(bufAllocated.toString()); // "Hello, World!"
In the basic example:
- We create a Buffer from a string with Buffer.from(), specifying the encoding (UTF-8).
- We allocate a new Buffer with Buffer.alloc(), which initializes the allocated memory with zeroes for safety.
- We use the write method to fill the buffer with a string.
Advanced Buffer Operations Example Code Explanation:
// Create two buffers with different contents
const buf1 = Buffer.from('1234');
const buf2 = Buffer.from('0123');
// Compare buffers
const bufComparison = Buffer.compare(buf1, buf2); // Returns a number indicating whether buf1 comes before, after, or is the same as buf2 in sort order.
// Log comparison result
console.log(bufComparison); // A number: 1 if buf1 is after buf2, -1 if buf1 is before buf2, 0 if they are equal
// Copy buffer
const targetBuffer = Buffer.alloc(8); // Allocate a buffer to copy into
buf1.copy(targetBuffer); // Copy buf1 into targetBuffer
// Log the target buffer
console.log(targetBuffer.toString()); // "1234"
// Slice a buffer – creates a new buffer that references the same memory as the original, but offset and cropped by the start and end indices.
const bufSlice = buf1.slice(0, 2);
// Log the sliced buffer
console.log(bufSlice.toString()); // "12"
// Concatenate buffers
const bufConcatenated = Buffer.concat([buf1, buf2]);
// Log the concatenated buffer
console.log(bufConcatenated.toString()); // "12340123"
In the advanced example:
- We compare two buffers using Buffer.compare(), which can be used to sort buffers or check whether they are the same.
- We copy data from one buffer to another with buf1.copy(targetBuffer).
- We create a slice of a buffer using buf1.slice(0, 2), which does not copy the data; instead, the new buffer references a portion of the memory of the original buffer.
- We concatenate multiple buffers into a new buffer with Buffer.concat([buf1, buf2]).
12. crypto:
The crypto module provides cryptographic functionality, including a set of wrappers for OpenSSL's hash, HMAC, cipher, decipher, sign, and verify functions. It's built on OpenSSL, which is a robust, full-featured open-source cryptographic library.
Basic Cryptographic Hash: Here's a basic example of creating a hash of a string using the SHA-256 algorithm, which is a common cryptographic hash function:
import crypto from 'crypto';
// Creating a hash object
const hash = crypto.createHash('sha256');
// Updating data
hash.update('some data to hash');
// Calculating the hash digest
const digest = hash.digest('hex');
console.log(digest); // Prints the hash digest as a hex string
In this code, createHash is used to create a hash object, update is used to input the data, and digest is used to generate the hash.
Advanced Cryptographic Operations: The following example demonstrates how to encrypt and decrypt a piece of data using the AES-256-CBC algorithm, which is a strong encryption standard.
import crypto from 'crypto';
const algorithm = 'aes-256-cbc';
const password = 'password';
const salt = crypto.randomBytes(16);
const key = crypto.scryptSync(password, salt, 32); // Key derivation from password
const iv = crypto.randomBytes(16); // Initialization vector
const cipher = crypto.createCipheriv(algorithm, key, iv);
let encrypted = cipher.update('some clear text data', 'utf8', 'hex');
encrypted += cipher.final('hex');
console.log(encrypted); // Encrypted data in hex format
const decipher = crypto.createDecipheriv(algorithm, key, iv);
let decrypted = decipher.update(encrypted, 'hex', 'utf8');
decrypted += decipher.final('utf8');
console.log(decrypted); // Decrypted data, should be 'some clear text data'
In this advanced example:
- We generate a random salt for the key derivation and an initialization vector (IV) for the encryption.
- We derive a key from a password using the scryptSync function.
- We create a cipher object using createCipheriv, passing in the algorithm, key, and IV.
- We update the cipher with the text to be encrypted and finalize it.
- We then create a decipher object using createDecipheriv and decrypt the data back to its original form.
It's important to note that in a real-world application, you should handle the key and IV with great care, and they should be securely stored and managed.
13. net:
The net module is used for creating network servers and clients. It provides an asynchronous network API for creating stream-based TCP or IPC servers (net.createServer()) and clients (net.createConnection()).
Basic net Example: Below is a simple example of creating a TCP server that listens for connections on port 3000. When a client connects, it sends "Hello, client!" to the client and logs any data received.
const net = require('net');
// Create a TCP server
const server = net.createServer((socket) => {
console.log('client connected');
socket.write('Hello, client!\n');
socket.on('data', (data) => {
console.log('Data from client:', data.toString());
});
socket.on('end', () => {
console.log('client disconnected');
});
});
server.on('error', (err) => {
console.error('Server error:', err);
});
server.listen(3000, () => {
console.log('Server listening on port 3000');
});
In this basic example, the net.createServer() method creates a new TCP server. The server listens on port 3000, writes a greeting to each connected client, and logs any data it receives.
Advanced net Example: An advanced example would be creating a TCP server that can handle multiple client connections, broadcasting messages to all connected clients.
const net = require('net');
const clients = [];
// Create a TCP server
const server = net.createServer((socket) => {
console.log('Client connected');
clients.push(socket);
socket.on('data', (data) => {
console.log('Data from a client:', data.toString());
// Broadcast the message to all connected clients
clients.forEach((client) => {
if (client !== socket) {
client.write(data);
}
});
});
socket.on('end', () => {
console.log('Client disconnected');
// Remove the client from the list of connected clients
const index = clients.indexOf(socket);
if (index !== -1) {
clients.splice(index, 1);
}
});
socket.on('error', (err) => {
console.error('Socket error:', err);
});
});
server.on('error', (err) => {
console.error('Server error:', err);
});
server.listen(3000, () => {
console.log('Server listening on port 3000');
});
In the advanced example, a list of clients is maintained, and when a message is received from one client, it is broadcast to all other clients. This could be the starting point for a simple chat server. The server also handles client disconnections by removing the client from the list of active clients.
14. child_process:
The child_process
module enables you to execute other applications or scripts in a new process, allowing you to leverage multiple CPU cores or run tasks in the background.
child_process:
The child_process
module in Node.js is used to create new processes. You can execute shell commands, spawn new Node.js processes, or invoke any executable from your Node.js application.
Basic Example:
Here's a basic example of how to use the child_process
module to execute a shell command:
- Use
child_process.exec()
to execute a command in a shell and buffer the output.
import { exec } from 'child_process';
// Execute a simple shell command
exec('ls -lh', (error, stdout, stderr) => {
if (error) {
console.error(`Error: ${error.message}`);
return;
}
if (stderr) {
console.error(`Stderr: ${stderr}`);
return;
}
console.log(`Stdout: ${stdout}`);
});
In this example, exec
is used to run the ls -lh
command, which lists files in the current directory, and callbacks are used to handle the output or errors.
Advanced Example:
For more advanced use cases, you can spawn a new process with child_process.spawn()
, which provides a stream interface for handling data in real-time. This is useful for long-running processes or when you expect a large amount of output.
- Use
child_process.spawn()
to start a process and handle the streams.
import { spawn } from 'child_process';
// Spawn a new process to run a 'node' command
const child = spawn('node', ['some_script.js']);
// Handle the stdout data
child.stdout.on('data', (data) => {
console.log(`Stdout: ${data}`);
});
// Handle the stderr data
child.stderr.on('data', (data) => {
console.error(`Stderr: ${data}`);
});
// Handle the close event
child.on('close', (code) => {
console.log(`Child process exited with code ${code}`);
});
In the advanced example:
- spawn is used to create a new Node.js process running some_script.js.
- The stdout and stderr streams are listened to for data events, which occur when the child process writes to its standard output or standard error.
- The close event is listened to for when the process exits; it includes the exit code of the process.
These are basic and advanced uses of the child_process
module, allowing for simple execution of commands as well as more complex interaction with child processes.
15. dns:
The dns
module contains functions related to DNS resolution and lookups. It can be used to perform name resolution between domain names and IP addresses.
dns:
The dns
module in Node.js provides functions for performing network DNS lookups and name resolution. It allows you to interact with the domain name system (DNS) to resolve domain names into IP addresses and vice versa.
Basic DNS Lookup Example:
import dns from 'dns';
// Use the dns module to look up the IP address of a domain
dns.lookup('example.com', (err, address, family) => {
if (err) throw err;
console.log(`address: ${address}, family: IPv${family}`);
});
In the basic example:
- We perform a DNS lookup for the domain name example.com using dns.lookup.
- The callback function receives potential errors, the resolved IP address, and the IP family (4 or 6).
Advanced DNS Resolve and Reverse Lookup Example:
import dns from 'dns';
// Use the dns module to resolve a domain into an array of IP addresses
dns.resolve4('example.com', (err, addresses) => {
if (err) throw err;
console.log(`IPv4 addresses: ${addresses.join(', ')}`);
// Perform a reverse DNS lookup for each IP address
addresses.forEach((address) => {
dns.reverse(address, (err, hostnames) => {
if (err) {
console.log(`reverse for ${address} failed: ${err.message}`);
} else {
console.log(`reverse for ${address}: ${hostnames}`);
}
});
});
});
In the advanced example:
- We use dns.resolve4 to resolve the domain example.com into an array of IPv4 addresses.
- For each IPv4 address, we perform a reverse DNS lookup using dns.reverse to find the hostname associated with the IP address.
- The resolve4 method is specific to IPv4 addresses. Similarly, resolve6 would be used for IPv6 addresses.
- Reverse DNS lookups can be useful for finding the domain name associated with a given IP, often used in logging or authentication processes to verify where requests are coming from.
16. zlib:
The zlib
module provides compression functionality implemented using Gzip and Deflate/Inflate. It can be used for creating compression and decompression streams.
zlib:
The zlib
module in Node.js is used for compressing and decompressing files and data using Gzip and Deflate/Inflate algorithms. It's useful for reducing the size of data, which can improve transmission time over the network and reduce storage needs.
Basic Compression with Gzip: Here's how you can compress a string using Gzip:
import zlib from 'zlib';
const input = 'Text to be compressed using Gzip';
const buffer = Buffer.from(input);
zlib.gzip(buffer, (err, result) => {
if (err) {
console.error('An error occurred:', err);
return;
}
console.log(result); // This will log the compressed data.
});
In the example above, gzip
is used to compress a buffer containing the input string. The result is a compressed buffer, which you can then save to a file, send over a network, etc.
Basic Decompression with Gzip:
To decompress data that has been compressed using Gzip, you would use the gunzip
function:
import zlib from 'zlib';
// Produce some Gzip-compressed data so the example can run on its own.
const compressed = zlib.gzipSync('The original data');
zlib.gunzip(compressed, (err, result) => {
if (err) {
console.error('An error occurred:', err);
return;
}
console.log(result.toString()); // This will log the original data.
});
In this decompression example, gunzip
is used to decompress the data back to its original form.
Advanced Stream Compression:
For large files or data streams, you can use the streaming capabilities of zlib
to compress data as it's being read:
import fs from 'fs';
import zlib from 'zlib';
const gzip = zlib.createGzip();
const source = fs.createReadStream('path/to/input.txt');
const destination = fs.createWriteStream('path/to/output.txt.gz');
source.pipe(gzip).pipe(destination);
This code creates a read stream from a file, pipes it through a gzip transform stream, and then pipes the output to a write stream. This efficiently compresses the file in a streaming manner, which is memory-efficient for large files.
Advanced Stream Decompression:
Similarly, you can decompress a stream of data using the createGunzip
method:
import fs from 'fs';
import zlib from 'zlib';
const gunzip = zlib.createGunzip();
const source = fs.createReadStream('path/to/input.txt.gz');
const destination = fs.createWriteStream('path/to/output.txt');
source.pipe(gunzip).pipe(destination);
Here, a read stream is created from a .gz
file, which is then piped through a gunzip transform stream to decompress it, and the decompressed data is written to a new file. This is useful for decompressing large files without loading them entirely into memory.
17. cluster:
The cluster
module allows you to create child processes that all share server ports, enabling load balancing over multiple CPU cores.
cluster:
The cluster
module in Node.js allows you to take advantage of multi-core systems, by enabling the creation of child processes (workers) that run simultaneously and share the same server port.
Basic cluster Example:
Here is a basic example of using the cluster
module to fork a new worker process for each CPU core in the system.
const cluster = require('cluster');
const http = require('http');
const numCPUs = require('os').cpus().length;
if (cluster.isMaster) {
console.log(`Master ${process.pid} is running`);
// Fork workers.
for (let i = 0; i < numCPUs; i++) {
cluster.fork();
}
cluster.on('exit', (worker, code, signal) => {
console.log(`worker ${worker.process.pid} died`);
});
} else {
// Workers can share any TCP connection.
// In this case, it is an HTTP server.
http.createServer((req, res) => {
res.writeHead(200);
res.end('Hello world\n');
}).listen(8000);
console.log(`Worker ${process.pid} started`);
}
In this basic example, cluster.isMaster
(renamed cluster.isPrimary in Node.js 16+) is used to determine whether the process is the master or a worker. If it is the master, it forks one worker process per CPU core. Each worker then creates an HTTP server.
Advanced cluster Example: An advanced use case might involve graceful restarts and worker management:
const cluster = require('cluster');
const http = require('http');
const numCPUs = require('os').cpus().length;
if (cluster.isMaster) {
console.log(`Master ${process.pid} is running`);
// Fork workers.
for (let i = 0; i < numCPUs; i++) {
cluster.fork();
}
cluster.on('exit', (worker, code, signal) => {
console.log(`worker ${worker.process.pid} died`);
console.log('Forking a new worker');
cluster.fork();
});
// Graceful Shutdown
process.on('SIGTERM', () => {
console.log('Master is shutting down');
for (const id in cluster.workers) {
console.log(`Shutting down worker id: ${id}`);
cluster.workers[id].kill('SIGTERM');
}
setTimeout(() => {
process.exit(0);
}, 5000);
});
} else {
// Worker processes have an HTTP server; keep a reference for shutdown.
const server = http.createServer((req, res) => {
res.writeHead(200);
res.end(`Hello world from worker ${process.pid}\n`);
}).listen(8000);
console.log(`Worker ${process.pid} started`);
// Worker shutdown logic
process.on('SIGTERM', () => {
console.log(`Worker ${process.pid} is shutting down`);
// Close server and exit process
server.close(() => {
process.exit(0);
});
});
}
In the advanced example, when a worker dies, the master logs the event and forks a new worker to replace it. Additionally, there's a graceful shutdown process in place: when a SIGTERM
signal is received, the master sends the signal to each worker, giving them a chance to finish their tasks before shutting down. This could be improved further with more robust handling of HTTP connections to ensure that no requests are dropped during the restart process.
18. assert:
The assert
module provides a simple set of assertion tests that can be used to test invariants. It's commonly used for writing tests.
assert:
The assert
module provides a set of assertion functions for verifying invariants, primarily for testing purposes. Assertions are a way to ensure that code behaves as expected. When an assertion fails, an error is thrown, which typically causes the test to fail.
Basic Example:
Here's how you can use the assert
module to perform basic checks:
- Use assert.strictEqual() to test equality.
- Use assert.deepStrictEqual() to test deep equality of objects.
import assert from 'assert';
// Asserting strict equality
const add = (a, b) => a + b;
assert.strictEqual(add(2, 2), 4, '2 + 2 should equal 4');
// Asserting deep equality of objects
const obj1 = { name: 'John', age: 30 };
const obj2 = { name: 'John', age: 30 };
assert.deepStrictEqual(obj1, obj2, 'The objects should have the same properties and values');
In this basic example, assert.strictEqual()
checks if the two arguments are strictly equal, and assert.deepStrictEqual()
is used to compare the properties of two objects.
Advanced Example:
In more advanced scenarios, you might want to use other assertions for different conditions or to write custom error messages.
- Use assert.ok() to test if a value is truthy.
- Use assert.throws() to test if a function throws an error.
- Create an AssertionError to customize the error message and properties.
import assert, { AssertionError } from 'assert';
// Asserting that a value is truthy
const value = true;
assert.ok(value, 'The value should be truthy');
// Asserting that a function throws
const riskyFunction = () => {
throw new Error('Dangerous operation!');
};
assert.throws(
riskyFunction,
/^Error: Dangerous operation!$/,
'riskyFunction should throw an error with the expected message'
);
// Customizing the assertion error
try {
assert.strictEqual(1, 2);
} catch (err) {
if (err instanceof AssertionError) {
// You can customize the error thrown by the assertion
const customError = new AssertionError({
message: 'Custom error: 1 does not equal 2',
expected: 1,
actual: 2,
});
console.error(customError.message);
} else {
console.error('An unexpected error occurred');
}
}
In the advanced example:
- assert.ok() is used to test that value is truthy, meaning it is not false, 0, an empty string, null, undefined, or NaN.
- assert.throws() is used to verify that riskyFunction throws an error that matches the expected pattern.
- A custom AssertionError is created to provide a more descriptive error message and to specify the expected and actual values that led to the failure of the assertion.
19. vm:
The vm
module provides APIs for compiling and running code within V8 Virtual Machine contexts. It's useful for running sandboxed JavaScript code.
vm:
The vm
module in Node.js provides functionality for compiling and running code within V8 Virtual Machine contexts. It can execute JavaScript code in a sandboxed environment, which means it runs in an isolated context and has its own global variables.
Basic VM Usage Example:
import vm from 'vm';
// Define a JavaScript code snippet as a string
const code = 'const x = 5 + 10; x;';
// Run the code snippet in a new V8 VM context
try {
const result = vm.runInNewContext(code);
console.log(result); // Output will be 15
} catch (err) {
console.error('Failed to execute the code:', err.message);
}
In the basic example:
- We define a string code that contains a simple JavaScript expression.
- We use vm.runInNewContext to execute the code in a new V8 context. This method returns the result of the last statement executed.
- Since it's sandboxed, the code doesn't have access to the local scope or the Node.js environment.
Advanced VM with Custom Sandbox Example:
import vm from 'vm';
// Define a sandbox object that will serve as the global context for the executed code
const sandbox = {
module: {},
console: console
};
// A script that attempts to use the global console and module objects
const code = `
console.log('Hello, sandboxed world!');
module.exports = { message: 'This is defined in the sandbox.' };
`;
// Compile the code into a Script object
const script = new vm.Script(code);
// Execute the script in the context of the sandbox
script.runInNewContext(sandbox);
console.log(sandbox.module.exports.message); // Output will be 'This is defined in the sandbox.'
In the advanced example:
- We define a sandbox object with a module and a console. This sandbox acts as the global context for the code we're going to run.
- We create a vm.Script object from the code string. This compiled script can be run multiple times in different contexts.
- We run the script in the context of the sandbox using script.runInNewContext. This method does not return the result of the script; instead, you can inspect the sandbox object to see what changes were made by the script.
- The sandboxed code can use the provided console to log messages and can set properties on the module object, which we then log after the script has executed.
- This approach provides a controlled environment where the executed code has limited access to the Node.js API and the script's global scope is confined to the sandbox object.
20. tty:
The tty
module provides classes used to determine whether a given stream is attached to a text terminal (TTY). This can be useful for customizing the behavior of CLI applications, such as enabling colored output only when running interactively.
26. async_hooks:
The async_hooks
module provides an API to track asynchronous resources in Node.js. It allows you to hook into the lifecycle of asynchronous resources created within a Node.js application, which includes essentially all the asynchronous operations. It can be used to monitor and manage the state throughout the lifetime of these operations and is especially useful for debugging complex asynchronous behaviors.
async_hooks:
The async_hooks
module in Node.js allows developers to track the lifecycle events of asynchronous operations. By hooking into these events, developers can monitor the creation, resolution, and destruction of asynchronous resources, which is useful for debugging and monitoring purposes.
Basic Usage of async_hooks: Below is an example of setting up async hooks to track the start and end of asynchronous operations:
import async_hooks from 'async_hooks';
import fs from 'fs';
// Target output for debugging information
const debug = (message) => fs.writeFileSync(1, `${message}\n`, { flag: 'a' });
// Define the hooks
const hooks = {
init(asyncId, type, triggerAsyncId) {
const message = `Async hook init: ${asyncId}, type: ${type}, trigger: ${triggerAsyncId}`;
debug(message);
},
before(asyncId) {
debug(`Async hook before: ${asyncId}`);
},
after(asyncId) {
debug(`Async hook after: ${asyncId}`);
},
destroy(asyncId) {
debug(`Async hook destroy: ${asyncId}`);
}
};
// Create the async hook instance
const asyncHook = async_hooks.createHook(hooks);
// Enable the hooks
asyncHook.enable();
// Example async operation
setTimeout(() => {
debug('Timeout callback executed');
}, 100);
In this basic example, the init
hook logs when an asynchronous operation is initialized. The before
and after
hooks log just before and after the callback of the asynchronous operation is called. The destroy
hook logs when the asynchronous operation is completed and its resources can be collected.
Advanced Usage of async_hooks with Context Tracking:
This example demonstrates how to use async_hooks
to track the context across asynchronous calls, which is a common scenario for tracing and profiling:
import async_hooks from 'async_hooks';
import fs from 'fs';
const asyncResourceMap = new Map();
const hooks = {
init(asyncId, type, triggerAsyncId) {
// Here we can bind the resource with its parent's context
const parentContext = asyncResourceMap.get(triggerAsyncId);
if (parentContext) {
asyncResourceMap.set(asyncId, parentContext);
}
},
destroy(asyncId) {
asyncResourceMap.delete(asyncId);
}
};
const asyncHook = async_hooks.createHook(hooks);
asyncHook.enable();
// Example of tracking context across asynchronous calls
function executeAsyncTask(context) {
const asyncId = async_hooks.executionAsyncId();
asyncResourceMap.set(asyncId, context);
setTimeout(() => {
// At this point, the timeout callback is considered a new execution context
const currentAsyncId = async_hooks.executionAsyncId();
const currentContext = asyncResourceMap.get(currentAsyncId);
fs.writeFileSync(1, `In timeout callback with context: ${JSON.stringify(currentContext)}\n`);
// Do some work in the context of 'context'
}, 100);
}
// Kick off the async task with a given context
executeAsyncTask({ requestId: '1234' });
In this advanced example, each asynchronous operation is associated with a context (in this case, a requestId
). When the asynchronous operation is initialized, the context of its parent is retrieved and passed down to it. This allows you to track which request or operation led to the current asynchronous operation, which is very helpful for debugging purposes.