When using clusters in Node.js, it's important to understand how clustering works and whether a separate load balancer is necessary. Clustering lets you run multiple server processes that share the same port, which is useful for making use of multi-core systems. Out of the box, the cluster module only distributes incoming connections with a simple round-robin scheduler (and on Windows it leaves distribution to the operating system); it has no awareness of how busy each worker actually is, so it is not a full load-balancing solution on its own.
When you use the cluster module, the primary process forks several child processes (workers) to handle incoming connections. Each worker listens on the same port and can receive any of the incoming requests. Here's a basic example:
const cluster = require('cluster');
const http = require('http');
const numCPUs = require('os').cpus().length;

if (cluster.isMaster) {
  console.log(`Master ${process.pid} is running`);

  // Fork workers.
  for (let i = 0; i < numCPUs; i++) {
    cluster.fork();
  }

  cluster.on('exit', (worker, code, signal) => {
    console.log(`Worker ${worker.process.pid} died`);
  });
} else {
  // Workers can share any TCP connection
  // Here, it is an HTTP server
  http.createServer((req, res) => {
    res.writeHead(200);
    res.end('Hello World\n');
  }).listen(8000);

  console.log(`Worker ${process.pid} started`);
}

Without an additional load balancer, each worker simply handles whatever connections the built-in scheduler hands it, independently of the others. If you want more control over how load is balanced between the workers, you can implement custom logic in the primary process or put an external load balancer in front of it. Here are some common strategies:
- Round-Robin: Distribute incoming connections across the available workers in rotation (this is what the cluster module does by default on most platforms; see the note after this list).
- Least Connections: Direct new connections to the worker with the fewest active connections.
- Consistent Hashing: Use consistent hashing to route requests to specific workers based on a hash of the request data.
- Weighted Round-Robin or Weighted Least Connections: Assign weights to different workers and distribute connections according to these weights.
- External Load Balancer: Use an external service like Nginx, HAProxy, or AWS ELB to distribute traffic across your Node.js instances (or across several machines). This is common practice in production environments.
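On the round-robin point above: the cluster module already applies a round-robin scheduling policy by default on every platform except Windows, where it defers to the operating system. If you want to rely on that behaviour explicitly, you can set the policy before forking; a minimal sketch:

const cluster = require('cluster');

// SCHED_RR makes the primary accept connections and hand them to workers
// in round-robin order. It must be set before the first fork() call.
cluster.schedulingPolicy = cluster.SCHED_RR;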
Here's a simple example that keeps a round-robin index in the primary process and cycles over the workers with it. In this sketch the index is used only to send periodic health-check pings; the actual distribution of incoming connections is still handled by the cluster scheduler:
const cluster = require('cluster');
const http = require('http');
const numCPUs = require('os').cpus().length;

if (cluster.isMaster) {
  console.log(`Master ${process.pid} is running`);

  // Round-robin index over the current set of workers
  let currentIndex = 0;

  for (let i = 0; i < numCPUs; i++) {
    cluster.fork();
  }

  cluster.on('exit', (worker, code, signal) => {
    console.log(`Worker ${worker.process.pid} died`);
    // Bring up a new worker to replace the dead one
    cluster.fork();
  });

  // Every second, ping the next worker in round-robin order
  setInterval(() => {
    const workers = Object.values(cluster.workers);
    if (workers.length === 0) return;
    const worker = workers[currentIndex % workers.length];
    if (worker.isConnected()) {
      worker.send('ping');
    }
    currentIndex++;
  }, 1000);
} else {
  http.createServer((req, res) => {
    res.writeHead(200);
    res.end('Hello World\n');
  }).listen(8000);

  console.log(`Worker ${process.pid} started`);

  process.on('message', (msg) => {
    if (msg === 'ping') {
      // Respond to the primary for health checks or other signals
      process.send('pong');
    }
  });
}

While the cluster module lets multiple processes share a port and spreads connections among them with a simple round-robin scheduler, that scheduler has no idea how busy each worker actually is. To distribute requests with more control, such as least connections, sticky sessions, or balancing across several machines, implement custom dispatch logic in the primary process or use an external load balancer for better control and performance.
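If you do want the primary process itself to act as the dispatcher (for example, as a starting point for least-connections or sticky-session logic), a common pattern is to accept connections in the primary and pass the raw sockets to workers over the IPC channel. Below is a minimal round-robin sketch of that idea, using the same port 8000 as above; the 'sticky:connection' message name is just a label chosen for this example, and a production version would also replace dead workers and skip disconnected ones.

const cluster = require('cluster');
const http = require('http');
const net = require('net');
const numCPUs = require('os').cpus().length;

if (cluster.isMaster) {
  const workers = [];
  for (let i = 0; i < numCPUs; i++) {
    workers.push(cluster.fork());
  }

  let currentIndex = 0;

  // The primary owns the listening socket; pauseOnConnect keeps incoming
  // data buffered in the kernel until a worker resumes the socket.
  net.createServer({ pauseOnConnect: true }, (socket) => {
    const worker = workers[currentIndex % workers.length];
    currentIndex++;
    worker.send('sticky:connection', socket);
  }).listen(8000);
} else {
  const server = http.createServer((req, res) => {
    res.writeHead(200);
    res.end(`Handled by worker ${process.pid}\n`);
  });

  // No listen() here: connections arrive from the primary instead.
  process.on('message', (msg, socket) => {
    if (msg === 'sticky:connection' && socket) {
      server.emit('connection', socket);
      socket.resume();
    }
  });
}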


You can also pass the loop index to each fork so that every worker knows which CPU slot it was started for; a sketch of that idea follows.
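A minimal sketch, assuming the index is passed to cluster.fork() as a CPU environment variable (the variable name and log message are inferred from the output below):

const cluster = require('cluster');
const http = require('http');
const numCPUs = require('os').cpus().length;

if (cluster.isMaster) {
  console.log(`Master ${process.pid} is running`);
  for (let i = 0; i < numCPUs; i++) {
    // Pass the loop index to the worker as an environment variable
    cluster.fork({ CPU: i });
  }
} else {
  http.createServer((req, res) => {
    res.writeHead(200);
    res.end('Hello World\n');
  }).listen(8000);
  console.log(`Worker ${process.pid} started on CPU ${process.env.CPU}`);
}

Note that passing the index only labels the worker; it does not pin the process to a particular core, which the operating system scheduler still decides.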
Result:
Master 1238000 is running
Worker 1238030 started on CPU 8
Worker 1238028 started on CPU 7
Worker 1238010 started on CPU 3
Worker 1238012 started on CPU 5