Node.js Event Loop: Master Non-Blocking I/O (Complete Guide)
The Problem You're Solving
Your Node.js server handles 100 requests/second. Your competitor's handles 10,000 requests/second.
Same hardware. Same code quality.
Why? Event loop optimization.
You: Blocking I/O (waitTime: 15s on 100 concurrent)
Competitor: Non-blocking I/O (waitTime: 0.3s on 10,000 concurrent)
That 14.7-second difference means timed-out requests, lost revenue, and negative reviews.
Without understanding the event loop, you'll write blocking code that kills performance. With it, you squeeze 100x more throughput from the same server.
Event loop mastery appears in 28% of Node.js interviews and directly impacts production system scalability.
What is the Event Loop?
The event loop is Node.js's mechanism for handling asynchronous operations without blocking:
// ❌ BLOCKING - Freezes entire server for 1 second
const data = fs.readFileSync('./file.txt'); // Waits 1s
console.log('Done');
// ✅ NON-BLOCKING - Continues serving other requests
fs.readFile('./file.txt', (err, data) => {
if (err) throw err;
console.log('File ready');
});
console.log('Request queued, continuing...');
The event loop allows Node.js to:
- Accept multiple requests without waiting
- Queue I/O operations (file, database, network)
- Continue processing other requests while I/O completes
- Execute callbacks when I/O finishes
Event Loop Architecture (Phases)
The event loop runs in this order repeatedly:
┌─────────────────────────────────┐
│ Timers Phase                    │ setTimeout, setInterval
├─────────────────────────────────┤
│ Pending Callbacks               │ Deferred I/O callbacks
├─────────────────────────────────┤
│ Idle, Prepare                   │ Internal (skip)
├─────────────────────────────────┤
│ Poll Phase                      │ ⭐ Main work (file I/O, network)
├─────────────────────────────────┤
│ Check Phase                     │ setImmediate
├─────────────────────────────────┤
│ Close Callbacks                 │ Socket/connection cleanup
└─────────────────────────────────┘
        ↑                 │
        └─────────────────┘ (Repeats)
Phase 1: Timers
Executes callbacks from setTimeout() and setInterval() if their time has elapsed.
setTimeout(() => {
console.log('Timer fired');
}, 100);
console.log('Scheduled');
// Output: Scheduled → (100ms later) → Timer fired
Phase 2: Pending Callbacks
Deferred callbacks from previous iterations (rarely encountered in user code).
Phase 3: Poll Phase (⭐ Most Important)
Where most I/O happens:
- Network requests (HTTP, database connections)
- File system operations (read, write)
- Timers that are ready
const fs = require('fs');
fs.readFile('large.txt', (err, data) => {
console.log('File ready'); // Executes in POLL phase
});
console.log('Reading...'); // Synchronous, runs first
// Output: Reading... → (file I/O completes) → File ready
Phase 4: Check Phase
Executes setImmediate() callbacks.
setImmediate(() => {
console.log('Immediate');
});
setTimeout(() => {
console.log('Timeout');
}, 0);
console.log('Start');
// Output: Start → Timeout → Immediate (usually)
// From the main module, the Timeout/Immediate order is actually NOT
// guaranteed; it depends on startup timing. Inside an I/O callback,
// setImmediate always fires before a 0ms timeout.
Phase 5: Close Callbacks
Cleans up closed connections.
Microtasks: Priority Processing
Microtasks run BEFORE the next phase starts:
Promise.resolve()
.then(() => console.log('Microtask'))
.catch(() => {});
setImmediate(() => console.log('Macrotask'));
setTimeout(() => console.log('Timer'), 0);
console.log('Sync');
// Output:
// Sync
// Microtask
// Timer
// Macrotask
// (Timer vs. Macrotask order is not guaranteed from the main module)
Order of execution:
- Synchronous code
- Microtasks (Promises, queueMicrotask)
- Next event loop phase (Timers)
- Microtasks again
- Next phase (Poll)
- Microtasks again
The microtask queue is drained completely before the event loop moves on to the next macrotask (setTimeout, setImmediate).
Real-World Example: Web Server
const http = require('http');
const fs = require('fs');
const server = http.createServer((req, res) => {
// Request 1: POST /upload
// Request 2: GET /large-file
// ❌ BLOCKING - Freezes server during large file read:
// const data = fs.readFileSync('./large-file.bin');
// res.end(data);

// ✅ NON-BLOCKING - Continues processing Request 2
fs.readFile('./large-file.bin', (err, data) => {
if (err) {
res.writeHead(500);
res.end('Error');
return;
}
res.end(data);
});
});
server.listen(3000);
With blocking (❌):
- Request 1 arrives → reads file (2s) → freezes server
- Request 2 waits 2s for the server to unfreeze
- Total: 4 seconds for 2 requests
With non-blocking (✅):
- Request 1 arrives → queues file read
- Request 2 arrives and is served immediately
- File read completes → callback executes
- Both done in ~2 seconds
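You can observe the blocking effect directly. In this sketch, a synchronous busy-wait keeps an already-scheduled 0ms timer from firing until the thread is free:

```javascript
// A 0ms timer cannot fire while synchronous code holds the thread.
const start = Date.now();
let firedAfter = null;

setTimeout(() => {
  firedAfter = Date.now() - start;
  console.log(`timer fired after ~${firedAfter}ms`); // ~200ms, not ~0ms
}, 0);

// Simulated blocking work: the event loop is stuck here for 200ms
while (Date.now() - start < 200) {}
```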
Promise Chain vs Async/Await
Both use the microtask queue, but readability differs:
Promise Chain (Microtask-based)
function fetchUserData(userId) {
return fetch(`/api/users/${userId}`)
.then(res => res.json())
.then(user => {
console.log('User:', user);
return user;
});
}
console.log('Fetching...');
fetchUserData(1);
console.log('Request queued');
// Output:
// Fetching...
// Request queued
// User: { id: 1, name: 'Alice' }
Async/Await (Also Microtask-based)
async function fetchUserData(userId) {
const res = await fetch(`/api/users/${userId}`);
const user = await res.json();
console.log('User:', user);
return user;
}
console.log('Fetching...');
fetchUserData(1);
console.log('Request queued');
// Output: SAME as Promise chain
// Fetching...
// Request queued
// User: { id: 1, name: 'Alice' }
Both are microtask-based, so they have the same execution order.
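A quick sketch of that equivalence: an async function runs synchronously up to its first await, then resumes from the microtask queue, exactly like a .then callback would:

```javascript
const order = [];

async function demo() {
  order.push('before await');
  await null;                  // resumption is queued as a microtask
  order.push('after await');
}

demo();
order.push('sync after call');

// After microtasks drain:
// order = ['before await', 'sync after call', 'after await']
setImmediate(() => console.log(order.join(' -> ')));
```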
Common Mistakes
❌ Mistake 1: Blocking File Operations
// WRONG - Freezes entire server
const data = fs.readFileSync('./huge-file.txt');
console.log(data);
// CORRECT - Non-blocking
fs.readFile('./huge-file.txt', (err, data) => {
if (err) throw err;
console.log(data);
});
Every `*Sync` method blocks the event loop. Avoid them in production request paths.
❌ Mistake 2: Misunderstanding Microtask Timing
setTimeout(() => console.log('A'), 0);
Promise.resolve().then(() => console.log('B'));
console.log('C');
// WRONG OUTPUT: A, B, C
// CORRECT OUTPUT: C, B, A (Promise is microtask, setTimeout is macrotask)
❌ Mistake 3: Not Handling Errors in Async Code
// WRONG - Error silently fails
fs.readFile('file.txt', (err, data) => {
console.log(data); // What if err exists?
});
// CORRECT - Check error first
fs.readFile('file.txt', (err, data) => {
if (err) {
console.error('Failed to read:', err.message);
return;
}
console.log(data);
});
Event Loop Optimization Strategies
Strategy 1: Batch I/O Operations
// ❌ SLOW - 10 sequential database queries (~10s at 1s each)
async function getUsers() {
for (let i = 1; i <= 10; i++) {
const user = await db.query('SELECT * FROM users WHERE id = ?', [i]); // parameterized
console.log(user);
}
}
// ✅ FAST - Parallel queries (~1s)
async function getUsers() {
const queries = [];
for (let i = 1; i <= 10; i++) {
queries.push(db.query('SELECT * FROM users WHERE id = ?', [i]));
}
const users = await Promise.all(queries);
users.forEach(user => console.log(user));
}
Strategy 2: Defer Heavy Computation
// ❌ BLOCKS - Computes before responding
function processRequest(req, res) {
  const result = expensiveCalculation(req.body); // 500ms on the main thread
  res.json(result);
}
// ✅ DEFERS - Responds first; the calculation still runs on the main
// thread, but only after the response has been sent
function processRequest(req, res) {
  res.json({ status: 'processing' });
  setImmediate(() => {
    const result = expensiveCalculation(req.body);
    cache.set(req.body.id, result); // Store for later retrieval
  });
}
Strategy 3: Use Worker Threads for CPU-Bound Work
const { Worker } = require('worker_threads');
// ❌ BLOCKS - CPU-heavy work on the main thread
function calculatePrimes(n) {
let count = 0;
for (let i = 2; i < n; i++) {
if (isPrime(i)) count++;
}
return count;
}
// ✅ OFFLOADS - CPU work to a separate thread
const worker = new Worker('./prime-worker.js');
worker.on('message', (count) => {
console.log('Primes found:', count);
});
worker.postMessage(1000000);
Event Loop Monitoring
Check Event Loop Lag
const toobusy = require('toobusy-js');
// Alert if event loop is delayed > 100ms
toobusy.onLag((ms) => {
console.warn(`Event loop lagging: ${ms}ms`);
});
// Block requests if server is overloaded
app.use((req, res, next) => {
if (toobusy()) {
res.status(503).json({ error: 'Server overloaded' });
} else {
next();
}
});
Profile Event Loop with clinic.js
npm install -g clinic
clinic doctor -- node app.js
Generates detailed report of event loop bottlenecks.
FAQ: Common Questions & Event Loop Issues
Q1: setTimeout(fn, 0) vs setImmediate(fn)?
A: setTimeout is Timers phase. setImmediate is Check phase.
setTimeout(() => console.log('timeout'), 0);
setImmediate(() => console.log('immediate'));
// From the main module, the order is NOT guaranteed: either can run
// first, depending on process startup timing. Inside an I/O callback,
// setImmediate always runs before any 0ms timeout.
Use setImmediate for code that should run right after I/O callbacks complete.
Q2: Why does blocking code exist at all?
A: Sometimes you NEED blocking operations during startup.
// ✅ OK - Startup (runs once, before serving traffic)
const config = JSON.parse(fs.readFileSync('./config.json'));
// ❌ WRONG - Request handler (runs per request)
app.get('/config', (req, res) => {
const config = JSON.parse(fs.readFileSync('./config.json'));
res.json(config);
});
Rule: Sync in startup, async in request handlers.
Q3: Can I manually control the event loop?
A: No, but you can optimize how you use it.
// You can't reorder the phases, but you can choose which queue a
// callback lands in:
process.nextTick(() => {
  console.log('Next tick'); // Runs before any phase starts
});
// Know the existing order instead of fighting it:
Promise.resolve().then(() => console.log('Microtask'));
setImmediate(() => console.log('Check phase'));
setTimeout(() => console.log('Timers'), 0);
Q4: What's process.nextTick()?
A: It queues callbacks that run before the next event loop phase, ahead even of promise microtasks.
console.log('1');
process.nextTick(() => console.log('2'));
Promise.resolve().then(() => console.log('3'));
setImmediate(() => console.log('4'));
// Output:
// 1 (sync)
// 2 (nextTick - highest priority)
// 3 (Promise microtask)
// 4 (Check phase)
Priority order:
- Sync code
- process.nextTick
- Microtasks (Promises)
- Event loop phases
Q5: How many concurrent connections can Node handle?
A: Depends on event loop efficiency, typically 10,000+.
const http = require('http');
const server = http.createServer((req, res) => {
// Simple non-blocking response
res.end('OK');
});
// There is no connection cap by default; set one to shed load:
server.maxConnections = 50000;
server.listen(3000);
Bottleneck: Not event loop, but file descriptor limits (ulimit -n).
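To inspect those limits on Linux/macOS (a read-only sketch; raise the soft limit with `ulimit -n <value>`, up to the hard limit):

```shell
# Soft limit: how many file descriptors this shell's processes may open
ulimit -n

# Hard limit: the ceiling the soft limit can be raised to
ulimit -Hn
```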
Q6: What happens if a microtask creates more microtasks?
A: They all execute before moving to next phase (infinite loop risk!).
// ⚠️ DANGEROUS - Could starve event loop
function recursiveMicrotask() {
Promise.resolve().then(() => {
console.log('Microtask');
recursiveMicrotask(); // Creates another microtask
});
}
recursiveMicrotask();
// This runs forever, other phases never execute!
Fix: Use setImmediate to break the chain.
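A sketch of that fix: recursing through setImmediate queues each step in the Check phase, so timers and I/O get a turn between iterations:

```javascript
// Unlike the promise version, this yields back to the event loop
// after every step, so other phases still run.
let steps = 0;

function recursiveImmediate() {
  setImmediate(() => {
    steps++;
    if (steps < 1000) recursiveImmediate();
  });
}

setTimeout(() => console.log('timer still fires, steps so far:', steps), 10);
recursiveImmediate();
```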
Q7: Why does async/await sometimes feel slow?
A: Usually the cost isn't microtask overhead but sequential awaits serializing independent I/O.
// SLOW - Three awaits run one after another; total time is the SUM of latencies
async function slow() {
const a = await fetch('/api/1');
const b = await fetch('/api/2');
const c = await fetch('/api/3');
return [a, b, c];
}
// FAST - Parallel requests, 1 await
async function fast() {
const [a, b, c] = await Promise.all([
fetch('/api/1'),
fetch('/api/2'),
fetch('/api/3'),
]);
return [a, b, c];
}
Q8: Interview Question: Design a rate limiter using the event loop.
A: Here's an in-memory sliding-window approach:
class RateLimiter {
constructor(maxRequests, windowMs) {
this.maxRequests = maxRequests;
this.windowMs = windowMs;
this.requests = new Map();
}
isAllowed(clientId) {
const now = Date.now();
const clientRequests = this.requests.get(clientId) || [];
// Remove old requests outside window
const recent = clientRequests.filter(
time => now - time < this.windowMs
);
if (recent.length >= this.maxRequests) {
return false;
}
recent.push(now);
this.requests.set(clientId, recent);
return true;
}
}
// Usage
const limiter = new RateLimiter(100, 60000); // 100 req/min
app.use((req, res, next) => {
if (!limiter.isAllowed(req.ip)) {
res.status(429).json({ error: 'Rate limited' });
} else {
next();
}
});
Interview insight: "I'd measure event loop lag with clinic.js and optimize based on actual bottlenecks."
Q9: Can I run code WITHOUT the event loop?
A: No; all JavaScript in Node.js runs on the event loop. But Worker Threads move CPU-bound work onto separate threads, each with its own loop.
const { Worker } = require('worker_threads');
// Main thread: blocked by CPU work
function blockingCalculation() {
let sum = 0;
for (let i = 0; i < 1e9; i++) sum += i; // 2+ seconds
return sum;
}
// Better: Offload to worker
const worker = new Worker('./calc.js');
worker.postMessage({});
worker.on('message', (result) => {
console.log('Result:', result); // Non-blocking
});
Q10: How does the event loop scale to 10,000 concurrent requests?
A: By multiplexing I/O notifications on a single thread (plus a small libuv thread pool for file I/O), not by spawning a thread per request.
Request 1: Network I/O queued (10ms)
Request 2: Network I/O queued (10ms)
...
Request 10,000: Network I/O queued (10ms)
Event Loop: While waiting for I/O:
- Executes timers
- Runs callbacks for completed I/O
- Accepts more requests
Total time: ~10ms (not 10,000 × 10ms)
Secret: Single thread + non-blocking I/O = high concurrency.
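That claim is easy to verify with simulated I/O waits: a thousand concurrent 50ms waits finish in roughly 50ms of wall time, not 50 seconds (a minimal sketch using timers as stand-ins for network I/O):

```javascript
// Each setTimeout only QUEUES work; nothing blocks the thread,
// so all 1000 waits overlap.
const start = Date.now();
let done = 0;

for (let i = 0; i < 1000; i++) {
  setTimeout(() => {
    done++;
    if (done === 1000) {
      console.log(`1000 waits finished in ~${Date.now() - start}ms`);
    }
  }, 50);
}
```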
Conclusion
The event loop is Node.js's superpower:
- Understand phases - Timers → Poll → Check → Close
- Master microtasks - Promises run before macrotasks
- Avoid blocking - Use async for I/O, sync only at startup
- Batch operations - Promise.all for parallel I/O
- Monitor lag - Use clinic.js to identify bottlenecks
Master the event loop and you'll write servers handling 10,000+ concurrent requests on standard hardware.