This error occurs when Node.js runs out of memory while executing your application. By default, Node.js limits heap memory to approximately 1.5GB on 64-bit systems, which may be insufficient for memory-intensive operations like processing large files or building complex applications.
The "FATAL ERROR: CALL_AND_RETRY_LAST Allocation failed - JavaScript heap out of memory" error indicates that your Node.js application has exhausted the heap memory available to the V8 JavaScript engine. When this happens, V8 cannot allocate more memory for objects, arrays, or other data structures, causing the process to crash.

Node.js uses V8's garbage-collected heap to store objects and data. By default, V8 limits the old space (where long-lived objects reside) to around 1.5GB on 64-bit systems and 512MB on 32-bit systems. This conservative default prevents Node.js from consuming all system memory, but it can be too restrictive for memory-intensive workloads.

The error typically appears during operations that require substantial memory allocation, such as building large applications, processing massive datasets, running memory-intensive tests, or performing complex transformations on large files.
The quickest fix is to run your Node.js application with increased memory:
node --max-old-space-size=4096 app.js
This allocates 4GB of memory (4096 MB). Adjust the value based on your system's available RAM:
- 2GB: --max-old-space-size=2048
- 4GB: --max-old-space-size=4096
- 6GB: --max-old-space-size=6144
- 8GB: --max-old-space-size=8192
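To confirm the flag took effect, you can print the effective heap limit from a one-off process (a quick check, assuming `node` is on your PATH; the reported number is slightly above the flag value because the limit also includes the young generation):

```shell
# Prints the effective heap limit in MB; expect roughly the value you passed
node --max-old-space-size=4096 -p "Math.round(require('v8').getHeapStatistics().heap_size_limit / 1024 / 1024)"
```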
For npm scripts, update your package.json:
{
  "scripts": {
    "build": "node --max-old-space-size=4096 node_modules/.bin/next build",
    "test": "node --max-old-space-size=4096 node_modules/.bin/jest"
  }
}
Important: Leave some memory for the operating system. On a machine with 2GB RAM, use at most 1.5GB (--max-old-space-size=1536) to avoid system instability.
To avoid repeating the flag for every command, set the NODE_OPTIONS environment variable:
Linux/macOS (temporary, current session):
export NODE_OPTIONS="--max-old-space-size=4096"
node app.js
Linux/macOS (permanent, add to ~/.bashrc or ~/.zshrc):
echo 'export NODE_OPTIONS="--max-old-space-size=4096"' >> ~/.bashrc
source ~/.bashrc
Windows (temporary, current session):
set NODE_OPTIONS=--max-old-space-size=4096
node app.js
Windows PowerShell (temporary):
$env:NODE_OPTIONS="--max-old-space-size=4096"
node app.js
In a .env file (for development):
NODE_OPTIONS="--max-old-space-size=4096"
This approach applies the memory limit to all Node.js processes without modifying individual commands. Note that the variable must be set before the Node.js process starts; loading a .env file at runtime (for example with the dotenv package) is too late to resize the heap of the already-running process.
If you're processing large files, avoid loading entire files into memory. Use streams instead:
Bad (loads entire file into memory):
const fs = require('fs');
const data = fs.readFileSync('large-file.json', 'utf8');
const parsed = JSON.parse(data); // Can cause memory errors
Good (processes data in chunks):
const fs = require('fs');
const readline = require('readline');
const readStream = fs.createReadStream('large-file.txt');
const rl = readline.createInterface({
  input: readStream,
  crlfDelay: Infinity
});
rl.on('line', (line) => {
  // Process each line individually
  console.log(line);
});
For JSON processing, use streaming JSON parsers like stream-json:
const fs = require('fs');
const { parser } = require('stream-json');
const { streamArray } = require('stream-json/streamers/StreamArray');
const pipeline = fs.createReadStream('large-array.json')
  .pipe(parser())
  .pipe(streamArray());
pipeline.on('data', ({ value }) => {
  // Process each array element individually
  console.log(value);
});
If increasing memory only delays the error, you likely have a memory leak. Use Node.js built-in tools to diagnose:
Monitor memory usage:
setInterval(() => {
  const usage = process.memoryUsage();
  console.log(`Heap Used: ${Math.round(usage.heapUsed / 1024 / 1024)} MB`);
  console.log(`Heap Total: ${Math.round(usage.heapTotal / 1024 / 1024)} MB`);
}, 5000);
Generate heap snapshots:
node --inspect app.js
Open Chrome DevTools (chrome://inspect), take heap snapshots, and compare them to identify objects that aren't being garbage collected.
Common memory leak causes:
- Event listeners not removed when no longer needed
- Global variables accumulating data
- Closures retaining references to large objects
- Caches without size limits or expiration
- Timers (setInterval/setTimeout) not cleared
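To illustrate the cache point above, here is a minimal sketch of a size-capped, LRU-style cache built on Map's insertion order (BoundedCache is an illustrative name, not a library API):

```javascript
// A cache that never grows past maxEntries; unbounded caches are a
// classic source of "heap out of memory" crashes in long-running processes.
class BoundedCache {
  constructor(maxEntries = 1000) {
    this.maxEntries = maxEntries;
    this.map = new Map();
  }
  get(key) {
    if (!this.map.has(key)) return undefined;
    const value = this.map.get(key);
    this.map.delete(key); // re-insert to mark as recently used
    this.map.set(key, value);
    return value;
  }
  set(key, value) {
    this.map.delete(key);
    this.map.set(key, value);
    if (this.map.size > this.maxEntries) {
      // Evict the least recently used entry (first key in insertion order)
      this.map.delete(this.map.keys().next().value);
    }
  }
}
```

For production use, an established package with TTL support is usually a better fit; the point is simply that every cache needs an eviction policy.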
Fix example (clearing event listeners):
// Bad: listeners are added but never removed, so the handler
// (and anything it references) stays reachable for the emitter's lifetime
events.forEach(event => {
  eventEmitter.on(event, handler);
});
// Good: remove listeners once they are no longer needed
events.forEach(event => {
  eventEmitter.on(event, handler);
});
// Later, when done:
events.forEach(event => {
  eventEmitter.removeListener(event, handler);
});
Container Memory Management (Node.js 20+):
Starting with Node.js 20, the heap size is container-aware and automatically adjusts based on cgroup memory limits. In containerized environments like Docker or Kubernetes, Node.js detects available memory and sets sensible defaults. However, you can still override this with --max-old-space-size.
Memory Sizing Formula:
A safe starting point is 75% of available container memory: 0.75 * memory_limit. For example, in a 2GB container, use --max-old-space-size=1536 (1.5GB). Monitor your application and adjust based on actual memory usage patterns.
Node.js 24+ Percentage-Based Sizing:
Node.js 24 introduces --max-old-space-size-percentage for easier container configuration:
node --max-old-space-size-percentage=75 app.jsProduction Deployment Considerations:
- Monitor memory usage with tools like PM2, Prometheus, or Datadog
- Set container memory limits slightly higher than --max-old-space-size to account for Node.js overhead
- Configure health checks to restart processes that exceed memory thresholds
- Use horizontal scaling (multiple processes) instead of just increasing memory limits
- Profile memory usage during load testing before production deployment
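The health-check idea above can be sketched as a periodic in-process self-check; the 1536 MB threshold and 30-second interval are example values to tune for your environment:

```javascript
// Periodically compare resident set size against a threshold and exit,
// letting the supervisor (PM2, Kubernetes, systemd, etc.) restart the process.
const LIMIT_MB = 1536; // example threshold, not a recommendation

function overLimit(limitMb) {
  const rssMb = process.memoryUsage().rss / 1024 / 1024;
  return rssMb > limitMb;
}

const check = setInterval(() => {
  if (overLimit(LIMIT_MB)) {
    console.error(`RSS exceeded ${LIMIT_MB} MB, exiting so the supervisor restarts us`);
    process.exit(1);
  }
}, 30_000);
check.unref(); // don't keep an otherwise-finished process alive just for this check
```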
Garbage Collection Tuning:
For applications with specific memory patterns, you can tune garbage collection:
Expose global.gc() so collection can be triggered manually:
node --max-old-space-size=4096 --expose-gc app.js
Then, in your code:
if (global.gc) {
  global.gc(); // Manual GC trigger (for testing only)
}
Memory Profiling Tools:
- Chrome DevTools: Best for inspecting heap snapshots and memory timelines
- node --inspect: Built-in debugging with DevTools integration
- clinic.js: Comprehensive performance and memory profiling
- heapdump: Generate V8 heap dumps programmatically for analysis