This fatal error occurs when Node.js runs out of memory trying to allocate space in the JavaScript heap. It typically happens when processing large datasets, running memory-intensive operations, or when there are memory leaks that gradually consume all available heap space.
This error is thrown by the V8 JavaScript engine when Node.js attempts to allocate more memory than is available in the heap. The "CALL_AND_RETRY_LAST" message indicates that V8 has exhausted all attempts to free memory through garbage collection and still cannot satisfy the allocation request.
Node.js imposes a default heap limit (historically around 512 MB on 32-bit systems and 1.4-2 GB on 64-bit systems; recent versions derive the default from available system memory) to prevent a runaway process from consuming all system resources. When your application exceeds this limit, the process terminates with this fatal error.
The heap is where V8 stores objects, strings, and closures. Long-lived objects live in the Old Space region of the heap, which is the area controlled by the --max-old-space-size flag. When the Old Space fills up and garbage collection cannot free enough memory, the allocation fails and this error occurs.
The quickest solution is to increase the memory limit for Node.js. Run your script with the --max-old-space-size flag:
node --max-old-space-size=4096 your-script.js
The value is in megabytes. Common values:
- 2048 (2 GB) - for moderate workloads
- 4096 (4 GB) - for large applications
- 8192 (8 GB) - for very large builds or data processing
For npm scripts, modify your package.json:
{
"scripts": {
"build": "node --max-old-space-size=4096 node_modules/webpack/bin/webpack.js",
"start": "node --max-old-space-size=4096 server.js"
}
}
Note: On a machine with 2 GB of RAM, don't exceed 1.5 GB to leave memory for the OS and other processes.
For a persistent solution across all Node.js processes, set the NODE_OPTIONS environment variable:
Linux/macOS:
export NODE_OPTIONS="--max-old-space-size=4096"
node your-script.js
Add to ~/.bashrc or ~/.zshrc for permanent effect:
echo 'export NODE_OPTIONS="--max-old-space-size=4096"' >> ~/.bashrc
Windows (Command Prompt):
set NODE_OPTIONS=--max-old-space-size=4096
node your-script.js
Windows (PowerShell):
$env:NODE_OPTIONS="--max-old-space-size=4096"
node your-script.js
For CI/CD environments, add NODE_OPTIONS to your environment configuration.
If increasing memory only delays the error, you likely have a memory leak. Use profiling tools to identify it:
Using Chrome DevTools:
node --inspect your-script.js
Open chrome://inspect in Chrome, click "inspect", and use the Memory tab to take heap snapshots.
Using clinic.js:
npm install -g clinic
clinic doctor -- node your-script.js
Common leak patterns to check:
- Event listeners not removed: Always call .removeListener() or .off()
- Timers not cleared: Use clearInterval() and clearTimeout()
- Global variables accumulating data
- Closures holding references to large objects
- Cache implementations without size limits
Instead of loading entire files into memory, use streams to process data incrementally:
Bad - Loads entire file:
const fs = require('fs');
const data = fs.readFileSync('large-file.json', 'utf8');
const parsed = JSON.parse(data); // All in memory
Good - Streams data:
const fs = require('fs');
const readline = require('readline');
const stream = fs.createReadStream('large-file.txt');
const rl = readline.createInterface({
input: stream,
crlfDelay: Infinity
});
// Inside an async function (or an ES module with top-level await):
for await (const line of rl) {
// Process one line at a time
processLine(line);
}
For JSON, use streaming JSON parsers like stream-json:
const fs = require('fs');
const { parser } = require('stream-json');
const { streamArray } = require('stream-json/streamers/StreamArray');
fs.createReadStream('large.json')
.pipe(parser())
.pipe(streamArray())
.on('data', ({ value }) => {
// Process one object at a time
});
Review your code for inefficient patterns:
Avoid creating unnecessary copies:
// Bad - creates multiple intermediate arrays
const result = data
.map(x => transform1(x))
.map(x => transform2(x))
.filter(x => x.valid);
// Better - single pass
const result = data.reduce((acc, x) => {
const t1 = transform1(x);
const t2 = transform2(t1);
if (t2.valid) acc.push(t2);
return acc;
}, []);
Release large objects when done:
let largeBuffer = Buffer.alloc(100 * 1024 * 1024);
// ... use buffer ...
largeBuffer = null; // Explicitly release for garbage collection
Implement cache size limits:
const cache = new Map();
const MAX_CACHE_SIZE = 1000;
function addToCache(key, value) {
if (cache.size >= MAX_CACHE_SIZE) {
const firstKey = cache.keys().next().value;
cache.delete(firstKey);
}
cache.set(key, value);
}
V8 Memory Structure:
The V8 heap consists of New Space (for short-lived objects) and Old Space (for long-lived objects). The --max-old-space-size flag specifically controls Old Space, where most memory accumulation occurs. V8 uses a generational garbage collector that moves surviving objects from New Space to Old Space.
Container Memory Limits:
In Docker or Kubernetes, ensure the container memory limit is higher than your Node.js heap size. For example, if you set --max-old-space-size=4096, the container should have at least 5-6 GB to account for V8 overhead, native modules, and buffer memory.
Node.js 20+ Improvements:
Node.js 20 and later versions have improved memory management in containers, automatically detecting cgroup memory limits. However, you may still need to explicitly set --max-old-space-size for optimal performance.
Performance vs. Memory Trade-offs:
While increasing heap size solves immediate crashes, it's not a substitute for proper memory management. Larger heaps mean longer garbage collection pauses, which can impact application responsiveness. Always profile your application to find the optimal balance.
Worker Threads:
For CPU-intensive tasks that consume large amounts of memory, consider using worker threads to distribute the load across multiple isolated heaps:
const { Worker } = require('worker_threads');
const worker = new Worker('./heavy-task.js');
Each worker has its own heap, allowing you to effectively bypass the single-process memory limit.