This error occurs when attempting to allocate a buffer that exceeds Node.js's maximum buffer size. On 32-bit systems the limit is ~1GB, while 64-bit systems allow up to ~2GB.
The ERANGE error with a "result too large" message indicates that Node.js attempted to allocate a Buffer object that exceeds the maximum allowed size for the current architecture. Buffers in Node.js have hard limits: on 32-bit architectures the maximum is (2^30)-1 bytes (~1GB), and on 64-bit architectures it's (2^31)-1 bytes (~2GB).

This error commonly occurs when using fs.readFile() to load large files into memory, because that method reads the entire file contents into a single Buffer. Even on 64-bit systems, fs.readFile() has a documented maximum file size of 2GB. Operations like Buffer.concat() or creating large ArrayBuffers can also trigger the error when the resulting size exceeds the platform limit.

Buffers exist outside of V8's JavaScript heap memory, but they are still subject to architectural constraints based on how memory addresses are handled. This error is therefore different from heap exhaustion errors: it relates specifically to the buffer allocation limit of the platform architecture, not to how much heap memory is available.
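To see where the ceiling sits on your machine, ask Buffer.alloc() for one byte more than buffer.constants.MAX_LENGTH reports; the allocation is rejected immediately. A minimal sketch (the exact error class and message vary between Node.js versions):

const { constants } = require('buffer');

// buffer.constants.MAX_LENGTH is the largest allocation this Node.js build
// accepts; one byte more fails up front.
try {
  Buffer.alloc(constants.MAX_LENGTH + 1);
} catch (err) {
  // Typically a RangeError (e.g. ERR_BUFFER_TOO_LARGE or ERR_OUT_OF_RANGE).
  console.error(err.name, err.message);
}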
Verify whether you're running 32-bit or 64-bit Node.js, as this determines your maximum buffer size:

node -p "process.arch"

If this returns ia32 (or another 32-bit architecture), you're running 32-bit Node.js with a ~1GB limit. Consider switching to a 64-bit Node.js build to support buffers up to ~2GB.
Instead of loading entire files into memory with fs.readFile(), use streams to process data in chunks:
const fs = require('fs');
// Instead of this (will fail on large files):
// fs.readFile('large-file.dat', (err, data) => { ... });
// Use streams:
const stream = fs.createReadStream('large-file.dat', {
  highWaterMark: 16 * 1024 // 16KB chunks
});

stream.on('data', (chunk) => {
  // Process each chunk
  console.log(`Received ${chunk.length} bytes`);
});

stream.on('end', () => {
  console.log('Finished processing file');
});

stream.on('error', (err) => {
  console.error('Stream error:', err);
});

This approach processes files of any size without hitting buffer limits.
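When the goal is to move or transform a large file rather than inspect each chunk, stream.pipeline() wires the stages together and handles error propagation and cleanup. A short sketch that gzips a large file without ever holding it in a single Buffer (the file names are placeholders):

const fs = require('fs');
const zlib = require('zlib');
const { pipeline } = require('stream');

// Memory use stays near the stream highWaterMark regardless of file size.
pipeline(
  fs.createReadStream('large-file.dat'),
  zlib.createGzip(),
  fs.createWriteStream('large-file.dat.gz'),
  (err) => {
    if (err) {
      console.error('Pipeline failed:', err);
    } else {
      console.log('Pipeline succeeded');
    }
  }
);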
For operations like Buffer.concat(), process data in manageable chunks rather than all at once:
function processLargeBuffers(buffers) {
  // processChunk() is a placeholder for whatever per-chunk work you need.
  const CHUNK_SIZE = 100 * 1024 * 1024; // 100MB chunks
  const results = [];
  let currentChunk = [];
  let currentSize = 0;

  for (const buffer of buffers) {
    if (currentSize + buffer.length > CHUNK_SIZE) {
      // Process current chunk
      results.push(processChunk(Buffer.concat(currentChunk)));
      currentChunk = [buffer];
      currentSize = buffer.length;
    } else {
      currentChunk.push(buffer);
      currentSize += buffer.length;
    }
  }

  // Process remaining chunk
  if (currentChunk.length > 0) {
    results.push(processChunk(Buffer.concat(currentChunk)));
  }

  return results;
}

Wrap buffer allocation in try-catch blocks to handle ERANGE errors gracefully:
function safeBufferAllocation(size) {
  // ~2GB; require('buffer').constants.MAX_LENGTH reports the actual ceiling
  // for the running platform and Node.js version.
  const MAX_BUFFER_SIZE = 2147483647;

  if (size > MAX_BUFFER_SIZE) {
    throw new Error(`Cannot allocate buffer of size ${size}. Maximum is ${MAX_BUFFER_SIZE} bytes.`);
  }

  try {
    return Buffer.alloc(size);
  } catch (error) {
    // Over-sized allocations surface as RangeErrors (e.g. ERR_BUFFER_TOO_LARGE);
    // fs operations report the system error code ERANGE instead.
    if (error instanceof RangeError || error.code === 'ERANGE' || error.message.includes('ERANGE')) {
      console.error(`Buffer allocation failed: size ${size} exceeds platform limit`);
      // Fall back to streaming or chunked approach
      return null;
    }
    throw error;
  }
}

Buffer vs Heap Memory: Buffers in Node.js exist outside of V8's JavaScript heap, which means increasing the heap size with --max-old-space-size won't help with this error. Buffer limits are constraints of the platform architecture, not of the V8 heap.
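You can observe this separation directly: process.memoryUsage() reports Buffer-backed memory under external (and, on Node.js 13.9+, arrayBuffers) rather than under heapUsed. A small sketch:

// Allocate 256MB outside the V8 heap and compare the counters.
const before = process.memoryUsage();
const big = Buffer.alloc(256 * 1024 * 1024);
const after = process.memoryUsage();

console.log(`allocated ${big.length} bytes`);
console.log(`heapUsed delta: ${after.heapUsed - before.heapUsed} bytes`); // small
console.log(`external delta: ${after.external - before.external} bytes`); // ~256MB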
fs.readFile Size Limits: Even in newer Node.js versions where buffer limits have increased, fs.readFile() maintains a hard cap at 2GB. For files approaching or exceeding this size, streams are not just recommended but required.
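A practical guard is to stat the file first and only read it in one shot when it's below the cap, falling back to a stream otherwise. A sketch, where readWholeOrStream() is a hypothetical helper and onStream is whatever streaming handler you supply:

const fs = require('fs');

const READ_FILE_LIMIT = 2 * 1024 * 1024 * 1024; // documented 2GB fs.readFile cap

// Hypothetical helper: small files are read whole, large ones are streamed.
async function readWholeOrStream(path, onStream) {
  const { size } = await fs.promises.stat(path);
  if (size < READ_FILE_LIMIT) {
    return fs.promises.readFile(path);
  }
  return onStream(fs.createReadStream(path));
}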
Worker Threads: For parallel processing of large datasets, consider using the worker_threads module to spread buffer work across multiple threads, giving each worker its own slice of the data so that no single allocation has to hold the whole dataset; see the sketch below.
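A minimal single-file sketch of that pattern, where the file paths are placeholders and each worker simply streams its file and reports the byte count:

const { Worker, isMainThread, parentPort, workerData } = require('worker_threads');

if (isMainThread) {
  // Main thread: hand each file to its own worker.
  const files = ['part-1.dat', 'part-2.dat', 'part-3.dat'];
  for (const file of files) {
    const worker = new Worker(__filename, { workerData: file });
    worker.on('message', (msg) => console.log(`${file}: ${msg}`));
    worker.on('error', (err) => console.error(`${file} failed:`, err));
  }
} else {
  // Worker thread: stream the assigned file so no single Buffer holds it all.
  const fs = require('fs');
  let bytes = 0;
  fs.createReadStream(workerData)
    .on('data', (chunk) => { bytes += chunk.length; })
    .on('end', () => parentPort.postMessage(`processed ${bytes} bytes`))
    .on('error', (err) => { throw err; });
}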
Memory-Mapped Files: For very large file operations, consider using memory-mapped files through native addons or packages like 'mmap-io', which can access files larger than buffer limits without loading them entirely into memory.
Architecture Considerations: In containerized environments, make sure the Node.js runtime inside the container is a 64-bit build; a 64-bit host OS is no guarantee, since some container images ship 32-bit builds.