This error occurs when you call resume() on a stream that is already in the reading state, indicating a mismatch between pause() and resume() calls. It typically happens when backpressure is handled improperly or when resume() is called repeatedly without corresponding pause() calls.
In Node.js streams, pause() and resume() control the flow of data. pause() stops the stream from emitting 'data' events, while resume() restarts it. This error indicates a fundamental mismatch: you're trying to resume a stream that was never properly paused, or the stream's internal state machine has become inconsistent. This usually stems from not properly tracking whether a stream is paused or reading, especially when using automatic backpressure handling or mixing manual pause/resume calls with piping.
Ensure that every pause() call has a corresponding resume() call and vice versa. Track the stream's state:
const fs = require('fs');
const readable = fs.createReadStream('input.txt');

let isPaused = false;

readable.on('data', (chunk) => {
  readable.pause();
  isPaused = true;
  // processChunk() stands in for your own async handling of each chunk
  processChunk(chunk).then(() => {
    if (isPaused) {
      readable.resume();
      isPaused = false;
    }
  });
});

Use a boolean flag to track pause state and only call resume() when the stream is actually paused. Note that 'drain' is a Writable event; a Readable source never emits it, so trigger the resume from whatever signals readiness for more data (here, the completion of processing).
Instead of manual pause/resume, use the backpressure pattern correctly with write operations:
const fs = require('fs');
const readable = fs.createReadStream('input.txt');
const writable = fs.createWriteStream('output.txt');

readable.on('data', (chunk) => {
  const canContinue = writable.write(chunk);
  if (!canContinue) {
    // Backpressure: destination buffer is full
    readable.pause();
  }
});

writable.on('drain', () => {
  // Destination buffer is drained, resume reading
  readable.resume();
});

This pattern keeps pause() and resume() balanced based on actual write capacity.
The safest approach is to use pipe(), which handles backpressure automatically:
const fs = require('fs');
const readable = fs.createReadStream('input.txt');
const writable = fs.createWriteStream('output.txt');

// pipe() handles pause/resume automatically
readable.pipe(writable);

readable.on('error', (err) => console.error('Read error:', err));
writable.on('error', (err) => console.error('Write error:', err));

pipe() maintains internal state and calls pause/resume in the correct order, eliminating manual state-tracking errors.
Always verify that resume() is only called on streams that were explicitly paused:
const fs = require('fs');
const readable = fs.createReadStream('file.txt');

// Wrong: resume() on a stream that was never paused
readable.resume(); // ❌ May cause a pause/resume mismatch

// Correct: only resume after pause
readable.pause();
readable.resume(); // ✓ OK

// Or check the stream state first
if (readable.readableFlowing === false) {
  readable.resume(); // Safe: the stream was explicitly paused
}

Check the readableFlowing property: false means the stream was explicitly paused, null means no consumer has attached yet, and true means it is already flowing.
If a stream is piped, do not call pause() or resume() manually. Instead, unpipe, control manually, then repipe if needed:
const fs = require('fs');
const readable = fs.createReadStream('input.txt');
const writable = fs.createWriteStream('output.txt');

// Piped streams handle pause/resume internally
readable.pipe(writable);

// If you need to pause, unpipe first:
function pauseReading() {
  readable.unpipe(writable);
  readable.pause();
}

function resumeReading() {
  readable.resume();
  readable.pipe(writable); // Re-establish the pipe
}

Mixing pipe() with manual pause/resume causes state inconsistencies.
Monitor and log the stream's internal state to diagnose pause/resume mismatches:
const fs = require('fs');
const readable = fs.createReadStream('file.txt');

console.log('Initial readableFlowing:', readable.readableFlowing); // null (no consumer yet)

readable.on('data', () => {
  console.log('readableFlowing:', readable.readableFlowing); // true (flowing mode)
});

readable.pause();
console.log('After pause, readableFlowing:', readable.readableFlowing); // false
readable.resume();
console.log('After resume, readableFlowing:', readable.readableFlowing); // true

readableFlowing values:
- null: no consumer has attached yet; the stream is effectively in paused mode and emits no 'data' events
- true: flowing mode, 'data' events are active
- false: explicitly paused via pause() (or unpipe())
Stream States: Node.js streams have an internal readableFlowing property that controls whether the stream emits 'data' events. Manually calling pause() and resume() manipulates this state, but mixing manual control with automatic methods (like pipe()) can cause conflicts.
Backpressure Philosophy: The core issue is respecting backpressure. When write() returns false, the destination's buffer is full. You must pause the source to prevent data loss. When 'drain' fires, the buffer is ready, so resume the source. Breaking this contract causes state mismatches.
Historical Context: Early Node.js versions had bugs where internal pause() calls during piping didn't have corresponding resume() calls, leaving streams permanently paused. Modern versions are more robust, but manual pause/resume calls can still cause issues if not carefully balanced.
Readable vs. Writable: pause() and resume() only exist on Readable streams. Writable streams don't have these methods; they control backpressure via write() return values and 'drain' events.
No Automatic Resume on Error: If pause() is called and then an error occurs, the stream won't auto-resume. You must explicitly resume after error handling if appropriate.