This error occurs when attempting to write data to a Node.js stream that has been closed, ended, or destroyed. Common causes include writing after calling end(), writing to destroyed streams, or writing to inherently read-only streams.
This error indicates that your code is trying to write data to a stream that is no longer in a writable state. In Node.js, streams can transition from writable to non-writable through several mechanisms: calling end() to signal completion, calling destroy() to immediately close the stream, or the stream being inherently read-only (like process.stdin in some contexts). When a stream is ended or destroyed, Node.js sets internal flags (writable = false, destroyed = true) that prevent further write operations. Any attempt to write after these flags are set will result in this error. This is a protective mechanism to prevent data corruption and ensure proper stream lifecycle management. The error most commonly manifests as "Stream is not writable" or related errors like ERR_STREAM_WRITE_AFTER_END and ERR_STREAM_DESTROYED, depending on the specific state transition that caused the stream to become non-writable.
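To see the mechanism in action, here is a minimal reproduction sketch (the file name is arbitrary). Ending a file stream and then writing to it surfaces ERR_STREAM_WRITE_AFTER_END through the stream's 'error' event rather than a synchronous throw:

const fs = require('fs');

const stream = fs.createWriteStream('demo.txt');

// Without this handler, the late write would crash the process
stream.on('error', (err) => {
  console.error(err.code); // 'ERR_STREAM_WRITE_AFTER_END'
});

stream.write('ok\n');
stream.end();
stream.write('too late\n'); // Stream is already ended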
Always verify that a stream is writable before attempting write operations. Use the writable property to check the stream state:
const fs = require('fs');

const writeStream = fs.createWriteStream('output.txt');

function safeWrite(stream, data) {
  if (stream.writable) {
    stream.write(data);
  } else {
    console.error('Stream is not writable');
  }
}

safeWrite(writeStream, 'Hello World\n');
writeStream.end();

// This would fail without the check
safeWrite(writeStream, 'More data'); // Logs error instead of crashing

For destroyed streams, also check the destroyed property:
function safeWrite(stream, data) {
  if (stream.destroyed) {
    console.error('Stream has been destroyed');
    return;
  }
  if (!stream.writable) {
    console.error('Stream is not writable');
    return;
  }
  stream.write(data);
}

Always attach error listeners to streams to catch and handle errors gracefully:
const fs = require('fs');

const writeStream = fs.createWriteStream('output.txt');

// Add error handler before any operations
writeStream.on('error', (err) => {
  console.error('Stream error:', err.message);
  // Clean up or retry logic here
});

writeStream.on('finish', () => {
  console.log('Stream finished successfully');
});

// Now safe to write
writeStream.write('Data\n');
writeStream.end();

For multiple streams in a pipeline, use the pipeline utility, which handles errors and cleanup automatically:
const { pipeline } = require('stream');
const fs = require('fs');
const zlib = require('zlib');

pipeline(
  fs.createReadStream('input.txt'),
  zlib.createGzip(),
  fs.createWriteStream('output.txt.gz'),
  (err) => {
    if (err) {
      console.error('Pipeline failed:', err.message);
    } else {
      console.log('Pipeline succeeded');
    }
  }
);

The stream.finished() utility notifies you when a stream is no longer readable or writable, or has encountered an error:
const { finished } = require('stream');
const fs = require('fs');

const writeStream = fs.createWriteStream('output.txt');

finished(writeStream, (err) => {
  if (err) {
    console.error('Stream failed:', err.message);
  } else {
    console.log('Stream completed successfully');
  }
});

writeStream.write('Data\n');
writeStream.end();

This is particularly useful for detecting premature closes (like aborted HTTP requests):
const http = require('http');
const { finished } = require('stream');

http.createServer((req, res) => {
  finished(res, (err) => {
    if (err) {
      console.log('Client disconnected:', err.message);
      // Cleanup logic here
    }
  });

  // Your response logic
  res.write('Streaming data...\n');
}).listen(3000);

If you're getting ERR_STREAM_WRITE_AFTER_END, ensure you're not calling write() after end():
Incorrect:

const fs = require('fs');

const writeStream = fs.createWriteStream('output.txt');
writeStream.write('First line\n');
writeStream.end(); // Stream is now ended

// ERROR: This emits ERR_STREAM_WRITE_AFTER_END on the stream
writeStream.write('Second line\n');

Correct:
const fs = require('fs');

const writeStream = fs.createWriteStream('output.txt');
writeStream.write('First line\n');
writeStream.write('Second line\n');
writeStream.end(); // End only after all writes

Or pass final data to end():
writeStream.write('First line\n');
writeStream.end('Second line\n'); // Write and end in one call

Race conditions can cause writes after stream closure. Use promises or careful callback ordering:
Problematic async code:

const fs = require('fs');

const writeStream = fs.createWriteStream('output.txt');
setTimeout(() => {
  writeStream.write('Delayed write\n'); // May fail: the stream has already ended
}, 1000);
writeStream.end(); // Called immediately

Fixed with Promise coordination:
const fs = require('fs');

async function safeAsyncWrite() {
  const writeStream = fs.createWriteStream('output.txt');
  try {
    // Wait for the async operation (fetchDataAsync is a placeholder)
    const data = await fetchDataAsync();

    // Check the stream is still writable
    if (writeStream.writable) {
      writeStream.write(data);
      writeStream.end();

      // Wait for the finish event
      await new Promise((resolve, reject) => {
        writeStream.once('finish', resolve);
        writeStream.once('error', reject);
      });
    }
  } catch (err) {
    console.error('Write failed:', err);
    if (!writeStream.destroyed) {
      writeStream.destroy();
    }
  }
}

When manually destroying streams, ensure no further operations are attempted:
const fs = require('fs');

const writeStream = fs.createWriteStream('output.txt');

function cleanup() {
  if (!writeStream.destroyed) {
    writeStream.destroy();
  }
}

// Handle process termination
process.on('SIGINT', cleanup);
process.on('SIGTERM', cleanup);

// Check before writing
if (!writeStream.destroyed && writeStream.writable) {
  writeStream.write('Data\n');
}

For custom writable streams, implement _destroy properly:
const { Writable } = require('stream');

class MyWritable extends Writable {
  _write(chunk, encoding, callback) {
    // Your write logic
    callback();
  }

  _destroy(err, callback) {
    // Clean up resources here
    console.log('Stream being destroyed');
    callback(err);
  }
}

Stream State Management
Streams in Node.js maintain several state flags: writable (boolean indicating if write() is safe), writableEnded (true after end() is called), writableFinished (true after 'finish' event), and destroyed (true after destroy()). Understanding these flags is crucial for robust stream handling.
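As a rough sketch of how these flags evolve over a stream's lifetime (assuming a recent Node.js version where file streams auto-destroy after finishing; the file name is arbitrary):

const fs = require('fs');

const stream = fs.createWriteStream('flags-demo.txt');
console.log(stream.writable, stream.writableEnded, stream.writableFinished); // true false false

stream.end('data\n');
console.log(stream.writable, stream.writableEnded); // false true (end() was called)

stream.on('finish', () => {
  console.log(stream.writableFinished); // true (all data flushed)
});

stream.on('close', () => {
  console.log(stream.destroyed); // true (underlying resource released)
});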
Backpressure Handling
When writing large amounts of data, stream.write() returns false when the internal buffer is full. Always check the return value and wait for the 'drain' event before continuing:
function writeWithBackpressure(stream, chunks, done) {
  function writeNext(i) {
    while (i < chunks.length) {
      // write() returns false when the internal buffer is full
      if (!stream.write(chunks[i++])) {
        // Resume from the same position once the buffer drains
        stream.once('drain', () => writeNext(i));
        return;
      }
    }
    done();
  }
  writeNext(0);
}

HTTP Response Streams
HTTP response objects are writable streams that can be closed by client disconnection. Always check res.headersSent and listen for 'close' events:
const express = require('express');
const app = express();

app.get('/stream', (req, res) => {
  res.on('close', () => {
    console.log('Client disconnected');
    // Stop streaming
  });

  const interval = setInterval(() => {
    if (!res.writable || res.destroyed) {
      clearInterval(interval);
      return;
    }
    res.write('data\n');
  }, 1000);
});

app.listen(3000);

Transform Streams
Transform streams are both readable and writable. The writable side can close independently of the readable side, requiring careful state management in _transform and _flush implementations.
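A minimal sketch of where that state management lives; the uppercasing logic and the trailing marker are placeholders, not a prescribed pattern:

const { Transform } = require('stream');

class UpperCase extends Transform {
  _transform(chunk, encoding, callback) {
    // Called for each chunk written to the writable side
    this.push(chunk.toString().toUpperCase());
    callback();
  }

  _flush(callback) {
    // Called after the writable side ends, before 'end' fires on the
    // readable side: the last chance to push any remaining data
    this.push('\n-- done --\n');
    callback();
  }
}

process.stdin.pipe(new UpperCase()).pipe(process.stdout);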
Stream Promises API
Node.js 15+ provides a promises-based API for streams via stream/promises, which can simplify error handling with async/await patterns.
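For example, the gzip pipeline shown earlier can be written with the promise-based pipeline from stream/promises, which rejects on failure instead of taking an error callback (the file names are the same illustrative ones used above):

const { pipeline } = require('stream/promises');
const fs = require('fs');
const zlib = require('zlib');

async function compressFile() {
  try {
    await pipeline(
      fs.createReadStream('input.txt'),
      zlib.createGzip(),
      fs.createWriteStream('output.txt.gz')
    );
    console.log('Pipeline succeeded');
  } catch (err) {
    console.error('Pipeline failed:', err.message);
  }
}

compressFile();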