This warning appears when code uses the deprecated util.pump() function for piping streams. Node.js deprecated this function in favor of the more robust stream.pipe() method and the modern stream.pipeline() function, which provide better error handling and automatic cleanup.
This deprecation warning indicates that your code (or a dependency) is using the old util.pump() function, which was superseded when Node.js introduced the Stream class and its pipe() method in version 0.4.0 (2011). util.pump() was the original utility for controlling data flow between streams before the modern streaming APIs were established: it acted as an external controller that sat above both streams and shuttled data from one to the other. In contrast, the modern pipe() method is called on the source stream itself, letting data cascade downstream more naturally. Basic pipe() has its own limitations, however: it does not destroy the source stream if the destination closes or emits an error, and it provides no completion callback. For modern Node.js development (v10+), the recommended approach is stream.pipeline(), which addresses the limitations of both util.pump() and basic pipe() by providing comprehensive error handling, automatic cleanup, and completion callbacks.
Run your Node.js application with the --trace-deprecation flag to see the full stack trace:

```js
node --trace-deprecation your-app.js
```

This will reveal whether the warning comes from your own code or from a dependency in node_modules. Look at the file paths in the trace to determine the source.
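You can also inspect deprecations from inside the process: Node emits them as 'warning' events on the process object. This is a sketch (not from the flag-based approach above, and the synthetic warning is for demonstration only) of collecting them for logging:

```js
// Capture deprecation warnings in-process, as a complement to the
// --trace-deprecation flag. 'warning' is a standard process event.
const captured = [];

process.on('warning', (warning) => {
  // warning.stack shows the call site, revealing whether the deprecated
  // API is used by your own code or by a dependency in node_modules.
  captured.push(`${warning.name}: ${warning.message}`);
});

// Demonstration only: emit a synthetic deprecation warning.
process.emitWarning('util.pump() is deprecated', 'DeprecationWarning');
```

Warnings are delivered on the next tick, so handlers registered early in startup will see deprecations triggered anywhere in the application.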
For simple stream piping without complex error handling, replace util.pump() with the pipe() method:
```js
// ❌ Deprecated
const util = require('util');
util.pump(readableStream, writableStream);

// ✅ Correct - using pipe()
readableStream.pipe(writableStream);

// Example: piping file streams
const fs = require('fs');
fs.createReadStream('input.txt')
  .pipe(fs.createWriteStream('output.txt'));
```

Note that pipe() returns the destination stream, allowing you to chain multiple pipes together.
For production code requiring proper error handling and cleanup, use the modern pipeline() function:
```js
const { pipeline } = require('stream');
const fs = require('fs');

// ✅ Best practice - pipeline with error handling
pipeline(
  fs.createReadStream('input.txt'),
  transformStream,
  fs.createWriteStream('output.txt'),
  (err) => {
    if (err) {
      console.error('Pipeline failed:', err);
    } else {
      console.log('Pipeline succeeded');
    }
  }
);
```

Using promises with pipeline (Node.js 15+):
```js
const { pipeline } = require('stream/promises');

try {
  await pipeline(
    fs.createReadStream('input.txt'),
    transformStream,
    fs.createWriteStream('output.txt')
  );
  console.log('Pipeline succeeded');
} catch (err) {
  console.error('Pipeline failed:', err);
}
```

When using pipe(), you need to manually handle errors and cleanup. Pipeline does this automatically:
```js
// ❌ Basic pipe - no error handling or cleanup
readableStream.pipe(writableStream);

// ⚠️ Manual error handling with pipe
const readable = fs.createReadStream('input.txt');
const writable = fs.createWriteStream('output.txt');

readable.on('error', (err) => {
  console.error('Read error:', err);
  writable.destroy();
});
writable.on('error', (err) => {
  console.error('Write error:', err);
  readable.destroy();
});
readable.pipe(writable);

// ✅ Pipeline handles this automatically
pipeline(
  fs.createReadStream('input.txt'),
  fs.createWriteStream('output.txt'),
  (err) => {
    if (err) console.error('Pipeline failed:', err);
  }
);
```

Pipeline automatically destroys all streams if any of them close or error.
If the warning originates from a dependency, update it to a newer version:
```js
# Identify the package from the stack trace
# Update the specific package
npm update package-name

# Check for all outdated packages
npm outdated

# Update all packages
npm update
```

If the package is no longer maintained, consider using the standalone pump module as a temporary solution:
```js
npm install pump
```

```js
const pump = require('pump');

pump(stream1, stream2, stream3, (err) => {
  if (err) console.error('Pump failed:', err);
});
```

However, migrating to pipeline() is the recommended long-term solution.
Historical context: The util.pump() function was deprecated over a decade ago when Node.js v0.4.0 introduced the Stream class with the pipe() method in 2011. The function has remained in the codebase for backward compatibility but has been flagged for removal. Any code still using util.pump() is likely extremely outdated and may have other compatibility issues with modern Node.js versions.
Why pipeline() is superior: Unlike basic pipe(), the pipeline() function addresses several critical issues:
1. Automatic cleanup: Destroys all streams if any of them error or close
2. Error propagation: Errors from any stream are caught and passed to the callback
3. Backpressure handling: Properly manages flow control to prevent memory issues
4. Completion notification: Provides a callback or promise that fires when the pipeline completes or fails
Performance considerations: The performance difference between pipe() and pipeline() is negligible for most use cases. Pipeline adds minimal overhead for the significant reliability benefits it provides. In high-throughput scenarios (thousands of concurrent streams), always profile before optimizing.
Using the standalone pump module: If you need to support older Node.js versions (8.x or earlier) before pipeline() was introduced, the standalone npm package "pump" provides similar functionality:
```js
const pump = require('pump');
const fs = require('fs');

pump(
  fs.createReadStream('input.txt'),
  transformStream,
  fs.createWriteStream('output.txt'),
  (err) => {
    if (err) return console.error('Stream error:', err);
    console.log('Streams finished');
  }
);
```

However, if you're on Node.js 10 or later, use the built-in pipeline() instead.
ESM vs CommonJS: In ES modules, import stream functions like this:
```js
import { pipeline } from 'stream';
import { createReadStream, createWriteStream } from 'fs';

// Or with promises
import { pipeline } from 'stream/promises';
```

Chaining multiple transformations: Pipeline excels at chaining multiple transform streams:
```js
const { pipeline } = require('stream');
const { createReadStream, createWriteStream } = require('fs');
const { createGzip } = require('zlib');

pipeline(
  createReadStream('input.txt'),
  createGzip(),
  createWriteStream('input.txt.gz'),
  (err) => {
    if (err) console.error('Compression failed:', err);
    else console.log('File compressed successfully');
  }
);
```