This Firebase Authentication error occurs when you attempt to import more than 1,000 users in a single batch operation using the Admin SDK's importUsers() API. Firebase limits each import request to 1,000 users for performance and stability. The solution is to split your user import into smaller batches of 1,000 users or fewer.
The "auth/maximum-user-count-exceeded" error in Firebase Authentication indicates that your bulk user import operation has exceeded Firebase's per-request limit. The Firebase Admin SDK's `importUsers()` API is designed to handle large-scale user migrations efficiently, but it enforces a strict limit of 1,000 users per API call. This error typically appears during: 1. Migrating users from another authentication system to Firebase 2. Importing users from CSV files or database exports 3. Bulk user creation scripts that attempt to process too many users at once 4. Automated migration tools that don't properly batch requests Firebase enforces this limit to ensure: - Reliable API performance and stability - Reasonable request/response sizes - Protection against accidental or malicious overload - Fair resource allocation across all Firebase projects The limit is separate from Firebase's account upload quota (3.6M accounts/minute per project), which governs the overall rate at which you can import users across multiple batches.
Firebase Authentication has two key limits for user imports:
Per-Request Limit:
- Maximum 1,000 users per single importUsers() API call
- This is a hard limit enforced by Firebase
- Exceeding this limit triggers the "auth/maximum-user-count-exceeded" error
Per-Project Quota:
- 3.6M account uploads per minute per project
- This is the overall throughput limit
- Allows approximately 3,600 batches of 1,000 users per minute (roughly 60 batches per second)
Important characteristics:
- User import operations are optimized for speed
- Firebase does NOT check for duplicate fields during import
- Importing a user with duplicate uid will replace the existing user
- Duplicate emails or phone numbers create additional users (no validation)
- The API attempts to upload all users even if some fail
- Results include summary of successful/failed imports with error details per user
Planning your migration:
If you have 50,000 users to import:
- You need 50 batches (50,000 ÷ 1,000)
- At 1 batch per second, this takes ~50 seconds
- Well within the 3.6M/minute quota
- Add delays between batches to avoid hitting rate limits
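For context, the error surfaces as soon as a single importUsers() call receives more than 1,000 records. A minimal sketch (assuming the Admin SDK is already initialized, with allUsers as a placeholder for your full, unbatched list):
async function showTheError(allUsers) {
  try {
    // A list larger than 1,000 users is rejected with auth/maximum-user-count-exceeded
    await admin.auth().importUsers(allUsers);
  } catch (error) {
    console.error(error.code, error.message); // 'auth/maximum-user-count-exceeded'
  }
}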
Split your user import into batches of 1,000 users or fewer:
Basic batching implementation:
const admin = require('firebase-admin');
// Initialize Firebase Admin SDK
admin.initializeApp({
credential: admin.credential.applicationDefault(),
});
async function importUsersInBatches(users) {
const BATCH_SIZE = 1000; // Firebase's maximum per request
const totalUsers = users.length;
let importedCount = 0;
let failedCount = 0;
const errors = [];
console.log(`Starting import of ${totalUsers} users...`);
// Process users in batches
for (let i = 0; i < totalUsers; i += BATCH_SIZE) {
const batch = users.slice(i, i + BATCH_SIZE);
const batchNumber = Math.floor(i / BATCH_SIZE) + 1;
const totalBatches = Math.ceil(totalUsers / BATCH_SIZE);
console.log(`Processing batch ${batchNumber}/${totalBatches} (${batch.length} users)...`);
try {
// Import this batch
const result = await admin.auth().importUsers(batch);
// Track results
importedCount += result.successCount;
failedCount += result.failureCount;
// Log any per-user errors
if (result.errors && result.errors.length > 0) {
result.errors.forEach((error) => {
errors.push({
batch: batchNumber,
index: error.index,
error: error.error,
});
console.error(
` Failed to import user at index ${error.index}: ${error.error.code} - ${error.error.message}`
);
});
}
console.log(` Batch ${batchNumber}: ${result.successCount} succeeded, ${result.failureCount} failed`);
// Optional: Add delay between batches to avoid rate limits
if (i + BATCH_SIZE < totalUsers) {
await new Promise(resolve => setTimeout(resolve, 100)); // 100ms delay
}
} catch (error) {
console.error(`Batch ${batchNumber} failed completely:`, error.code, error.message);
// If we hit maximum-user-count-exceeded, this batch is too large
if (error.code === 'auth/maximum-user-count-exceeded') {
console.error('Batch size exceeds 1,000 users - check your batching logic!');
throw error; // Stop processing
}
// For other errors, continue with next batch
failedCount += batch.length;
}
}
console.log(`
Import complete:`);
console.log(` Total users: ${totalUsers}`);
console.log(` Successfully imported: ${importedCount}`);
console.log(` Failed: ${failedCount}`);
console.log(` Total batches: ${Math.ceil(totalUsers / BATCH_SIZE)}`);
return {
totalUsers,
importedCount,
failedCount,
errors,
};
}
// Example: Load users from JSON file
const fs = require('fs');
const users = JSON.parse(fs.readFileSync('users.json', 'utf8'));
// Convert to Firebase user format
const firebaseUsers = users.map(user => ({
uid: user.id,
email: user.email,
emailVerified: user.emailVerified || false,
displayName: user.displayName,
photoURL: user.photoURL,
disabled: user.disabled || false,
}));
// Import users
importUsersInBatches(firebaseUsers)
.then(results => {
console.log('Import results:', results);
process.exit(0);
})
.catch(error => {
console.error('Import failed:', error);
process.exit(1);
});
For faster imports, process multiple batches concurrently while respecting rate limits:
const admin = require('firebase-admin');
const pLimit = require('p-limit'); // npm install p-limit
admin.initializeApp({
credential: admin.credential.applicationDefault(),
});
async function importUsersWithConcurrency(users) {
const BATCH_SIZE = 1000;
const CONCURRENT_BATCHES = 5; // Process 5 batches simultaneously
const totalUsers = users.length;
const limit = pLimit(CONCURRENT_BATCHES);
console.log(`Importing ${totalUsers} users with ${CONCURRENT_BATCHES} concurrent batches...`);
// Split users into batches
const batches = [];
for (let i = 0; i < totalUsers; i += BATCH_SIZE) {
batches.push({
users: users.slice(i, i + BATCH_SIZE),
batchNumber: Math.floor(i / BATCH_SIZE) + 1,
});
}
console.log(`Created ${batches.length} batches`);
// Process batches with concurrency limit
const results = await Promise.all(
batches.map(({ users: batchUsers, batchNumber }) =>
limit(async () => {
const startTime = Date.now();
try {
const result = await admin.auth().importUsers(batchUsers);
const duration = Date.now() - startTime;
console.log(
`Batch ${batchNumber}/${batches.length}: ${result.successCount} succeeded, ${result.failureCount} failed (${duration}ms)`
);
return {
batchNumber,
successCount: result.successCount,
failureCount: result.failureCount,
errors: result.errors || [],
duration,
};
} catch (error) {
console.error(`Batch ${batchNumber} failed:`, error.code, error.message);
return {
batchNumber,
successCount: 0,
failureCount: batchUsers.length,
errors: [{ error: { code: error.code, message: error.message } }],
duration: Date.now() - startTime,
};
}
})
)
);
// Aggregate results
const summary = results.reduce(
(acc, result) => ({
totalImported: acc.totalImported + result.successCount,
totalFailed: acc.totalFailed + result.failureCount,
totalBatches: acc.totalBatches + 1,
avgDuration: acc.avgDuration + result.duration,
errors: [...acc.errors, ...result.errors],
}),
{ totalImported: 0, totalFailed: 0, totalBatches: 0, avgDuration: 0, errors: [] }
);
summary.avgDuration = Math.round(summary.avgDuration / summary.totalBatches);
console.log(`
Import Summary:`);
console.log(` Total users: ${totalUsers}`);
console.log(` Successfully imported: ${summary.totalImported}`);
console.log(` Failed: ${summary.totalFailed}`);
console.log(` Total batches: ${summary.totalBatches}`);
console.log(` Average batch duration: ${summary.avgDuration}ms`);
console.log(` Total errors: ${summary.errors.length}`);
return summary;
}
// Example usage
const users = require('./users.json');
importUsersWithConcurrency(users)
.then(summary => {
if (summary.totalFailed > 0) {
console.log(`
Writing ${summary.errors.length} errors to error.log`);
require('fs').writeFileSync(
'import-errors.json',
JSON.stringify(summary.errors, null, 2)
);
}
process.exit(summary.totalFailed > 0 ? 1 : 0);
})
.catch(error => {
console.error('Import failed:', error);
process.exit(1);
});
Rate limiting considerations:
- 3.6M users/minute = 60,000 users/second
- With 5 concurrent batches of 1,000 users, you import 5,000 users per round
- Each round takes ~1-2 seconds
- This rate is well within Firebase quotas
- Adjust CONCURRENT_BATCHES based on your project's quota and needs
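If you want to throttle more precisely than a fixed 100ms pause, one rough sketch is to derive the per-batch delay from a target throughput (the 3,000,000 users/minute target below is an assumed safety margin, not a Firebase value):
// Rough throttling sketch: space batch starts so total throughput stays under a target.
const BATCH_SIZE = 1000;
const TARGET_USERS_PER_MINUTE = 3000000; // assumed margin below the 3.6M/minute quota

// Minimum milliseconds between batch starts to stay under the target rate (~20ms here)
const delayPerBatchMs = Math.ceil((BATCH_SIZE / TARGET_USERS_PER_MINUTE) * 60000);

async function waitBetweenBatches() {
  await new Promise(resolve => setTimeout(resolve, delayPerBatchMs));
}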
When importing users with passwords, you need to specify the hashing algorithm:
const admin = require('firebase-admin');
async function importUsersWithPasswords(users) {
const BATCH_SIZE = 1000;
// Prepare users with password hashes
const firebaseUsers = users.map(user => ({
uid: user.id,
email: user.email,
emailVerified: user.emailVerified,
displayName: user.displayName,
photoURL: user.photoURL,
disabled: false,
// Password hash from your existing system
passwordHash: Buffer.from(user.passwordHash, 'base64'),
passwordSalt: Buffer.from(user.passwordSalt, 'base64'),
}));
// Define password hashing options
const options = {
hash: {
algorithm: 'BCRYPT', // or 'SCRYPT', 'PBKDF2_SHA256', 'STANDARD_SCRYPT', etc.
// Additional parameters depend on algorithm:
// For BCRYPT: no additional parameters needed
// For SCRYPT: rounds, memoryCost, etc.
},
};
// Import in batches
for (let i = 0; i < firebaseUsers.length; i += BATCH_SIZE) {
const batch = firebaseUsers.slice(i, i + BATCH_SIZE);
try {
const result = await admin.auth().importUsers(batch, options);
console.log(`Batch imported: ${result.successCount} succeeded, ${result.failureCount} failed`);
// Handle errors
if (result.errors && result.errors.length > 0) {
result.errors.forEach(error => {
console.error(`User ${batch[error.index].email} failed: ${error.error.message}`);
});
}
} catch (error) {
if (error.code === 'auth/maximum-user-count-exceeded') {
console.error('Batch exceeds 1,000 user limit!');
throw error;
}
console.error('Batch import failed:', error.message);
}
// Small delay between batches
await new Promise(resolve => setTimeout(resolve, 100));
}
}
// Example: Import from bcrypt-hashed passwords
const usersWithBcrypt = [
{
id: 'user1',
email: '[email protected]',
emailVerified: true,
displayName: 'User One',
photoURL: null,
passwordHash: 'JDJhJDEwJC4uLg==', // Base64-encoded bcrypt hash
passwordSalt: '', // Bcrypt includes salt in hash
},
// ... more users
];
importUsersWithPasswords(usersWithBcrypt)
.then(() => console.log('Import complete'))
.catch(error => console.error('Import failed:', error));
Supported hashing algorithms:
- BCRYPT
- SCRYPT (Firebase's modified scrypt)
- STANDARD_SCRYPT
- PBKDF_SHA1
- PBKDF2_SHA256
- HMAC_MD5
- HMAC_SHA1
- HMAC_SHA256
- HMAC_SHA512
- MD5
- SHA1
- SHA256
- SHA512
Important: Make sure your password hashes are properly Base64-encoded before passing to Firebase.
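For example, if your export contains raw bcrypt hash strings, one way to prepare them for the format used in the code above (a sketch; rawBcryptHash is a placeholder value) is:
// A raw bcrypt hash string exported from your existing system (placeholder value)
const rawBcryptHash = '$2a$10$...';

// Base64-encode it for storage/transport, then decode it back to the Buffer importUsers() expects
const base64Hash = Buffer.from(rawBcryptHash, 'utf8').toString('base64');
const passwordHash = Buffer.from(base64Hash, 'base64'); // the Admin SDK takes a Buffer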
Track import progress and implement retry logic for failed batches:
const admin = require('firebase-admin');
const fs = require('fs');
class UserImporter {
constructor(users) {
this.users = users;
this.BATCH_SIZE = 1000;
this.MAX_RETRIES = 3;
this.progressFile = 'import-progress.json';
this.loadProgress();
}
loadProgress() {
if (fs.existsSync(this.progressFile)) {
this.progress = JSON.parse(fs.readFileSync(this.progressFile, 'utf8'));
console.log(`Resuming from batch ${this.progress.lastCompletedBatch + 2}`);
} else {
this.progress = {
lastCompletedBatch: -1,
totalImported: 0,
totalFailed: 0,
failedBatches: [],
};
}
}
saveProgress() {
fs.writeFileSync(this.progressFile, JSON.stringify(this.progress, null, 2));
}
async importBatch(batch, batchNumber, retryCount = 0) {
try {
const result = await admin.auth().importUsers(batch);
console.log(
`Batch ${batchNumber}: ${result.successCount} succeeded, ${result.failureCount} failed`
);
// Log individual errors
if (result.errors && result.errors.length > 0) {
const errorLog = {
batch: batchNumber,
timestamp: new Date().toISOString(),
errors: result.errors.map(err => ({
index: err.index,
uid: batch[err.index].uid,
email: batch[err.index].email,
errorCode: err.error.code,
errorMessage: err.error.message,
})),
};
// Append to error log file
fs.appendFileSync(
'import-errors.jsonl',
JSON.stringify(errorLog) + '\n'
);
}
return result;
} catch (error) {
if (error.code === 'auth/maximum-user-count-exceeded') {
console.error(`Batch ${batchNumber} exceeds 1,000 user limit - this should not happen!`);
throw error;
}
// Retry on transient errors
if (retryCount < this.MAX_RETRIES) {
const delay = Math.pow(2, retryCount) * 1000; // Exponential backoff
console.log(`Batch ${batchNumber} failed, retrying in ${delay}ms... (attempt ${retryCount + 1}/${this.MAX_RETRIES})`);
await new Promise(resolve => setTimeout(resolve, delay));
return this.importBatch(batch, batchNumber, retryCount + 1);
}
console.error(`Batch ${batchNumber} failed after ${this.MAX_RETRIES} retries:`, error.message);
// Mark entire batch as failed
return {
successCount: 0,
failureCount: batch.length,
errors: [],
};
}
}
async import() {
const totalBatches = Math.ceil(this.users.length / this.BATCH_SIZE);
const startBatch = this.progress.lastCompletedBatch + 1;
console.log(`Importing ${this.users.length} users in ${totalBatches} batches...`);
console.log(`Starting from batch ${startBatch + 1}`);
for (let i = startBatch; i < totalBatches; i++) {
const startIndex = i * this.BATCH_SIZE;
const batch = this.users.slice(startIndex, startIndex + this.BATCH_SIZE);
const batchNumber = i + 1;
console.log(`
Processing batch ${batchNumber}/${totalBatches} (${batch.length} users)...`);
const result = await this.importBatch(batch, batchNumber);
// Update progress
this.progress.totalImported += result.successCount;
this.progress.totalFailed += result.failureCount;
this.progress.lastCompletedBatch = i;
if (result.failureCount > 0) {
this.progress.failedBatches.push(batchNumber);
}
this.saveProgress();
// Progress bar
const percentComplete = ((i + 1) / totalBatches * 100).toFixed(1);
console.log(`Progress: ${percentComplete}% (${this.progress.totalImported} imported, ${this.progress.totalFailed} failed)`);
// Small delay between batches
if (i < totalBatches - 1) {
await new Promise(resolve => setTimeout(resolve, 100));
}
}
console.log(`
=== Import Complete ===`);
console.log(`Total users: ${this.users.length}`);
console.log(`Successfully imported: ${this.progress.totalImported}`);
console.log(`Failed: ${this.progress.totalFailed}`);
console.log(`Failed batches: ${this.progress.failedBatches.join(', ') || 'none'}`);
// Clean up progress file on success
if (this.progress.totalFailed === 0) {
fs.unlinkSync(this.progressFile);
console.log('Progress file cleaned up');
}
return this.progress;
}
}
// Usage
admin.initializeApp({
credential: admin.credential.applicationDefault(),
});
const users = JSON.parse(fs.readFileSync('users.json', 'utf8'));
const importer = new UserImporter(users);
importer.import()
.then(results => {
console.log('Final results:', results);
process.exit(results.totalFailed > 0 ? 1 : 0);
})
.catch(error => {
console.error('Import failed:', error);
process.exit(1);
});
Features:
- Progress tracking with resume capability
- Automatic retry with exponential backoff
- Detailed error logging (JSONL format)
- Progress percentage display
- Failed batch tracking
Add validation to prevent the error before it occurs:
const admin = require('firebase-admin');
function validateBatchSize(users, batchSize = 1000) {
if (!Array.isArray(users)) {
throw new Error('Users must be an array');
}
if (users.length === 0) {
throw new Error('Users array is empty');
}
if (users.length > batchSize) {
throw new Error(
`Batch size (${users.length}) exceeds Firebase limit (${batchSize}). Split into multiple batches.`
);
}
return true;
}
async function safeImportUsers(users) {
const BATCH_SIZE = 1000;
// Validate entire array
if (!Array.isArray(users) || users.length === 0) {
throw new Error('Invalid users array');
}
// If within limit, import directly
if (users.length <= BATCH_SIZE) {
validateBatchSize(users, BATCH_SIZE);
return admin.auth().importUsers(users);
}
// Otherwise, batch automatically
console.log(`${users.length} users exceed limit, batching automatically...`);
const results = {
totalSuccess: 0,
totalFailure: 0,
batchResults: [],
};
for (let i = 0; i < users.length; i += BATCH_SIZE) {
const batch = users.slice(i, i + BATCH_SIZE);
// Double-check batch size (should never fail, but defensive)
if (batch.length > BATCH_SIZE) {
throw new Error(`Internal error: Batch ${i / BATCH_SIZE + 1} exceeds limit`);
}
try {
const result = await admin.auth().importUsers(batch);
results.totalSuccess += result.successCount;
results.totalFailure += result.failureCount;
results.batchResults.push({
batchNumber: i / BATCH_SIZE + 1,
successCount: result.successCount,
failureCount: result.failureCount,
});
console.log(`Batch ${i / BATCH_SIZE + 1}: ${result.successCount} succeeded`);
} catch (error) {
console.error(`Batch ${i / BATCH_SIZE + 1} failed:`, error.message);
results.totalFailure += batch.length;
}
await new Promise(resolve => setTimeout(resolve, 100));
}
return results;
}
// Example with validation
const users = loadUsersFromDatabase(); // Your user loading logic
safeImportUsers(users)
.then(results => {
console.log('Import complete:', results);
})
.catch(error => {
console.error('Import failed:', error.message);
process.exit(1);
});
Validation checklist:
- ✓ Check if users is an array
- ✓ Check if users array is not empty
- ✓ Check if batch size ≤ 1,000
- ✓ Validate user object structure
- ✓ Ensure required fields are present (uid, email, etc.)
- ✓ Check for duplicate UIDs within batch
### Firebase Import Quotas Deep Dive
Per-Request vs. Per-Minute Limits:
- Per-Request: 1,000 users max per importUsers() call
- Per-Minute: 3.6M account uploads/minute per project
- Per-Second: Approximately 60,000 users/second
These limits work together:
- You can make multiple concurrent requests as long as total throughput stays under 3.6M/minute
- With 1,000 user batches, you can process ~60 batches per second
- In practice, 5-10 concurrent batches is safe and efficient
Project Quotas:
Firebase Authentication is part of the Identity Toolkit API, which has these additional limits:
- Account downloads: 21,000 requests/minute
- Bulk delete: 3,000 requests/minute
- Individual account delete: 10 accounts/second
Quota Monitoring:
- Monitor your usage in Google Cloud Console under "APIs & Services" → "Identity Toolkit API"
- Set up alerts for approaching quota limits
- Contact Firebase support to request quota increases if needed
### Import Performance Optimization
Concurrent Batch Processing:
Process multiple batches simultaneously for faster imports:
- 1 batch/second = 1,000 users/second (slow)
- 5 concurrent batches = 5,000 users/second (better)
- 10 concurrent batches = 10,000 users/second (fast)
Network Considerations:
- Use regional Firebase functions for faster latency
- Process imports from cloud environments (GCP, AWS) for better bandwidth
- Avoid importing from local machines for large datasets
Memory Management:
When importing millions of users:
- Don't load all users into memory at once
- Stream from database or file
- Process in chunks with garbage collection breaks
- Monitor heap usage to avoid OOM errors
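A minimal streaming sketch, assuming users are stored one JSON object per line in a users.jsonl file (the filename and field names are assumptions, and the Admin SDK is already initialized):
const fs = require('fs');
const readline = require('readline');

async function streamImport(filePath) {
  const BATCH_SIZE = 1000;
  const rl = readline.createInterface({
    input: fs.createReadStream(filePath),
    crlfDelay: Infinity,
  });

  let batch = [];
  for await (const line of rl) {
    if (!line.trim()) continue;
    const user = JSON.parse(line);
    batch.push({ uid: user.id, email: user.email });

    if (batch.length === BATCH_SIZE) {
      const result = await admin.auth().importUsers(batch);
      console.log(`${result.successCount} imported, ${result.failureCount} failed`);
      batch = []; // release the chunk so it can be garbage collected
    }
  }

  // Import any remaining users
  if (batch.length > 0) {
    const result = await admin.auth().importUsers(batch);
    console.log(`${result.successCount} imported, ${result.failureCount} failed`);
  }
}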
### Error Handling Strategies
Individual User Errors:
The importUsers() API returns detailed error information for each failed user:
{
successCount: 950,
failureCount: 50,
errors: [
{
index: 23,
error: {
code: 'auth/invalid-email',
message: 'The email address is improperly formatted.'
}
},
// ... more errors
]
}
Common per-user errors:
- auth/invalid-email: Malformed email address
- auth/invalid-uid: Invalid user ID format
- auth/invalid-password-hash: Password hash format error
- auth/uid-already-exists: duplicate UID (reported by other Admin SDK methods; during import a duplicate uid silently replaces the existing user)
- auth/email-already-exists: duplicate email (reported by other Admin SDK methods; during import a duplicate email creates a second account)
Error Recovery:
1. Log all failed users with details
2. Fix data issues (malformed emails, invalid UIDs, etc.)
3. Retry failed users in a separate batch
4. Implement idempotency to safely retry entire batches
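One way to retry only the users that failed (step 3 above) is to rebuild a smaller batch from the result's errors array. A sketch, assuming the original batch array is still in scope:
async function retryFailedUsers(batch, result) {
  // Collect just the users that failed in the previous attempt
  const failedUsers = (result.errors || []).map(err => batch[err.index]);
  if (failedUsers.length === 0) return null;

  // Fix data issues here (normalize emails, drop invalid fields, etc.) before retrying
  console.log(`Retrying ${failedUsers.length} failed users...`);
  return admin.auth().importUsers(failedUsers);
}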
### Data Validation Before Import
Critical validations:
- UID uniqueness: Check for duplicate UIDs across your entire dataset
- Email format: Validate email addresses before import
- Password hashes: Ensure hashes are properly Base64-encoded
- Phone numbers: Format according to E.164 standard (+1234567890)
- Required fields: Ensure uid is present for all users
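A lightweight pre-import check along the lines of the list above (a sketch; the regexes are simplified and intentionally permissive, so tighten them to match your data rules):
// Simplified format checks for email and E.164 phone numbers
const EMAIL_RE = /^[^\s@]+@[^\s@]+\.[^\s@]+$/;
const E164_RE = /^\+[1-9]\d{1,14}$/;

function validateUser(user, index) {
  const problems = [];
  if (!user.uid) problems.push(`User ${index}: missing uid`);
  if (user.email && !EMAIL_RE.test(user.email)) problems.push(`User ${index}: invalid email "${user.email}"`);
  if (user.phoneNumber && !E164_RE.test(user.phoneNumber)) problems.push(`User ${index}: phone not in E.164 format`);
  return problems;
}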
Duplicate handling:
Firebase does NOT validate unique fields during import:
- Duplicate uid → Replaces existing user (data loss!)
- Duplicate email → Creates new user (multiple accounts!)
- Duplicate phoneNumber → Creates new user
Pre-import deduplication:
function deduplicateUsers(users) {
const seenUids = new Set();
const seenEmails = new Set();
const uniqueUsers = [];
const duplicates = [];
for (const user of users) {
if (seenUids.has(user.uid)) {
duplicates.push({ reason: 'duplicate-uid', user });
continue;
}
if (user.email && seenEmails.has(user.email)) {
duplicates.push({ reason: 'duplicate-email', user });
continue;
}
seenUids.add(user.uid);
if (user.email) seenEmails.add(user.email);
uniqueUsers.push(user);
}
return { uniqueUsers, duplicates };
}
### Migration from Other Auth Systems
Common migration scenarios:
1. Auth0 → Firebase: Export users with bcrypt hashes
2. AWS Cognito → Firebase: Export user pool, convert to Firebase format
3. Custom auth → Firebase: Map your schema to Firebase user structure
4. LDAP/Active Directory → Firebase: Sync users with custom scripts
Password migration:
If your existing system uses supported hashing (bcrypt, scrypt, etc.):
- Export password hashes and salts
- Include in Firebase import with proper algorithm configuration
- Users can sign in immediately without password reset
If your system uses unsupported hashing:
- Import users without passwords
- Trigger password reset emails after import
- Or implement custom authentication flow with Firebase custom tokens
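If you take the reset-email route from the list above, one option is to generate reset links with the Admin SDK after the import finishes (a sketch; delivering the email is left to your own mailer):
async function sendResetLinks(emails) {
  for (const email of emails) {
    try {
      // generatePasswordResetLink() returns a link you can embed in your own email template
      const link = await admin.auth().generatePasswordResetLink(email);
      console.log(`Reset link for ${email}: ${link}`);
      // TODO: send the link using your own email provider
    } catch (error) {
      console.error(`Could not generate reset link for ${email}:`, error.message);
    }
  }
}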
### Testing Import Scripts
Test with small datasets first:
1. Test with 10 users to validate format
2. Test with 1,000 users to verify batching
3. Test with 10,000 users to check performance
4. Monitor quotas and errors at each scale
Dry run mode:
async function dryRunImport(users, batchSize = 1000) {
console.log('=== DRY RUN MODE ===');
console.log(`Total users: ${users.length}`);
console.log(`Batches required: ${Math.ceil(users.length / batchSize)}`);
// Validate data without importing
const issues = [];
users.forEach((user, index) => {
if (!user.uid) issues.push(`User ${index}: missing uid`);
if (!user.email && !user.phoneNumber) {
issues.push(`User ${index}: missing email and phone`);
}
});
if (issues.length > 0) {
console.log(`Found ${issues.length} validation issues:`);
issues.slice(0, 10).forEach(issue => console.log(` - ${issue}`));
} else {
console.log('✓ All users passed validation');
}
return { valid: issues.length === 0, issues };
}
### Rollback and Recovery
If import goes wrong:
1. Stop the import: Kill the script immediately
2. Assess damage: Check how many users were imported
3. Delete imported users: Use batch delete (3,000 requests/minute limit)
4. Fix data issues: Correct validation problems
5. Resume import: Use progress tracking to resume from last successful batch
Batch delete for rollback:
async function deleteImportedUsers(uids) {
const BATCH_SIZE = 1000; // Same as import limit
for (let i = 0; i < uids.length; i += BATCH_SIZE) {
const batch = uids.slice(i, i + BATCH_SIZE);
try {
const result = await admin.auth().deleteUsers(batch);
console.log(`Deleted ${result.successCount} users, ${result.failureCount} failed`);
} catch (error) {
console.error('Delete batch failed:', error.message);
}
}
}