This error occurs when your Firebase project has reached its storage quota and can no longer accept uploads or other file operations. On the Spark (free) plan the limit is 5GB of stored data; on the Blaze (pay-as-you-go) plan storage is billed rather than capped, though operational quotas still apply. To resolve it, delete existing files, upgrade your plan, or request a quota increase.
The "storage/quota-exceeded" error is Firebase's way of blocking operations that would push your project past its allocated storage quota. The limits depend on your billing plan:

- **Spark Plan (Free)**: 5GB total storage, 1GB/day download bandwidth, 50,000 download operations/day, 20,000 upload operations/day
- **Blaze Plan (Pay-as-you-go)**: storage and bandwidth are billed as you use them, with much higher operational limits

When you hit the quota, Firebase blocks new uploads, large downloads, and other file operations until you reduce your storage usage, upgrade your plan, or increase your quota. This is a hard limit that prevents your project from consuming excessive resources unexpectedly. Note that quota limits differ across Firebase products—Cloud Storage, Cloud Functions, Firestore, and Hosting all have separate quotas. This error specifically refers to Cloud Storage space limits.
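In client code you can detect this condition from the error's `code` property. A minimal sketch: the Web SDK upload call is shown as a comment, and `isQuotaExceeded` is a helper introduced here for illustration.

```javascript
// True only for the quota error discussed in this guide
const isQuotaExceeded = (err) => err?.code === 'storage/quota-exceeded';

// try {
//   await uploadBytes(ref(getStorage(), 'uploads/photo.jpg'), bytes);
// } catch (err) {
//   if (isQuotaExceeded(err)) {
//     // surface a "storage full" message, or queue the upload for after cleanup
//   } else {
//     throw err;
//   }
// }

console.log(isQuotaExceeded({ code: 'storage/quota-exceeded' }));
// → true
```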
Navigate to the [Firebase Console](https://console.firebase.google.com/), select your project, and click on Storage in the left sidebar.
You'll see:
- Current usage: Total bytes used vs. quota available
- Breakdown by file: Size of each object in your bucket
- Billing plan: Whether you're on Spark (free) or Blaze (paid)
If you're on Spark plan and using close to 5GB, that's your issue. If on Blaze and still exceeding limits, check the Quotas page.
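You can also check usage from the command line. A sketch assuming gsutil is installed and `PROJECT_ID.appspot.com` is your default bucket; the byte count below is a sample value standing in for real output:

```shell
# In practice, read the real figure from gsutil:
#   used_bytes=$(gsutil du -s gs://PROJECT_ID.appspot.com | awk '{print $1}')
used_bytes=4831838208                      # sample value (~4.5GB) for illustration
quota_bytes=$((5 * 1024 * 1024 * 1024))    # 5GB Spark quota
echo "used: $((used_bytes * 100 / quota_bytes))% of Spark quota"
# → used: 90% of Spark quota
```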
The quickest fix is to delete old, unused, or test data from your bucket:
In Firebase Console:
1. Go to Storage → Your bucket
2. Browse folders and identify large or unnecessary files
3. Select files and click the delete icon (trash bin)
4. Confirm deletion
Programmatically (Web SDK):
```javascript
import { getStorage, ref, deleteObject } from "firebase/storage";

const storage = getStorage();
const fileRef = ref(storage, 'path/to/file.txt');

try {
  await deleteObject(fileRef);
  console.log('File deleted successfully');
} catch (error) {
  console.error('Failed to delete file:', error);
}
```

Programmatically (Admin SDK):
```javascript
import * as admin from 'firebase-admin';

const bucket = admin.storage().bucket();

// Delete a single file
await bucket.file('path/to/file.txt').delete();

// Delete a directory recursively
await bucket.deleteFiles({ prefix: 'path/to/folder/' });
```

Target files to delete first:
- Temporary test uploads
- Old backup files
- Build artifacts and logs
- Duplicate or superseded files
- Cache files that can be regenerated
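To decide what to delete, it can help to list objects by size. A sketch using sample data in place of an Admin SDK `getFiles()` result; the commented calls are the real API, the sample array is illustrative:

```javascript
// With the Admin SDK (not executed here):
//   const [files] = await admin.storage().bucket().getFiles();
//   const entries = files.map(f => ({ name: f.name, size: Number(f.metadata.size) }));
// Sample data standing in for that result:
const entries = [
  { name: 'backups/db-2023.tar.gz', size: 734003200 },
  { name: 'tmp/test-upload.bin', size: 10485760 },
  { name: 'videos/demo.mp4', size: 262144000 },
];

// Largest objects first, the best candidates for deletion
const largest = [...entries].sort((a, b) => b.size - a.size).slice(0, 2);
console.log(largest.map(f => f.name));
// → [ 'backups/db-2023.tar.gz', 'videos/demo.mp4' ]
```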
To prevent quota issues in the future, configure Cloud Storage lifecycle rules to automatically delete old files:
In Firebase Console:
1. Go to Storage → Your bucket
2. Click the Lifecycle tab
3. Click Add a rule
4. Configure:
- Action: Delete object
- Condition: Age in days (e.g., 90)
- Optional: Add storage class condition
5. Click Create
Programmatically (gsutil CLI):

```shell
# Create lifecycle.json
cat > lifecycle.json <<EOF
{
  "lifecycle": {
    "rule": [
      {
        "action": {"type": "Delete"},
        "condition": {"age": 90}
      }
    ]
  }
}
EOF

# Apply to the bucket
gsutil lifecycle set lifecycle.json gs://your-bucket-name
```

Lifecycle rules automatically:
- Delete files older than specified days
- Move files to cheaper storage classes
- Clean up temporary uploads that weren't manually deleted
Recommended rules:
- Delete temp/cache files after 7-30 days
- Delete test uploads older than 90 days
- Move infrequently accessed data to Nearline/Coldline storage
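The recommended rules above can be expressed in a single lifecycle policy. A sketch; the ages and prefixes are illustrative, and the `matchesPrefix` condition requires a reasonably recent gsutil/API version:

```json
{
  "lifecycle": {
    "rule": [
      {
        "action": {"type": "Delete"},
        "condition": {"age": 30, "matchesPrefix": ["tmp/", "cache/"]}
      },
      {
        "action": {"type": "SetStorageClass", "storageClass": "NEARLINE"},
        "condition": {"age": 90}
      }
    ]
  }
}
```

Apply it the same way as above: `gsutil lifecycle set lifecycle.json gs://your-bucket-name`.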
If you need more than 5GB and can't delete files, upgrade to the Blaze (pay-as-you-go) plan:
1. Go to [Firebase Console](https://console.firebase.google.com/) → Project Settings (gear icon)
2. Click the Billing tab
3. Select Upgrade to Blaze plan (or link an existing Google Cloud billing account)
4. Choose your payment method
5. Click Upgrade
Important: Blaze plan charges for storage and bandwidth:
- Storage: roughly $0.02-0.03 per GB/month for standard storage (the exact rate depends on region and storage class; check the current pricing page)
- Bandwidth (egress): varies by region, typically $0.12-0.23 per GB

Cost estimates (storage only):
- 100 GB storage: ~$2-3/month
- 1 TB storage: ~$20-30/month
- Add bandwidth costs for downloads
Tip: Use the [GCP Pricing Calculator](https://cloud.google.com/products/calculator) to estimate costs before upgrading.
If you're on Blaze plan but still hitting limits, you can request a quota increase:
1. Go to Google Cloud Console → IAM & Admin → Quotas
2. Search for "Cloud Storage"
3. Select the quota you want to increase:
- Per-bucket storage limit
- API request rate
- Other relevant quotas
4. Click the checkbox next to the quota
5. Click Edit Quotas at the top
6. Enter your desired limit
7. Click Done and submit for review
Google typically responds to quota increase requests within 24-48 hours, though approval is not guaranteed.
Common quota increase scenarios:
- Per-bucket limit: Default is no hard limit, but very large buckets (>10TB) may need approval
- Storage transfer quota: If uploading/downloading >1TB/day
- API operations: If making >1M operations/day
Prevent quota issues by validating uploads before sending to Firebase:
File size validation:
```javascript
const MAX_FILE_SIZE = 100 * 1024 * 1024;        // 100MB per file
const MAX_TOTAL_SIZE = 4 * 1024 * 1024 * 1024;  // 4GB soft cap, below the 5GB Spark quota

async function uploadFile(file) {
  if (file.size > MAX_FILE_SIZE) {
    throw new Error(`File too large: ${Math.round(file.size / 1024 / 1024)}MB exceeds limit`);
  }

  // getMetadata() here is an app-specific helper that tracks your total usage
  // (e.g. a counter kept in Firestore) -- the Storage SDK does not report bucket totals.
  const metadata = await getMetadata();
  const remaining = MAX_TOTAL_SIZE - metadata.usedBytes;
  if (file.size > remaining) {
    throw new Error(`Insufficient quota: ${remaining} bytes remaining`);
  }

  // uploadToFirebase() is likewise an app-specific wrapper around uploadBytes()
  await uploadToFirebase(file);
}
```

Compress media before upload:
```javascript
// Example: compress images before storing (uses the compressorjs package)
import Compressor from 'compressorjs';

const compressImage = (file) => {
  return new Promise((resolve, reject) => {
    new Compressor(file, {
      quality: 0.8,
      maxWidth: 1920,
      maxHeight: 1080,
      success: resolve,
      error: reject
    });
  });
};

const compressed = await compressImage(imageFile);
await uploadToFirebase(compressed);
```

Implement expiry for temporary files:
```javascript
// Mark files with TTL metadata (expiresAt is a custom field your cleanup job reads)
const metadata = {
  customMetadata: {
    expiresAt: new Date(Date.now() + 7 * 24 * 60 * 60 * 1000).toISOString(), // 7 days
    purpose: 'temporary'
  }
};

await uploadBytes(fileRef, data, metadata);
```

Cloud Functions Build Artifacts: The `[region].artifacts.[project-name].appspot.com` bucket is used for Cloud Functions build images and can grow large when you deploy functions repeatedly during development. If quotas are exceeded:
```shell
# View the size of the artifacts bucket
gsutil du -s gs://[region].artifacts.[project-name].appspot.com

# Build images are regenerated on the next deploy, so the bucket contents
# can be cleared safely if it has grown too large
gsutil -m rm -r "gs://[region].artifacts.[project-name].appspot.com/**"
```

Multi-region considerations: Multiple Cloud Storage buckets are only available on the Blaze plan; on Spark, the quota applies to your project's single default bucket. If you use multiple buckets for different purposes, monitor each bucket's usage separately.
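The `expiresAt` metadata shown earlier does nothing by itself; a periodic job (for example a scheduled Cloud Function) has to read it and delete expired objects. A sketch in which only the expiry check is concrete, with the Admin SDK loop shown as comments:

```javascript
// Pure expiry check, testable without Firebase (expiresAt is the custom
// metadata field set at upload time; the field name is this guide's convention)
const isExpired = (meta, now = Date.now()) =>
  meta.customMetadata?.expiresAt !== undefined &&
  Date.parse(meta.customMetadata.expiresAt) < now;

// Cleanup pass with the Admin SDK (hypothetical, not executed here):
//   const [files] = await bucket.getFiles();
//   for (const f of files) {
//     const [meta] = await f.getMetadata();
//     if (isExpired(meta)) await f.delete();
//   }

console.log(isExpired({ customMetadata: { expiresAt: '2000-01-01T00:00:00Z' } }));
// → true (any past date is expired)
```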
Integration with Firestore: Firestore has separate quotas from Cloud Storage. If storing large binary data, use Cloud Storage references in Firestore documents rather than embedding binary data directly—this is both cheaper and avoids hitting document size limits.
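For example, a document can carry only the Storage path of its attachment. A sketch: `buildAttachmentDoc`, the field names, and the collection name are illustrative, and the Firestore/Storage calls are shown as comments.

```javascript
// Build a lightweight Firestore payload that references the Storage object
// instead of embedding the bytes (field names are illustrative)
const buildAttachmentDoc = (storagePath, sizeBytes, contentType) => ({
  storagePath,   // e.g. 'uploads/user123/photo.jpg'
  sizeBytes,
  contentType,
});

// With the Web SDK (not executed here):
//   await uploadBytes(ref(storage, path), bytes);
//   await setDoc(doc(db, 'attachments', id),
//                buildAttachmentDoc(path, bytes.byteLength, 'image/jpeg'));

const docData = buildAttachmentDoc('uploads/user123/photo.jpg', 1048576, 'image/jpeg');
console.log(docData.storagePath);
// → uploads/user123/photo.jpg
```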
Monitoring and alerts: Set up budget alerts in Google Cloud Console to get notified before hitting expensive quota thresholds:
```shell
# Create a budget alert (threshold rules take decimal fractions: 0.5 = 50%)
gcloud billing budgets create \
  --billing-account=BILLING_ACCOUNT_ID \
  --display-name="Storage Budget" \
  --budget-amount=100USD \
  --threshold-rule=percent=0.5 \
  --threshold-rule=percent=1.0
```

Downgrading after testing: If you upgraded to the Blaze plan temporarily for testing, remember to downgrade back to Spark after freeing up space. However, you cannot downgrade while usage exceeds Spark plan limits—delete files until usage is below 5GB first.
Spark plan bandwidth limits: The 1GB/day bandwidth limit on Spark plan is often exceeded unintentionally. A single high-resolution video (100MB) downloaded 10 times hits the limit. If bandwidth quota is your limiting factor, upgrade to Blaze—the bandwidth cost is often cheaper than time spent optimizing.