GitHub rejects pushes containing files larger than 100 MB. To fix the error, remove the large file from your repository history with BFG Repo-Cleaner or git filter-repo, then either keep the file out of the repository or track it with Git LFS (Large File Storage).
This error occurs when you attempt to push a commit to GitHub that contains a file exceeding 100 MB. GitHub enforces a strict limit of 100 MB per individual file in a repository.

The important thing to understand is that even if you delete the large file from your working directory, Git still stores it in your commit history. Every commit that ever contained the file must be rewritten to truly remove it, which is why simply deleting the file and committing the deletion doesn't fix the problem.

GitHub's 100 MB limit exists to protect repository performance and prevent abuse. Large binary files don't version well in Git anyway: each change is stored as a new full copy of the file, and binary data compresses and delta-encodes poorly. For files over 100 MB, GitHub recommends Git LFS, which stores large files on external storage while keeping lightweight pointer files in your repository.

Note: GitHub also displays a warning for files larger than 50 MB, though these can still be pushed. The hard limit is 100 MB; pushes containing larger files are blocked outright.
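The rejected push typically reports the offending file and its size. The output looks roughly like the following; exact wording varies by Git version, and the path and sizes here are placeholders:

```text
remote: error: GH001: Large files detected. You may want to try Git Large File Storage - https://git-lfs.github.com.
remote: error: File assets/video.mp4 is 156.00 MB; this exceeds GitHub's file size limit of 100.00 MB
! [remote rejected] main -> main (pre-receive hook declined)
error: failed to push some refs to 'https://github.com/username/repo.git'
```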
First, identify exactly which files are too large and where they appear in your history:
```bash
# List the 20 largest blobs in your repository history
git rev-list --objects --all | \
  git cat-file --batch-check='%(objecttype) %(objectname) %(objectsize) %(rest)' | \
  awk '/^blob/ {print substr($0,6)}' | \
  sort -rnk2 | \
  head -20

# Or use git-sizer for a comprehensive report
# Install: brew install git-sizer (macOS) or download from GitHub
git-sizer --verbose
```

Find which commits contain the large file:
```bash
# Find commits that modified the large file
git log --all --full-history -- "path/to/large_file.zip"

# See the size of a specific file
git ls-tree -r -l HEAD | grep large_file
```

Take note of the file path - you'll need it for the removal steps.
BFG Repo-Cleaner is 10-720x faster than git filter-branch and much simpler to use. It's the recommended tool for removing large files.
Step 1: Install BFG
```bash
# macOS
brew install bfg

# Or download the JAR file from https://rtyley.github.io/bfg-repo-cleaner/
# Requires Java 11 or higher
```

Step 2: Create a fresh clone (mirror)
```bash
# Clone a bare repository (required for BFG)
git clone --mirror https://github.com/username/repo.git repo.git
cd repo.git
```

Step 3: Run BFG to remove large files
```bash
# Remove all files larger than 100MB
bfg --strip-blobs-bigger-than 100M

# Or remove a specific file by name
bfg --delete-files large_file.zip

# Or remove files matching a pattern
bfg --delete-files "*.zip"
```

Step 4: Clean up and push
```bash
# Clean up the repository
git reflog expire --expire=now --all
git gc --prune=now --aggressive

# Push the cleaned history (requires force push)
git push --force
```

Important: BFG protects your current HEAD commit by default. If the large file is in your latest commit, first delete it and commit the deletion, then run BFG.
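For example, you might first commit the deletion in your normal working clone before re-running BFG; `path/to/large_file.zip` below is a placeholder:

```bash
# Remove the large file from the current commit but keep it on disk
git rm --cached path/to/large_file.zip
echo "path/to/large_file.zip" >> .gitignore
git add .gitignore
git commit -m "Remove large file from current commit"

# HEAD no longer contains the file, so BFG will scrub the old
# copies when you repeat Steps 2-4 on a fresh --mirror clone
```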
git filter-repo is the modern, officially recommended replacement for git filter-branch. It's fast and handles edge cases well.
Install git filter-repo:
```bash
# macOS
brew install git-filter-repo

# pip (any platform)
pip install git-filter-repo

# Ubuntu/Debian
apt install git-filter-repo
```

Remove the large file:
```bash
# Remove a specific file from all history
git filter-repo --invert-paths --path "path/to/large_file.zip"

# Remove multiple files
git filter-repo --invert-paths --path "file1.zip" --path "file2.tar.gz"

# Remove files matching a pattern
git filter-repo --invert-paths --path-glob "*.zip"

# Remove all files larger than 100MB
git filter-repo --strip-blobs-bigger-than 100M
```

After filtering, push the changes:
```bash
# Re-add the remote (filter-repo removes it for safety)
git remote add origin https://github.com/username/repo.git

# Force push all branches
git push origin --force --all
git push origin --force --tags
```

If the large file was added in the last few commits and hasn't been pushed yet, interactive rebase is simpler:
For a file added in the last commit:
```bash
# Undo the last commit, keeping changes staged
git reset --soft HEAD~1

# Remove the large file from staging
git rm --cached path/to/large_file.zip

# Add to .gitignore
echo "large_file.zip" >> .gitignore

# Re-commit without the large file
git add .gitignore
git commit -m "Your original commit message"
```

For a file added a few commits ago:
```bash
# Start interactive rebase (replace N with number of commits)
git rebase -i HEAD~N

# In the editor, change 'pick' to 'edit' for the commit that added the file
# Save and close

# Remove the large file
git rm --cached path/to/large_file.zip
git commit --amend --no-edit

# Continue the rebase
git rebase --continue
```

Warning: Don't use interactive rebase if you've already pushed these commits to a shared branch - use BFG or filter-repo instead.
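If you're unsure whether the commits have been pushed, a quick check (assuming your remote is named origin and the branch is main) is:

```bash
# List local commits that are not yet on the remote branch;
# rebasing is safe only if the commit that added the file appears here
git fetch origin
git log --oneline origin/main..HEAD
```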
If you need to version control large files, use Git Large File Storage (LFS). It stores large files on external storage while keeping pointer files in your repo.
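For context, an LFS pointer file is just a few lines of text committed in place of the real content; it looks roughly like this (the hash and size below are placeholders):

```text
version https://git-lfs.github.com/spec/v1
oid sha256:4d7a214614ab2935c943f9e0ff69d22eadbb8f32b1258daaa5e2ca24d17e2393
size 157286400
```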
Step 1: Install Git LFS
```bash
# macOS
brew install git-lfs

# Windows: included with Git for Windows

# Ubuntu/Debian
apt install git-lfs

# Initialize LFS for your user account (once per machine)
git lfs install
```

Step 2: Track large file patterns
```bash
# Track specific file types
git lfs track "*.zip"
git lfs track "*.psd"
git lfs track "*.mp4"

# Track a specific file
git lfs track "path/to/large_file.zip"

# This creates/updates .gitattributes
```

Step 3: Commit the .gitattributes file
```bash
git add .gitattributes
git commit -m "Configure Git LFS tracking"
```

Step 4: Migrate existing large files to LFS
```bash
# Import existing files into LFS (rewrites history)
git lfs migrate import --include="*.zip" --everything

# Force push the rewritten history
git push origin --force --all
```

Important: GitHub offers 1 GB of free LFS storage and 1 GB of bandwidth per month. Additional storage is available for purchase.
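To sanity-check the migration before pushing, list what LFS now manages; since `git show` prints the raw blob without applying filters, a migrated file should display as a small pointer rather than binary content (the path below is a placeholder):

```bash
# Files now managed by LFS
git lfs ls-files

# A migrated file should show pointer text, not binary data
git show HEAD:path/to/large_file.zip | head -3
```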
Set up safeguards to prevent accidentally committing large files again:
Update .gitignore:
```bash
# Add common large file patterns to .gitignore
cat >> .gitignore << 'EOF'
# Large files
*.zip
*.tar.gz
*.rar
*.7z
*.mp4
*.mov
*.psd
*.ai

# Build artifacts
dist/
build/
*.exe
*.dll
*.so
*.dylib

# Dependencies
node_modules/
vendor/

# Data files
*.csv
*.sql
*.sqlite
data/
EOF

git add .gitignore
git commit -m "Update .gitignore to exclude large files"
```

Set up a pre-commit hook:
```bash
# Create a pre-commit hook
cat > .git/hooks/pre-commit << 'EOF'
#!/bin/bash
# Prevent commits of files larger than 50MB
MAX_SIZE=$((50 * 1024 * 1024))  # 50MB in bytes

for file in $(git diff --cached --name-only); do
  if [ -f "$file" ]; then
    size=$(wc -c < "$file")
    if [ "$size" -gt "$MAX_SIZE" ]; then
      echo "Error: $file is larger than 50MB"
      echo "Consider using Git LFS for large files"
      exit 1
    fi
  fi
done
EOF

chmod +x .git/hooks/pre-commit
```

Use the pre-commit framework for team-wide enforcement:
```yaml
# .pre-commit-config.yaml
repos:
  - repo: https://github.com/pre-commit/pre-commit-hooks
    rev: v4.5.0
    hooks:
      - id: check-added-large-files
        args: ['--maxkb=50000']  # 50MB
```
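With that config committed, each developer activates the hooks once per clone; a typical setup (assuming Python and pip are available) is:

```bash
# Install the framework and activate the hooks for this repo
pip install pre-commit
pre-commit install

# Optionally check every existing file once
pre-commit run --all-files
```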
After removing the large file from history, you need to force push to update the remote:

```bash
# Verify the large file is removed
git rev-list --objects --all | \
  git cat-file --batch-check='%(objecttype) %(objectname) %(objectsize) %(rest)' | \
  awk '/^blob/ && $3 > 100000000 {print}'
# Should return empty if all large files are removed

# Force push to GitHub
git push origin main --force

# Push all branches if needed
git push origin --force --all

# Push tags
git push origin --force --tags
```

Important warnings about force pushing:
- Force pushing rewrites history - all collaborators must re-clone or rebase
- Never force push to a shared branch without coordinating with your team
- Consider using --force-with-lease for safety:
```bash
# Safer force push - fails if remote has new commits
git push origin main --force-with-lease
```

After force pushing, inform your team:
```bash
# Team members should run:
git fetch origin
git reset --hard origin/main

# Or re-clone the repository
git clone https://github.com/username/repo.git
```

### Why Deleting the File Doesn't Work
Git is a content-addressable filesystem: every version of every file ever committed remains in the object database until it becomes unreachable and is garbage collected. When you delete a file and commit, you're just adding a new commit that doesn't include that file. All previous commits still contain it.
This is why you need history-rewriting tools like BFG or git filter-repo. They create new commits with the same content minus the large file, then update all branch pointers to the new history.
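You can see this for yourself in a throwaway repository; a minimal demonstration using a small placeholder file:

```bash
# In a scratch repo: add a file, then delete it in a second commit
echo "pretend this is 150 MB" > big.bin
git add big.bin && git commit -m "Add big.bin"
git rm big.bin && git commit -m "Delete big.bin"

# The blob is gone from the working tree but still reachable in history
git rev-list --objects --all | grep big.bin
```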
### GitHub vs GitLab vs Bitbucket Limits
| Platform | File Size Limit | Push Size Limit | LFS Support |
|----------|-----------------|-----------------|-------------|
| GitHub | 100 MB | 2 GB | Yes (1 GB free) |
| GitLab | 100 MB* | None | Yes (10 GB free) |
| Bitbucket | 100 MB | 2 GB | Yes (1 GB free) |
*GitLab limits can be configured by admins for self-hosted instances.
### Git LFS Costs and Alternatives
GitHub LFS pricing:
- Free: 1 GB storage, 1 GB bandwidth/month
- Data packs: $5/month for 50 GB storage + 50 GB bandwidth
Alternatives to Git LFS:
- git-annex: More flexible, good for scientific data
- DVC (Data Version Control): Designed for ML/data science
- External storage: Store in S3/GCS and reference in README
### Recovering from a Failed Push
If you pushed some commits before hitting the limit:
```bash
# Find what commits made it to the remote
git log origin/main..HEAD

# You may need to coordinate with your team
# before rewriting shared history
```

### Working with Protected Branches
If the target branch is protected and doesn't allow force pushes:
1. Temporarily disable branch protection (Settings > Branches)
2. Force push the cleaned history
3. Re-enable branch protection
Or create a new branch, cherry-pick commits without the large file, and merge via PR.
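A rough sketch of that alternative (the branch name and commit hashes below are placeholders):

```bash
# Branch from the last commit before the large file was added
git checkout -b clean-history <commit-before-large-file>

# Re-apply each later commit except the one that added the file
git cherry-pick <good-commit-1> <good-commit-2>

# Push the new branch and open a pull request from it
git push origin clean-history
```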
### Large File Detection Before Commit
Use git hooks or CI checks to catch large files early:
```yaml
# GitHub Actions workflow to check file sizes
name: Check File Sizes
on: [push, pull_request]

jobs:
  check:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
        with:
          fetch-depth: 0
      - name: Check for large files
        run: |
          find . -type f -size +50M -not -path "./.git/*" | while read f; do
            echo "::error file=$f::File exceeds 50MB"
            exit 1
          done
```

### BFG vs git filter-repo vs git filter-branch
| Tool | Speed | Ease of Use | Maintenance |
|------|-------|-------------|-------------|
| BFG | Very fast | Very easy | Stable |
| git filter-repo | Fast | Easy | Active |
| git filter-branch | Slow | Complex | Deprecated |
Recommendation: Use BFG for simple file removal, git filter-repo for complex rewrites.
### Shallow Clones and Large Files
If you only need recent history, shallow clones can avoid downloading old large files:
```bash
# Clone only the last 100 commits
git clone --depth 100 https://github.com/username/repo.git

# Clone only a specific branch
git clone --single-branch --branch main https://github.com/username/repo.git
```

This won't help if the large file is in recent commits, but it can speed up cloning repositories where large files were removed from history long ago.