Git push failed because the data being sent exceeds the server's maximum allowed request size. This typically happens when pushing large files, many commits at once, or repositories with substantial history over HTTP/HTTPS.
When you see "error: RPC failed; HTTP 413 Request Entity Too Large", Git is telling you that the HTTP server (GitHub, GitLab, Bitbucket, or your self-hosted Git server) rejected your push because the request body exceeded its configured size limit.

Git uses HTTP POST requests for push operations over HTTPS. When you push commits, Git packs the objects (commits, trees, blobs) and sends them to the server. If this pack exceeds the server's `client_max_body_size` (nginx), `LimitRequestBody` (Apache), or equivalent setting, the server responds with HTTP 413. This error is distinct from GitHub's file size limits - it's about the total request payload, not individual files.

You might hit this limit when:

- Pushing a repository with many commits for the first time
- Pushing large binary files
- Pushing after a long period without syncing
- Using a self-hosted Git server with restrictive defaults
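Before changing anything, it can help to estimate how much data the push will actually send. A rough sketch (it assumes the upstream branch is origin/main, and prints the approximate pack size in bytes):

# Build the same pack a push would upload and measure it
git rev-list --objects origin/main..HEAD | git pack-objects --stdout 2>/dev/null | wc -c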
Git has a client-side buffer that controls how much data it sends in one HTTP request. Increase it to allow larger pushes:
# Increase to 500 MB (should handle most cases)
git config --global http.postBuffer 524288000
# For very large repositories, try 1 GB
git config --global http.postBuffer 1048576000

This tells Git to buffer more data before sending, which can help with chunked transfer issues. However, this alone won't fix server-side limits - it just optimizes how Git sends the data.
Note: Setting this too high can cause memory issues on systems with limited RAM.
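You can confirm the value took effect, or scope it to a single repository (drop --global) to limit the memory cost:

# Verify the current value
git config --get http.postBuffer
# Per-repository alternative
git config http.postBuffer 524288000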
If you have many commits to push, break them into smaller batches:
# Check how many commits need pushing
git log origin/main..HEAD --oneline | wc -l
# Push commits in batches (e.g., 50 at a time)
git log origin/main..HEAD --oneline --reverse | head -50 | tail -1 | awk '{print $1}' | xargs -I {} git push origin {}:main
# Or manually push up to a specific commit
git push origin <commit-sha>:main

Simpler approach - push incrementally:
# Push first 100 commits
git rev-list --reverse origin/main..HEAD | head -100 | tail -1 | xargs -I {} git push origin {}:main
# Repeat until all commits are pushed
git push origin main
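To avoid repeating this by hand, a minimal loop (a sketch; it assumes the remote branch is main and pushes 100 commits per batch):

# Push in batches of 100 until origin/main catches up to HEAD
while sha=$(git rev-list --reverse origin/main..HEAD | head -100 | tail -1); [ -n "$sha" ]; do
  git push origin "$sha:main" || break
done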
If your repository contains large binary files, migrate them to Git Large File Storage:

# Install Git LFS
git lfs install
# Track large file types
git lfs track "*.psd"
git lfs track "*.zip"
git lfs track "*.mp4"
git lfs track "*.bin"
# Add the .gitattributes file
git add .gitattributes
git commit -m "Configure Git LFS tracking"
# Migrate existing large files (rewrites history!)
git lfs migrate import --include="*.psd,*.zip" --everything

Warning: git lfs migrate import rewrites Git history. Coordinate with your team before running this on shared branches.
Git LFS stores large files separately, so your push payload only contains pointers instead of the actual file content.
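After migrating, it's worth confirming which files LFS now manages before pushing the rewritten history (this assumes the branch is main):

# List files now stored as LFS pointers
git lfs ls-files
# Force-push the rewritten branch (coordinate with your team first)
git push --force-with-lease origin main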
SSH connections don't have the same HTTP body size limitations:
# Check your current remote URL
git remote -v
# Change from HTTPS to SSH
git remote set-url origin [email protected]:username/repo.git
# For GitLab
git remote set-url origin [email protected]:username/repo.git
# For Bitbucket
git remote set-url origin [email protected]:username/repo.git

Ensure you have SSH keys configured:
# Generate SSH key if needed
ssh-keygen -t ed25519 -C "[email protected]"
# Add to ssh-agent
eval "$(ssh-agent -s)"
ssh-add ~/.ssh/id_ed25519
# Copy public key to add to GitHub/GitLab
cat ~/.ssh/id_ed25519.pub
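Once the public key is added to your provider, a quick authentication check (GitHub shown; substitute your host):

# Prints a greeting on success instead of "Permission denied"
ssh -T [email protected]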
If you control the Git server, increase the HTTP body size limits:

Nginx (commonly used with GitLab, Gitea):
# In /etc/nginx/nginx.conf or site config
http {
client_max_body_size 500M;
}
# Or in a location block
location / {
client_max_body_size 500M;
}Apache:
Apache:

# In httpd.conf or .htaccess
LimitRequestBody 524288000
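As with nginx, test and restart Apache after the change (the service is apache2 on Debian/Ubuntu, httpd on RHEL-based systems):

# Validate syntax, then restart
sudo apachectl configtest && sudo systemctl restart apache2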
GitLab (Omnibus):

# In /etc/gitlab/gitlab.rb
nginx['client_max_body_size'] = '500m'

Then run sudo gitlab-ctl reconfigure.
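To confirm the new limit was rendered into the nginx config GitLab actually uses:

# Should show the updated client_max_body_size
sudo grep client_max_body_size /var/opt/gitlab/nginx/conf/gitlab-http.conf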
Gitea:
# In app.ini
[server]
LFS_MAX_FILE_SIZE = 500000000

Note that if Gitea sits behind a reverse proxy such as nginx, the client_max_body_size setting shown above must be raised as well.
For initial repository imports with extensive history, consider a shallow approach:

# Clone with limited history
git clone --depth 1 https://source-repo.git temp-repo
cd temp-repo
# Add new remote
git remote add new-origin [email protected]:username/new-repo.git
# Push (smaller payload without full history)
git push new-origin main

If you need the full history but it's too large:
# Fetch history incrementally
git fetch --unshallow
# Or fetch specific depth
git fetch --depth=100
git fetch --depth=500
git fetch --depth=1000
# Continue until full history is fetched
Corporate networks often have proxies that limit request sizes:

# Check if Git is using a proxy
git config --get http.proxy
git config --get https.proxy
# Check environment variables
echo $HTTP_PROXY
echo $HTTPS_PROXY

If you're behind a corporate proxy:
1. Contact IT to increase the proxy's request size limit
2. Use SSH instead (often not proxied)
3. Request proxy bypass for your Git hosting provider:
# Bypass proxy for specific hosts
git config --global http.https://github.com.proxy ""
git config --global http.https://gitlab.com.proxy ""

4. Use Git over port 443 (often allowed through firewalls):
# GitHub supports SSH over HTTPS port
git remote set-url origin ssh://[email protected]:443/username/repo.git
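You can test SSH-over-443 connectivity before switching the remote:

# Should print the usual GitHub greeting
ssh -T -p 443 [email protected]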
If large files were accidentally committed, remove them from history:

Using git-filter-repo (recommended):
# Install git-filter-repo
pip install git-filter-repo
# Remove specific large files from all history
git filter-repo --path large-file.zip --invert-paths
# Remove files over a certain size (e.g., 50MB)
git filter-repo --strip-blobs-bigger-than 50M
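Note that git filter-repo removes the origin remote as a safety measure, so re-add it before force-pushing (the URL below is a placeholder):

# Restore the remote and push the rewritten history
git remote add origin [email protected]:username/repo.git
git push origin --all --force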
Using BFG Repo Cleaner:

# Download BFG
wget https://repo1.maven.org/maven2/com/madgag/bfg/1.14.0/bfg-1.14.0.jar
# Remove files larger than 100MB
java -jar bfg-1.14.0.jar --strip-blobs-bigger-than 100M
# Clean up
git reflog expire --expire=now --all
git gc --prune=now --aggressive

Warning: Both methods rewrite Git history. Force push will be required, and collaborators must re-clone.
Understanding the error chain: The "RPC failed" message indicates Git's HTTP-based remote procedure call failed. The underlying HTTP 413 is returned by a web server (nginx, Apache, etc.) before Git's server-side code even sees the request.
Diagnosing the exact limit:
# See verbose HTTP output
GIT_CURL_VERBOSE=1 git push origin main 2>&1 | grep -i "content-length\|413"
# Check approximate pack size before pushing
git count-objects -vH

Multiple layers of limits: A push might pass through several servers, each with its own limit:
1. Your corporate proxy
2. CDN (Cloudflare, Fastly)
3. Load balancer
4. Web server (nginx/Apache)
5. Git server application
GitHub-specific notes: GitHub has a 2GB per-push limit via HTTPS. For repositories approaching this, use SSH or push in batches. The http.postBuffer setting helps but won't override server limits.
GitLab-specific notes: Self-hosted GitLab often has nginx configured with a 250MB default. Check /var/opt/gitlab/nginx/conf/gitlab-http.conf for the actual client_max_body_size value.
Alternative: Use Git bundle for very large transfers:
# Create a bundle file
git bundle create repo.bundle --all
# Transfer the bundle file separately (scp, cloud storage, etc.)
# On the receiving end
git clone repo.bundle repo-name
git remote set-url origin <actual-remote-url>
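Either side can validate the bundle before relying on it:

# Check that the bundle is self-contained and its prerequisites are met
git bundle verify repo.bundle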
Compression can help:

# Increase compression (slower but smaller payload)
git config --global pack.compression 9
git repack -a -d