Git exhausts available memory when creating, compressing, or processing pack files during clone, fetch, push, or repack operations. This typically occurs with large repositories or on systems with limited RAM.
This error occurs when Git's pack-objects process attempts to allocate more memory than is available on your system. Git uses pack files to efficiently store and transfer repository data by compressing objects together. During operations like clone, fetch, push, repack, or garbage collection, Git needs to load objects into memory, calculate deltas (differences between similar objects), and compress them into pack files.

The memory exhaustion happens because Git tries to optimize compression by keeping a "window" of objects in memory simultaneously to find the best delta compression opportunities. With large repositories, many threads, or aggressive compression settings, this memory requirement can exceed what is available, causing malloc (memory allocation) to fail.

This is particularly common on 32-bit systems (limited to ~3GB of address space), Windows environments, systems with limited RAM, and repositories containing very large files or extensive history.
Set conservative memory limits for pack operations to prevent exhaustion:
```shell
git config --global pack.windowMemory "100m"
git config --global pack.packSizeLimit "100m"
git config --global pack.threads "1"
git config --global core.packedGitLimit "128m"
git config --global core.packedGitWindowSize "128m"
git config --global pack.deltaCacheSize "128m"
```

These settings limit memory per operation. Adjust the values based on available RAM (use lower values on systems with less than 4GB).
If cloning, limit history depth to reduce memory requirements:
```shell
git clone --depth 1 https://github.com/user/repo.git
```

Or specify a depth that includes the commits you need:

```shell
git clone --depth 50 https://github.com/user/repo.git
```

Later, if needed, fetch the full history:

```shell
git fetch --unshallow
```

If the error occurs during repack or gc operations, use memory-limiting flags:

```shell
git repack -a -d --depth=20 --window=200 --window-memory=100m --max-pack-size=100m
```

Or, even more conservatively:

```shell
git repack -a -d --depth=10 --window=50 --window-memory=50m --max-pack-size=50m
```

Adjust --window-memory based on available RAM.
If pushing fails with memory errors, increase the HTTP post buffer:
```shell
git config --global http.postBuffer 524288000
```

This sets a 500MB buffer. Adjust as needed.
If pushing many commits causes memory exhaustion, push in smaller batches:
```shell
# Push commits up to a specific commit
git push origin abc123:main

# Or push branches one at a time
git push origin feature-branch
```

For large pushes, consider splitting the work into multiple smaller pushes.
If you are running 32-bit Git on a 64-bit system, install the 64-bit build:
Windows:
Download from [git-scm.com](https://git-scm.com/download/win) and install the 64-bit version.
Linux:
```shell
# Check current architecture
git --version
file $(which git)

# Install the 64-bit version if on a 64-bit OS
sudo apt-get install git   # Debian/Ubuntu
sudo yum install git       # RHEL/CentOS
```

64-bit Git can address much more memory, eliminating many memory exhaustion issues.
If physical RAM is limited, configure swap space:
```shell
# Check current swap
free -h

# Create a 4GB swap file
sudo fallocate -l 4G /swapfile
sudo chmod 600 /swapfile
sudo mkswap /swapfile
sudo swapon /swapfile

# Make it permanent (add to /etc/fstab)
echo '/swapfile none swap sw 0 0' | sudo tee -a /etc/fstab
```

Swap allows Git to use disk space when RAM is exhausted, preventing crashes (at the cost of speed).
For a specific problematic repository, set limits in .git/config:
```shell
cd /path/to/repo
git config pack.windowMemory 10m
git config pack.packSizeLimit 20m
git config pack.threads 1
```

This overrides the global settings for just this repository.
Memory calculation formula: Git's actual memory usage during repack is approximately: (pack.deltaCacheSize + pack.windowMemory) × pack.threads. Default values are 256MB, unlimited, and number of CPU cores respectively, which can easily exceed available RAM on constrained systems.
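The formula above can be sanity-checked with a few lines of Python. The numbers below are illustrative assumptions, not measurements:

```python
def repack_memory_bytes(delta_cache_mb, window_memory_mb, threads):
    """Rough peak-memory estimate for repack:
    (pack.deltaCacheSize + pack.windowMemory) * pack.threads."""
    return (delta_cache_mb + window_memory_mb) * threads * 1024 * 1024

# Default deltaCacheSize (256 MiB) plus an assumed 1 GiB window,
# on an 8-core machine: (256 + 1024) * 8 = 10 GiB of peak usage.
print(repack_memory_bytes(256, 1024, 8) / 2**30)  # → 10.0

# The conservative settings above (128 MiB cache, 100 MiB window, 1 thread)
# keep the estimate to about 228 MiB.
print(repack_memory_bytes(128, 100, 1) / 2**20)  # → 228.0
```

This is why single-threaded, capped-window settings help so much on constrained machines: memory scales with the thread count.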
Why fetch fails when clone succeeds: Git clone typically uses regular pack files that the server already has prepared, which is memory-efficient. Git fetch, however, tries to send only what's missing by creating "thin" packs on-the-fly, which requires the server to load objects into memory, calculate deltas, and compress—a much more memory-intensive operation. This explains why clones work but subsequent fetches fail on the same server.
32-bit limitations: Even with 8GB+ of physical RAM, 32-bit processes can only address roughly 2-3GB due to address space limitations. This is why 32-bit Git on Windows frequently fails with large repositories regardless of how much RAM is installed.
Delta compression trade-offs: While disabling delta compression (setting pack.window to 0) eliminates memory issues, it dramatically increases repository size on disk and network transfer times. Only use this as a last resort.
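If you do reach that last resort, the setting plus a forced full repack might look like this (the -f flag recomputes existing deltas, so the repack itself is slow):

```shell
# Disable delta search for this repository only, then rewrite all packs.
# Expect a substantially larger .git directory afterwards.
git config pack.window 0
git repack -a -d -f
```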
Server-side considerations: If you control the Git server experiencing upload-pack memory exhaustion, consider configuring pack.windowMemory and pack.packSizeLimit in the server's Git configuration, or adding more RAM/swap to the server itself.
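As a sketch, on a server you administer the same limits can be applied inside the bare repository (the path below is a placeholder):

```shell
# Run on the Git server, inside the bare repository
cd /srv/git/repo.git
git config pack.windowMemory "100m"
git config pack.packSizeLimit "100m"
git config pack.threads "1"
```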
Large file handling: Repositories with files over 100MB should consider using Git LFS (Large File Storage) instead of tracking large binaries directly in Git, as this prevents them from being loaded into memory during pack operations.
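A typical move to Git LFS might look like the following. This assumes the separate git-lfs tooling is installed, and the `*.iso` pattern is only an example:

```shell
# One-time setup for this user account
git lfs install

# Track a large-binary pattern; this writes to .gitattributes
git lfs track "*.iso"
git add .gitattributes
git commit -m "Track large binaries with Git LFS"

# Files already committed in history need a migration,
# which rewrites history -- coordinate with collaborators first:
git lfs migrate import --include="*.iso"
```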