Git fails to compress objects during pack generation, typically when pushing large repositories or running garbage collection. This memory-intensive operation can fail due to insufficient resources or corrupt objects.
Delta compression is Git's method of storing objects efficiently by computing differences (deltas) between similar files instead of storing complete copies. When this compression fails, it usually indicates one of three problems: insufficient memory for the compression algorithm, corrupt repository objects preventing delta calculation, or system resource limits that prevent Git from completing the operation.

The delta compression process is especially memory-intensive during push operations and garbage collection (`git gc`) because Git must hold multiple objects in memory simultaneously to calculate optimal deltas. The memory requirements scale with repository size and are multiplied by the number of threads configured for pack operations.

This error often appears as warnings during normal operations but can escalate to fatal errors that prevent pushing to remote repositories or completing repository maintenance tasks. Understanding the underlying resource constraints is key to resolving these issues.
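To see how a repository is currently packed before changing anything, you can inspect its object database directly (a diagnostic sketch; the pack filename glob assumes the repository has at least one pack, e.g. after a `git gc`):

```shell
# Summarize loose vs. packed object counts and sizes (in KiB)
git count-objects -v

# List objects in a pack; deltified entries show a chain depth and
# the object they are stored as a delta against
git verify-pack -v .git/objects/pack/pack-*.idx | head -20
```

A large `size-pack` value combined with long delta chains suggests the compression step genuinely has a lot of work to do, which makes the memory limits below more relevant.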
Before adjusting memory settings, verify your repository isn't corrupted:
```
git fsck --full --no-dangling
```

If this reports errors, you may need to repair the repository before proceeding. A clean fsck output indicates the issue is resource-related rather than corruption.
Configure Git to use less memory during delta compression:
```
git config --global pack.windowMemory "256m"
git config --global pack.packSizeLimit "256m"
git config --global pack.threads 1
git config --global pack.deltaCacheSize "128m"
```

These settings limit memory consumption by:

- `pack.windowMemory`: caps delta-window memory at 256MB per thread
- `pack.packSizeLimit`: splits output into packfiles of at most 256MB each
- `pack.threads`: reduces packing to a single thread
- `pack.deltaCacheSize`: limits the delta cache to 128MB
To limit the settings to a single repository, run the same `git config` commands without `--global` from inside that repository.
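For example, the repo-local variant can be sketched as follows (run inside the repository; these write to `.git/config` rather than `~/.gitconfig`):

```shell
# Limit pack memory for this repository only
git config pack.windowMemory "256m"
git config pack.threads 1

# Confirm what is in effect; local values override global ones
git config --get pack.windowMemory   # prints 256m
```

This is useful when only one oversized repository hits the error and you don't want to slow down packing everywhere else.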
Manually repack the repository with controlled memory usage:
```
git gc --aggressive --prune=now
```

If this still fails, try a more conservative repack:

```
git repack -a -d -f --depth=50 --window=50
```

The lower `--depth` and `--window` values reduce memory requirements at the cost of slightly less optimal compression.
If the error occurs during push operations, increase the post buffer:
```
git config --global http.postBuffer 524288000
```

This sets the buffer to 500MB, helping with large pack transfers over HTTP. Adjust the value based on your typical commit sizes.
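The value is given in bytes, so it is easy to derive for other sizes (500 MiB shown here):

```shell
# 500 MiB expressed in bytes for http.postBuffer
echo $(( 500 * 1024 * 1024 ))   # prints 524288000
```

A repository whose largest pushes are around 1GB would use `$(( 1024 * 1024 * 1024 ))` instead.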
For repositories with many unpushed commits, push branches individually:
```
# Push current branch only
git push origin HEAD

# Or push specific branches
git push origin feature-branch-1
git push origin feature-branch-2
```

Avoid `git push --all` when experiencing compression issues.
If specific file types cause compression issues, disable delta compression in .gitattributes:
```
# Add to .gitattributes in repository root
*.zip -delta
*.jar -delta
*.png -delta
*.jpg -delta
*.mp4 -delta
```

This prevents Git from attempting delta compression on binary files that don't compress well. Commit this file and run `git gc` again.
Understanding Git's memory calculation: Peak memory during pack operations is roughly pack.windowMemory × pack.threads + pack.deltaCacheSize, since `pack.windowMemory` is a per-thread cap while the delta cache is shared. By default `pack.windowMemory` is 0 (unlimited) and `pack.threads` matches the CPU count, defaults that assume abundant memory. If running in constrained environments (Docker containers, CI runners, VMs), you must explicitly configure these limits.
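The estimate can be sketched with illustrative numbers (assuming `pack.windowMemory` is a per-thread cap and `pack.deltaCacheSize` a single shared cache):

```shell
# Rough peak-memory estimate for pack operations, in MB
# (illustrative values matching the settings suggested earlier)
window_memory_mb=256
threads=4
delta_cache_mb=128

echo $(( window_memory_mb * threads + delta_cache_mb ))   # prints 1152
```

Dropping `threads` to 1 in the same calculation yields 384MB, which shows why single-threaded packing is the usual first fix on constrained hosts.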
Threading trade-offs: While multiple threads speed up compression, they multiply memory requirements and split the working set, potentially reducing delta quality. For memory-constrained systems, `pack.threads=1` with a higher `pack.windowMemory` often performs better than multiple threads with lower memory per thread.
Kernel limits: On Linux systems, memory limits enforced outside Git, such as cgroup limits in containers or `ulimit -v` caps on the shell, can stop pack operations well before physical memory runs out. The kernel's OOM killer may also terminate Git processes that exceed configured limits even when free memory appears available.
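One hedged way to check whether the kernel intervened after a failed operation (Linux only; reading `dmesg` may require root on hardened systems):

```shell
# Look for OOM-killer activity around the time git failed
dmesg 2>/dev/null | grep -iE 'out of memory|oom-killer|killed process' | tail -5

# Check the kernel's overcommit policy (0, 1, or 2)
cat /proc/sys/vm/overcommit_memory
```

If these show Git being killed, lowering the pack settings above is more effective than raising system limits.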
Large file considerations: Objects larger than core.bigFileThreshold (default 512MB) skip delta compression automatically. If you regularly commit files near this threshold, consider lowering it to 256m or 128m to avoid compression attempts on files unlikely to delta well.
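Lowering the threshold is a one-line config change (a sketch; `core.bigFileThreshold` accepts unit suffixes such as `k`, `m`, and `g`):

```shell
# Skip delta compression for any object larger than 256 MiB
git config --global core.bigFileThreshold 256m

# Verify the setting took effect
git config --global --get core.bigFileThreshold   # prints 256m
```

Drop `--global` to apply it to a single repository instead.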
Remote repository errors: If the remote reports "unresolved deltas" or "bad object" errors even though local operations succeed, the issue may be network-related. Use git push --verbose to identify where the transfer fails and consider using SSH instead of HTTPS for more reliable large transfers.
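Switching a remote from HTTPS to SSH can be sketched as follows (the URL is hypothetical; substitute your own host and repository path):

```shell
# Inspect the current remote URLs
git remote -v

# Point origin at the SSH endpoint instead of HTTPS
git remote set-url origin git@example.com:team/project.git
```

Then retry with `git push --verbose origin HEAD` to see exactly where the transfer fails.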