Git warns about excessive loose objects when your repository has accumulated too many unpacked object files, typically over 6,700. This triggers automatic garbage collection to optimize storage and performance.
This warning appears when Git detects an unusually high number of "loose objects" in your repository. Loose objects are individual files stored in the .git/objects directory that haven't been compressed into packfiles yet; each commit, tree, and blob you create starts out as a loose object. Git monitors the .git/objects/17/ subdirectory as a sample, and when it estimates there are more than 6,700 loose objects in total (roughly 27-28 objects in the 17/ directory), it triggers this warning.
A high count typically indicates that automatic garbage collection hasn't run recently, or that you've been performing operations that create objects faster than Git cleans them up. The warning itself is informative rather than critical, but it signals that your repository's storage is inefficient and performance may degrade. Git will automatically attempt to clean up by running garbage collection, but in some cases manual intervention helps ensure optimal repository health.
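To make this concrete, here is a small sketch (assuming at least one loose object exists in your repository) that picks a loose object file and asks Git what it is; the object ID is simply the two-character directory name joined with the filename:
# Pick one loose object file (skip the pack/ and info/ subdirectories)
obj=$(find .git/objects -type f ! -path "*/pack/*" ! -path "*/info/*" | head -1)
# Rebuild its object ID from the path: .git/objects/ab/cdef... -> abcdef...
oid=$(echo "$obj" | sed 's|.*objects/||; s|/||')
# Ask Git for the object's type (commit, tree, blob, or tag) and size in bytes
git cat-file -t "$oid"
git cat-file -s "$oid"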
First, verify how many loose objects you actually have:
# Count loose objects across all subdirectories
find .git/objects -type f | wc -l
# Check objects in the sample directory Git monitors
ls -1 .git/objects/17/ | wc -l
If you see thousands of files, the warning is justified.
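Git also ships a built-in counter that reports the same information without walking the directory tree yourself:
# "count" = loose objects, "size" = their disk usage in KiB,
# "in-pack" = objects already stored in packfiles
git count-objects -v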
The simplest fix is to run Git's garbage collection manually:
# Standard garbage collection
git gc
# Or let Git decide whether cleanup is needed (no-op unless thresholds are exceeded)
git gc --auto
This packs loose objects into packfiles and removes unreachable objects older than two weeks.
If you want to remove unreachable objects regardless of age:
# Prune all unreachable loose objects now
git prune
# Or combine with gc
git gc --prune=now
Warning: Only use --prune=now if you're certain no other processes are accessing the repository, as it can cause corruption in concurrent access scenarios.
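If you want to see what would be deleted before committing to it, git prune has a dry-run mode; this is a cautious extra step rather than a required one:
# List the unreachable objects prune would delete, without deleting anything
git prune --dry-run --verbose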
After cleanup, check that your repository is healthy:
# Full repository integrity check
git fsck --full
# Count remaining loose objects
find .git/objects -type f | wc -l
You should see significantly fewer loose objects and no corruption errors.
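Some unreachable objects may legitimately survive gc because they are still inside the grace period; you can list them explicitly:
# Show unreachable objects that were kept (still within gc.pruneExpire)
git fsck --unreachable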
Ensure automatic garbage collection is properly configured:
# Check current gc.auto setting (should be 6700 by default)
git config --get gc.auto
# If it's 0 (disabled), re-enable it
git config gc.auto 6700
# Check pack limit (default is 50)
git config --get gc.autoPackLimit
git config gc.autoPackLimit 50
This helps prevent the issue from recurring.
Threshold Calculation: Git doesn't count all loose objects directly. Instead, it samples the .git/objects/17/ directory and extrapolates. If that directory has more than (gc.auto + 255) / 256 objects (default ~27), it triggers gc. This optimization avoids scanning all 256 subdirectories.
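A quick way to reproduce that estimate for your own gc.auto setting (the fallback of 6700 below is the documented default):
# Per-directory sample threshold: ceil(gc.auto / 256)
auto=$(git config --get gc.auto || echo 6700)
echo $(( (auto + 255) / 256 ))    # 27 for the default of 6700
# The sample directory Git actually inspects
ls -1 .git/objects/17/ 2>/dev/null | wc -l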
Cruft Packs: Modern Git versions (2.37+) use "cruft packs" to store unreachable objects efficiently instead of leaving them as loose files. If you're seeing this warning frequently, consider upgrading Git and using git gc --cruft.
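A minimal sketch, assuming Git 2.37 or newer:
# One-off run: store unreachable objects in a cruft pack instead of loose files
git gc --cruft
# Opt in for future gc runs as well
git config gc.cruftPacks true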
Large Repositories: For very large repos or those with frequent history rewrites, consider using git gc --aggressive occasionally (every few hundred changesets). This takes longer but provides better compression. Don't use it routinely as it's computationally expensive.
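If you do reach for it, the related knobs look like this; the defaults noted in the comments reflect current git-config documentation and may differ in older versions:
# Occasional deep repack; expect this to take far longer than a normal gc
git gc --aggressive
# Compression effort used by --aggressive (defaults: window 250, depth 50)
git config --get gc.aggressiveWindow
git config --get gc.aggressiveDepth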
Concurrent Access: In shared environments (CI servers, NFS-mounted repos), loose object accumulation is common because git gc --auto may fail when other processes hold locks. Consider scheduling periodic manual git gc runs during maintenance windows.
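One way to do that is a plain cron entry; the repository path and schedule below are placeholders for illustration:
# m h dom mon dow  command
# Every Sunday at 03:00, pack loose objects in a shared bare repository
0 3 * * 0  git -C /srv/git/project.git gc --quiet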
Configuration Tuning: For high-velocity repositories, you might increase gc.auto to reduce gc frequency:
# Increase threshold to 10,000 loose objects
git config gc.auto 10000
However, this trades memory/disk efficiency for fewer gc interruptions.
Prune Expiration: The default 2-week grace period (gc.pruneExpire) protects recent work from being deleted. You can adjust it:
# Set prune expiration to 1 week
git config gc.pruneExpire "1 week ago"
git svn Users: When doing large git svn clone or git svn fetch operations, loose objects accumulate rapidly. Consider running git gc periodically during the operation or using --gc-threshold if available in your Git version.
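A hedged sketch of that approach, with an illustrative revision range rather than one from a real repository:
# Import older history first, pack it, then continue with the rest
git svn fetch -r 1:50000     # revision range is illustrative
git gc --auto                # packs loose objects if the threshold was crossed
git svn fetch                # fetch the remaining revisions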