This warning appears when using git clone --filter=blob:limit=<size> with large binary files. It indicates that blobs exceeding the specified size limit are being excluded from the partial clone.
This warning occurs when you use Git's partial clone feature with a blob size filter (--filter=blob:limit=<size>). Partial clones let you download only a subset of repository objects, which is particularly useful for large repositories with many large binary files. When you specify a blob size limit, Git excludes any file contents (blobs) that exceed that size from the initial clone.

The warning is informational: Git is confirming that it is successfully filtering out large blobs as requested. The filtered blobs are downloaded on demand when you actually access those files through operations like checkout, blame, or diff.

Partial clone is part of Git's wire protocol v2 and requires server-side support from your Git hosting provider. The feature helps developers work with large repositories more efficiently by reducing initial clone time and disk usage.
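The behavior is easy to reproduce locally. The sketch below builds a throwaway repository, serves it over file:// (with the uploadpack options a hosting provider would normally enable for you), and shows a large blob being filtered at clone time and fetched on demand at checkout. File names, the 2 MiB payload, and the 1 MiB limit are arbitrary examples:

```shell
#!/bin/sh
# Sketch: blob-size filtering and on-demand download, entirely locally.
set -e
tmp=$(mktemp -d); cd "$tmp"

# Build a "server" repository containing one small and one large file
git init -q -b main src
cd src
echo 'hello' > small.txt
head -c 2097152 /dev/zero > big.bin     # 2 MiB, over the limit used below
git add .
git -c user.name=demo -c user.email=demo@example.com commit -qm 'add files'
git config uploadpack.allowFilter true          # allow --filter requests
git config uploadpack.allowAnySHA1InWant true   # allow on-demand blob fetches
cd ..

# Partial clone: blobs over 1 MiB stay on the server
git clone -q --no-checkout --filter=blob:limit=1m "file://$tmp/src" clone
cd clone
before=$(git rev-list --objects --missing=print HEAD | grep -c '^?' || true)
echo "missing blobs before checkout: $before"

# Accessing the file triggers an on-demand download of the filtered blob
git checkout -q HEAD -- .
after=$(git rev-list --objects --missing=print HEAD | grep -c '^?' || true)
echo "missing blobs after checkout: $after"
```

The --no-checkout flag keeps the initial checkout from immediately triggering the on-demand download, so the missing blob is visible before the first file access.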
This warning confirms Git is working as intended. If you specified a blob filter, this message indicates it's successfully excluding large files. No action is needed unless you encounter actual errors.
Check what filter you applied:
git config --local --get remote.origin.partialclonefilter
This will show your active filter, such as "blob:limit=1m" or "blob:none".
Confirm your repository is set up as a partial clone:
# Check if partial clone is enabled
git config --local --get remote.origin.promisor
# Output: true (if enabled)
# View your filter specification
git config --local --get remote.origin.partialclonefilter
# Example output: blob:limit=1048576
You can also check the .git/config file directly:
grep -A 5 'remote "origin"' .git/config
If you need to work with a filtered-out large file, Git will automatically download it when you access it. To manually fetch specific blobs:
# Checkout a file to trigger blob download
git checkout HEAD path/to/large-file.bin
# If the repository also uses Git LFS, LFS objects are fetched separately
git lfs fetch --all
For blob-filtered repositories without LFS:
# Download every missing blob needed for the current branch
git checkout HEAD -- .
If you later decide you need all repository objects, you can convert your partial clone to a full clone:
# Remove the partial clone filter
git config --local --unset remote.origin.partialclonefilter
# Fetch all objects, as a fresh clone would (Git 2.36 and later)
git fetch --refetch origin
# On older Git versions, trigger on-demand download of every missing object instead
git rev-list --objects --all --missing=print | sed -n 's/^?//p' | xargs -r -n1 git cat-file -p > /dev/null
# Once all objects are present, the promisor flag can be dropped
git config --local --unset remote.origin.promisor
Or simply clone the repository again without the filter:
git clone <repository-url> full-clone
If you're seeing this warning but didn't intend to filter blobs, check your global Git configuration:
# Look for any filter-related settings in your global config
git config --global --get-regexp 'filter'
# Remove an unwanted setting, for example:
git config --global --unset clone.filterSubmodules
Note that the partial clone filter itself is stored per repository (remote.origin.partialclonefilter), not globally. For new clones without filtering:
git clone <repository-url> # No filter flags
Protocol and Server Support
Partial clone filtering requires Git protocol v2, which is supported by GitHub, GitLab, Bitbucket, Azure DevOps, and Gitea (1.17+). Older Git servers or clients (pre-2.25) may not support this feature. If the server doesn't support filtering, Git will fall back to a full clone and may display "filtering not recognized by server, ignoring".
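One way to check what a server advertises is to trace the protocol exchange during a ls-remote. A sketch against a local stand-in repository; the same trace works against any remote URL:

```shell
#!/bin/sh
# Sketch: inspect the server's protocol v2 capability advertisement.
set -e
tmp=$(mktemp -d); cd "$tmp"
git init -q -b main srv
git -C srv config uploadpack.allowFilter true

# GIT_TRACE_PACKET logs the wire exchange, including server capabilities
trace=$(GIT_TRACE_PACKET=1 git -c protocol.version=2 ls-remote "file://$tmp/srv" 2>&1)
echo "$trace" | grep -q 'version 2' && echo "server speaks protocol v2"
echo "$trace" | grep -q 'filter' && echo "server advertises the filter capability"
```

If the second line is missing, the server will ignore --filter and fall back to a full clone, as described above.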
Filter Specifications
Different filter types serve different use cases:
- --filter=blob:none: Excludes all file contents (best for developers who need commit history)
- --filter=blob:limit=<size>: Excludes files larger than specified size (e.g., "1m", "10k")
- --filter=tree:0: Treeless clone (downloads commits only, fetches trees and blobs on-demand)
- --filter=combine:<filter1>+<filter2>: Combines multiple filters (e.g., combine:tree:3+blob:limit=1m)
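The practical difference between these filters shows up in which object types land in the local object store. A local sketch (repository contents are arbitrary):

```shell
#!/bin/sh
# Sketch: compare the local object types left behind by different filters.
set -e
tmp=$(mktemp -d); cd "$tmp"
git init -q -b main src
cd src
echo 'content' > file.txt
git add .
git -c user.name=demo -c user.email=demo@example.com commit -qm 'add file'
git config uploadpack.allowFilter true
cd ..

types() {  # clone with the given filter and list the local object types
  git clone -q --no-checkout $1 "file://$tmp/src" "$2"
  ( cd "$2" && git cat-file --batch-all-objects --batch-check='%(objecttype)' | sort -u | xargs )
}
full=$(types '' full)
noblob=$(types '--filter=blob:none' noblob)
treeless=$(types '--filter=tree:0' treeless)
printf 'full: %s\nblob:none: %s\ntree:0: %s\n' "$full" "$noblob" "$treeless"
```

A full clone holds blobs, commits, and trees; blob:none drops the blobs; tree:0 keeps only the commits.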
Performance Tradeoffs
While partial clones significantly reduce initial clone time and disk usage, they have tradeoffs:
- Single-blob operations like git blame become slower as each blob is fetched individually
- No delta compression across on-demand fetched objects
- Network-dependent performance for file access
- Build systems may need to prefetch required blobs
For CI/CD environments that don't modify code, --filter=blob:none with --depth=1 provides the fastest clone times.
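A sketch of that CI-style combination against a local stand-in remote (any hosted URL would behave the same):

```shell
#!/bin/sh
# Sketch: shallow + blob-filtered clone, as suggested for read-only CI jobs.
set -e
tmp=$(mktemp -d); cd "$tmp"
git init -q -b main src
cd src
git -c user.name=demo -c user.email=demo@example.com commit -q --allow-empty -m 'one'
git -c user.name=demo -c user.email=demo@example.com commit -q --allow-empty -m 'two'
git config uploadpack.allowFilter true
cd ..

# Depth 1 + blob:none: only the tip commit, no file contents up front
git clone -q --depth=1 --filter=blob:none "file://$tmp/src" ci
cd ci
shallow=$(git rev-parse --is-shallow-repository)
count=$(git rev-list --count HEAD)
echo "shallow: $shallow, commits fetched: $count"
```

Even though the source history has two commits, the CI clone transfers only the single tip commit and no blobs it does not need.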
Git LFS vs Partial Clone
Git LFS (Large File Storage) and partial clone solve similar problems but work differently:
- LFS requires explicit file tracking and server-side LFS storage
- Partial clone works with any repository without configuration changes
- LFS is better for repositories with many large binaries
- Partial clone is simpler and works with standard Git servers
Many teams use both: LFS for tracked large assets and partial clone for historical objects.