This error occurs when Elasticsearch cannot parse a date value according to the field's mapping configuration. The date format in your document doesn't match the expected format defined in the index mapping, causing parsing failures during indexing or updates.
The "MapperParsingException: failed to parse field [field] of type [date]" error indicates that Elasticsearch encountered a date value it could not interpret according to the field's mapping configuration. Date fields in Elasticsearch require specific formatting, and when a document contains a date string that doesn't match the expected pattern, this parsing exception occurs. This error typically happens when:

1. Your document contains date strings in a format different from the mapping's format
2. The date field mapping doesn't specify a format, and Elasticsearch falls back to default formats that don't match your data
3. You're sending numeric timestamps when the mapping expects string dates (or vice versa)
4. The date string contains invalid characters or is malformed
5. Timezone information is missing or incorrectly formatted

Elasticsearch date fields are strict about format validation. By default, a document whose date field fails to parse is rejected outright; only if the field is mapped with "ignore_malformed": true is the bad value skipped while the rest of the document is indexed.
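The mismatch is easy to reproduce outside Elasticsearch. A small Python sketch (the `parses` helper and the pattern are illustrative; Elasticsearch itself uses Java-style patterns such as "yyyy-MM-dd HH:mm:ss") shows why only one of the three values below would index cleanly:

```python
from datetime import datetime

# Roughly analogous to a mapping format of "yyyy-MM-dd HH:mm:ss"
mapping_format = "%Y-%m-%d %H:%M:%S"

def parses(value, fmt):
    # Mirrors Elasticsearch's behavior: a value either matches the
    # mapped format or the whole parse fails
    try:
        datetime.strptime(value, fmt)
        return True
    except ValueError:
        return False

print(parses("2024-01-01 12:00:00", mapping_format))  # True: matches the format
print(parses("2024/01/01 12:00:00", mapping_format))  # False: slashes instead of dashes
print(parses("1704110400000", mapping_format))        # False: epoch millis, not a string date
```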
First, examine how the date field is currently mapped in your index:
# Get the mapping for a specific index
curl -X GET "localhost:9200/my-index/_mapping" -u "username:password"
# Get mapping for a specific field
curl -X GET "localhost:9200/my-index/_mapping/field/timestamp" -u "username:password"
# Example response showing date field mapping:
# {
#   "my-index": {
#     "mappings": {
#       "properties": {
#         "timestamp": {
#           "type": "date",
#           "format": "yyyy-MM-dd HH:mm:ss||yyyy-MM-dd||epoch_millis"
#         }
#       }
#     }
#   }
# }

Look for the "format" property in the date field mapping. If no format is specified, Elasticsearch falls back to the default "strict_date_optional_time||epoch_millis", which may not match your data.
If the current format doesn't match your data, update the mapping. Note: you cannot change the type or format of an existing field; instead, add a new field or create a new index with the correct mapping and reindex:
# Create a new index with correct date format mapping
curl -X PUT "localhost:9200/my-index-v2" -u "username:password" -H 'Content-Type: application/json' -d'
{
  "mappings": {
    "properties": {
      "timestamp": {
        "type": "date",
        "format": "strict_date_optional_time||epoch_millis"
      },
      "created_at": {
        "type": "date",
        "format": "yyyy-MM-dd HH:mm:ss"
      }
    }
  }
}
'
# Common date formats in Elasticsearch:
# - "strict_date_optional_time": ISO8601 with optional time (part of the default "strict_date_optional_time||epoch_millis")
# - "epoch_millis": Unix timestamp in milliseconds
# - "yyyy-MM-dd HH:mm:ss": Custom format
# - "yyyy/MM/dd HH:mm:ss||yyyy-MM-dd||epoch_millis": Multiple formats
# Reindex data from old index to new index
curl -X POST "localhost:9200/_reindex" -u "username:password" -H 'Content-Type: application/json' -d'
{
  "source": {
    "index": "my-index"
  },
  "dest": {
    "index": "my-index-v2"
  }
}
'

Use multiple formats separated by "||" to accept different date representations.
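As a mental model, a "||"-separated format list behaves like a try-each-in-order parser. A hedged Python sketch (the strptime patterns only approximate the Java-style patterns above, and `parse_multi` is not an Elasticsearch API):

```python
from datetime import datetime, timezone

# Approximate Python equivalents of the Java-style patterns in the mapping
FORMATS = [
    "%Y-%m-%dT%H:%M:%S",   # ~ strict_date_optional_time (no fraction/offset)
    "%Y-%m-%d %H:%M:%S",   # ~ yyyy-MM-dd HH:mm:ss
    "%Y-%m-%d",            # ~ yyyy-MM-dd
]

def parse_multi(value):
    # All-digit strings are treated as epoch milliseconds, like epoch_millis
    if value.isdigit():
        return datetime.fromtimestamp(int(value) / 1000, tz=timezone.utc)
    # Otherwise try each format left to right and accept the first match
    for fmt in FORMATS:
        try:
            return datetime.strptime(value, fmt)
        except ValueError:
            continue
    raise ValueError(f"no format matched: {value!r}")

print(parse_multi("2024-01-01"))        # date-only pattern matches
print(parse_multi("1704067200000"))     # parsed as epoch millis (2024-01-01 UTC)
```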
If you cannot change the mapping, transform your date data to match the existing format:
# Example: Convert dates to ISO8601 format using jq
cat documents.json | jq '.timestamp = (.timestamp | strptime("%Y/%m/%d %H:%M:%S") | strftime("%Y-%m-%dT%H:%M:%SZ"))' > transformed.json
# Or convert epoch milliseconds to ISO format
cat documents.json | jq '.timestamp = (.timestamp | tonumber / 1000 | floor | todate)' > transformed.json

# Python example: Transform dates before indexing
import json
from datetime import datetime, timezone

def transform_dates(document):
    # Convert various date formats to ISO8601
    if 'timestamp' in document:
        date_str = str(document['timestamp'])
        try:
            # Try parsing as ISO format
            dt = datetime.fromisoformat(date_str.replace('Z', '+00:00'))
        except ValueError:
            try:
                # Try parsing as custom format
                dt = datetime.strptime(date_str, '%Y/%m/%d %H:%M:%S')
            except ValueError:
                try:
                    # Try parsing as epoch milliseconds
                    dt = datetime.fromtimestamp(int(date_str) / 1000, tz=timezone.utc)
                except ValueError:
                    # Use the current UTC time as a fallback
                    dt = datetime.now(timezone.utc)
        if dt.tzinfo is not None:
            dt = dt.astimezone(timezone.utc)
        document['timestamp'] = dt.strftime('%Y-%m-%dT%H:%M:%SZ')
    return document

# Process documents
with open('documents.json', 'r') as f:
    documents = json.load(f)

transformed = [transform_dates(doc) for doc in documents]

with open('transformed.json', 'w') as f:
    json.dump(transformed, f)

For flexible date handling, use dynamic date detection or runtime fields:
# Enable dynamic date detection (these options live at the top level of "mappings",
# not under index settings)
curl -X PUT "localhost:9200/my-index" -u "username:password" -H 'Content-Type: application/json' -d'
{
  "mappings": {
    "date_detection": true,
    "dynamic_date_formats": ["yyyy-MM-dd HH:mm:ss", "yyyy/MM/dd"]
  }
}
'
# Use runtime fields for on-the-fly date parsing
# (assumes "timestamp" was indexed as a keyword field; a date runtime field
# cannot emit null, so documents that fail to parse simply emit nothing)
curl -X GET "localhost:9200/my-index/_search" -u "username:password" -H 'Content-Type: application/json' -d'
{
  "runtime_mappings": {
    "parsed_timestamp": {
      "type": "date",
      "script": {
        "source": "if (doc[\"timestamp\"].size() != 0) { try { def formatter = DateTimeFormatter.ofPattern(\"yyyy-MM-dd HH:mm:ss\"); emit(LocalDateTime.parse(doc[\"timestamp\"].value, formatter).toInstant(ZoneOffset.UTC).toEpochMilli()); } catch (Exception e) {} }"
      }
    }
  },
  "query": {
    "range": {
      "parsed_timestamp": {
        "gte": "now-7d/d"
      }
    }
  }
}
'

Runtime fields allow querying on dates without changing the underlying data.
Add date validation before sending data to Elasticsearch:
# Python example: Date validation
import time
from datetime import datetime, timezone

def validate_date_for_elasticsearch(date_value):
    if date_value is None:
        return False
    if isinstance(date_value, (int, float)):
        # Accept epoch-millis timestamps between 1970 and ten years from now
        now_millis = time.time() * 1000
        return 0 < date_value < now_millis + (1000 * 60 * 60 * 24 * 365 * 10)
    if isinstance(date_value, str):
        try:
            # Try ISO format
            datetime.fromisoformat(date_value.replace('Z', '+00:00'))
            return True
        except ValueError:
            # Try common formats
            formats = [
                '%Y-%m-%d %H:%M:%S',
                '%Y/%m/%d %H:%M:%S',
                '%Y-%m-%dT%H:%M:%SZ',
                '%Y-%m-%d'
            ]
            for fmt in formats:
                try:
                    datetime.strptime(date_value, fmt)
                    return True
                except ValueError:
                    continue
            return False
    return False

def safe_index_document(document, client, index_name):
    validation_errors = []
    # Validate date fields
    if 'timestamp' in document and not validate_date_for_elasticsearch(document['timestamp']):
        validation_errors.append('timestamp: Invalid date format')
    if validation_errors:
        print(f'Document validation failed: {validation_errors}')
        # Replace the bad value with the current UTC time
        safe_document = document.copy()
        safe_document['timestamp'] = datetime.now(timezone.utc).strftime('%Y-%m-%dT%H:%M:%SZ')
        return client.index(index=index_name, body=safe_document)
    return client.index(index=index_name, body=document)

# Use an Elasticsearch ingest pipeline for validation
curl -X PUT "localhost:9200/_ingest/pipeline/date-validator" -u "username:password" -H 'Content-Type: application/json' -d'
{
  "processors": [
    {
      "date": {
        "field": "timestamp",
        "formats": ["yyyy-MM-dd HH:mm:ss", "yyyy/MM/dd HH:mm:ss", "epoch_millis"],
        "target_field": "timestamp_parsed",
        "on_failure": [
          {
            "set": {
              "field": "validation_error",
              "value": "Date parsing failed"
            }
          },
          {
            "set": {
              "field": "timestamp",
              "value": "{{_ingest.timestamp}}"
            }
          }
        ]
      }
    }
  ]
}
'
# Use the pipeline when indexing
curl -X POST "localhost:9200/my-index/_doc?pipeline=date-validator" -u "username:password" -H 'Content-Type: application/json' -d'
{
  "timestamp": "2024/01/01 12:00:00",
  "message": "Test document"
}
'

## Advanced Date Handling in Elasticsearch
### Date Math and Relative Time
Elasticsearch supports date math expressions for flexible time-based queries:
# Query documents from last 7 days
curl -X GET "localhost:9200/my-index/_search" -u "username:password" -H 'Content-Type: application/json' -d'
{
  "query": {
    "range": {
      "timestamp": {
        "gte": "now-7d/d",
        "lte": "now/d"
      }
    }
  }
}
'
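For intuition, `now-7d/d` subtracts seven days and then rounds down to the start of the day. A simplified Python re-implementation (supports only `now`, a `-Nd` offset, and `/d` rounding; the real Elasticsearch grammar covers many more units):

```python
import re
from datetime import datetime, timedelta, timezone

def date_math(expr, now=None):
    # Supports only "now", an optional "-<n>d" offset, and "/d" rounding
    now = now or datetime.now(timezone.utc)
    m = re.fullmatch(r"now(?:-(\d+)d)?(/d)?", expr)
    if not m:
        raise ValueError(f"unsupported expression: {expr!r}")
    result = now - timedelta(days=int(m.group(1) or 0))
    if m.group(2):
        # "/d" truncates to the start of the day
        result = result.replace(hour=0, minute=0, second=0, microsecond=0)
    return result

ref = datetime(2024, 1, 10, 15, 30, tzinfo=timezone.utc)
print(date_math("now-7d/d", ref))  # 2024-01-03 00:00:00+00:00
```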
# Index documents with date math in the index name (time-based indices).
# The special characters < > { } / must be URL-encoded in the request path:
curl -X PUT "localhost:9200/%3Clogs-%7Bnow%2Fd%7D%3E" -u "username:password"
# Creates an index such as: logs-2024.01.01

### Timezone Handling
Elasticsearch stores dates in UTC but can handle timezone conversions:
# Query with timezone conversion
curl -X GET "localhost:9200/my-index/_search" -u "username:password" -H 'Content-Type: application/json' -d'
{
  "query": {
    "range": {
      "timestamp": {
        "gte": "2024-01-01T00:00:00",
        "lte": "2024-01-01T23:59:59",
        "time_zone": "-05:00"
      }
    }
  }
}
'
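Since Elasticsearch stores all dates in UTC, it is often simplest to convert zone-less local timestamps to UTC on the client before indexing. A sketch using Python's standard `zoneinfo` module (the `America/New_York` zone and the input format are example assumptions):

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # Python 3.9+, needs system tzdata

def to_utc_iso(local_str, zone="America/New_York"):
    # Interpret a zone-less timestamp as local time in the given zone,
    # then convert to UTC and render as ISO8601 with a trailing Z
    local = datetime.strptime(local_str, "%Y-%m-%d %H:%M:%S")
    aware = local.replace(tzinfo=ZoneInfo(zone))
    return aware.astimezone(ZoneInfo("UTC")).strftime("%Y-%m-%dT%H:%M:%SZ")

print(to_utc_iso("2024-01-01 00:00:00"))  # 2024-01-01T05:00:00Z (EST is UTC-5)
```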
# Mapping with timezone: date fields have no "time_zone" mapping parameter.
# To store zoned values, include the offset in the format itself
# (the Java time pattern letter "Z" matches offsets like -0500):
curl -X PUT "localhost:9200/my-index" -u "username:password" -H 'Content-Type: application/json' -d'
{
  "mappings": {
    "properties": {
      "event_time": {
        "type": "date",
        "format": "yyyy-MM-dd HH:mm:ssZ||yyyy-MM-dd HH:mm:ss"
      }
    }
  }
}
'

### Multi-Field Dates for Different Formats
Create multi-fields to index the same date in multiple formats:
curl -X PUT "localhost:9200/my-index" -u "username:password" -H 'Content-Type: application/json' -d'
{
  "mappings": {
    "properties": {
      "event_date": {
        "type": "date",
        "format": "yyyy-MM-dd||epoch_millis||strict_date_optional_time",
        "fields": {
          "timestamp": {
            "type": "date",
            "format": "epoch_millis",
            "ignore_malformed": true
          },
          "iso": {
            "type": "date",
            "format": "strict_date_optional_time",
            "ignore_malformed": true
          }
        }
      }
    }
  }
}
'

Each sub-field re-parses the same incoming value, so give every field the formats it should accept, or set "ignore_malformed" so a value matching only some formats still indexes.

### Performance Considerations
- Date fields with complex formats may impact indexing performance
- Multiple date formats in mapping increase index size
- Runtime fields have query-time performance cost
- Consider normalizing dates to a single format during ETL
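The last point above can be as simple as collapsing every accepted input shape to epoch milliseconds during ETL, so the mapping only needs "epoch_millis". A minimal sketch (the accepted string format is an assumption about your data):

```python
from datetime import datetime, timezone

def to_epoch_millis(value):
    # Numbers are assumed to already be epoch milliseconds
    if isinstance(value, (int, float)):
        return int(value)
    # Zone-less strings are assumed to be UTC in "yyyy-MM-dd HH:mm:ss" shape
    dt = datetime.strptime(value, "%Y-%m-%d %H:%M:%S")
    return int(dt.replace(tzinfo=timezone.utc).timestamp() * 1000)

print(to_epoch_millis("2024-01-01 00:00:00"))  # 1704067200000
```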
### Monitoring and Alerting
Set up monitoring for date parsing failures:
# Track document counts per index to spot gaps (rejected documents never appear)
curl -X GET "localhost:9200/_cat/indices?v&h=index,docs.count,docs.deleted" -u "username:password"
# Monitor indexing errors in logs
grep "MapperParsingException" /var/log/elasticsearch/elasticsearch.log
# Use ingest pipeline to catch and log parsing errors
curl -X PUT "localhost:9200/_ingest/pipeline/date-validation" -u "username:password" -H 'Content-Type: application/json' -d'
{
  "processors": [
    {
      "date": {
        "field": "timestamp",
        "formats": ["yyyy-MM-dd HH:mm:ss"],
        "target_field": "timestamp_parsed",
        "on_failure": [
          {
            "set": {
              "field": "error",
              "value": "Date parsing failed: {{ _ingest.on_failure_message }}"
            }
          }
        ]
      }
    }
  ]
}
'