This error occurs when AWS DynamoDB encounters a conflict during table import operations, typically when importing data from S3 or during restore operations. The conflict usually involves schema mismatches, existing data conflicts, or concurrent operations on the same table.
The ImportConflictException is thrown by AWS DynamoDB when there's a conflict during import operations. This typically happens when you're using DynamoDB's import feature to load data from Amazon S3 into a table, or during table restore operations. DynamoDB import operations are designed to be atomic and consistent, but conflicts can arise when:
1. The target table already contains data with the same primary key values as the import data
2. There are concurrent write operations happening on the table during import
3. The import job specification conflicts with the table's schema or settings
4. There are resource constraints or throttling issues during the import process
This error indicates that DynamoDB cannot complete the import operation without potentially violating data consistency or schema constraints. The import job will fail, and you'll need to resolve the conflict before retrying.
First, examine the failed import job to understand the specific conflict:
# Using AWS CLI
aws dynamodb describe-import --import-arn arn:aws:dynamodb:region:account:table/table-name/import/import-id
# Check CloudWatch logs for the import job
aws logs get-log-events --log-group-name /aws/dynamodb/imports --log-stream-name import-job-id --limit 100
# (the exact log group is returned by describe-import as CloudWatchLogGroupArn)
Look for specific error messages about the following (a quick way to pull the failure fields directly follows this list):
- Primary key conflicts
- Schema validation errors
- Concurrent operation details
- Resource limit exceeded messages
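If you only need the headline reason, the import description itself exposes failure fields you can query directly. A quick sketch using the field names from the DescribeImport API (substitute your own import ARN):
# Pull just the failure fields from the import description
aws dynamodb describe-import --import-arn arn:aws:dynamodb:region:account:table/table-name/import/import-id --query 'ImportTableDescription.{Status:ImportStatus,FailureCode:FailureCode,FailureMessage:FailureMessage,ErrorCount:ErrorCount}'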
Primary key conflicts are the most common cause. Check if import data duplicates existing keys:
# Check for duplicate primary keys in import data
# Assuming you have the S3 data file
aws s3 cp s3://your-bucket/import-data.json ./import-data.json
# Use jq to extract primary keys (adjust for your key schema)
jq -r '.[].id' import-data.json | sort | uniq -d
# Compare with existing table keys
aws dynamodb scan --table-name your-table --projection-expression "id" --max-items 100 --output json | jq -r '.Items[].id.S' | sort > existing-keys.txt
# (drop --max-items to cover the whole table; a full scan consumes read capacity and can be slow)
# Find conflicts
comm -12 <(jq -r '.[].id' import-data.json | sort) existing-keys.txt
Solutions (a de-duplication sketch follows this list):
1. Remove duplicate items from import data
2. Write the conflicting items with UpdateItem from a custom script instead of relying on the import's put behavior
3. Create a new table for the import, then merge data
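As a starting point for options 1 and 3, here is a minimal de-duplication sketch. It assumes the import file is a plain JSON array keyed on a single id attribute and requires jq 1.6+ (for --rawfile); adjust the paths if you use DynamoDB JSON or a composite key:
# Drop duplicate keys inside the import file itself (keeps one item per id)
jq 'unique_by(.id)' import-data.json > deduped.json
# Drop items whose id already exists in the table (existing-keys.txt from the previous step)
jq --rawfile keys existing-keys.txt '
  ($keys | split("\n") | map(select(length > 0))) as $existing
  | map(select(.id as $id | ($existing | index($id)) == null))
' deduped.json > conflict-free-import.json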
Ensure no other processes are writing to the table during import:
# Check for active connections/writes
aws cloudwatch get-metric-statistics --namespace AWS/DynamoDB --metric-name WriteThrottleEvents --dimensions Name=TableName,Value=your-table --start-time $(date -d '1 hour ago' +%s) --end-time $(date +%s) --period 300 --statistics Sum
# Check DynamoDB Streams (if enabled)
aws dynamodbstreams describe-stream --stream-arn arn:aws:dynamodb:region:account:table/table-name/stream/timestamp
# Temporarily disable write operations
# 1. Update application to pause writes
# 2. Use IAM policies to restrict access during import
# 3. Consider using maintenance windows
Best practice: Schedule imports during low-traffic periods or use a separate staging table. A sketch of the IAM-based write freeze from option 2 follows.
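For option 2, one way to freeze writes is to attach a temporary deny policy to the application's IAM role while the import runs. This is a minimal sketch; the role name, policy name, and table ARN are placeholders:
# Temporary policy that blocks write APIs on the target table
cat > deny-writes.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [{
    "Effect": "Deny",
    "Action": ["dynamodb:PutItem", "dynamodb:UpdateItem", "dynamodb:DeleteItem", "dynamodb:BatchWriteItem"],
    "Resource": "arn:aws:dynamodb:region:account:table/your-table"
  }]
}
EOF
# Attach it to the application role for the duration of the import
aws iam put-role-policy --role-name your-app-role --policy-name TempDenyDynamoDBWrites --policy-document file://deny-writes.json
# Remove it once the import has finished
aws iam delete-role-policy --role-name your-app-role --policy-name TempDenyDynamoDBWrites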
Ensure import data matches table schema:
# Get table schema
aws dynamodb describe-table --table-name your-table --query 'Table.{KeySchema:KeySchema,AttributeDefinitions:AttributeDefinitions}'
# Check a sample of import data against schema
# Example for a table with partition key 'id' (String) and sort key 'timestamp' (Number)
jq '.[0]' import-data.json
# Validate data types match
# String attributes should be strings: {"S": "value"}
# Number attributes should be numbers: {"N": "123"}
# Binary attributes should be base64: {"B": "dGVzdA=="}
Common schema issues (a small validation sketch follows this list):
1. Missing required attributes
2. Incorrect data types (String vs Number)
3. Attribute names with special characters
4. Nested structures exceeding DynamoDB limits
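To catch issues 1 and 2 before submitting the job, sanity-check every item for the key attributes and their types. This is a rough sketch assuming DynamoDB-JSON items with a string partition key id and a numeric sort key timestamp; adapt the attribute names and types to your own key schema:
# Count how many items carry the expected key attributes with the expected types
total=$(jq 'length' import-data.json)
valid=$(jq '[ .[] | select((.id.S? // null) != null and (.timestamp.N? // null) != null) ] | length' import-data.json)
echo "$valid of $total items pass key-schema validation"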
Configure the import job to avoid the conflict:
# Create a new import job (import-table always creates the target table, so it cannot collide with existing items)
aws dynamodb import-table \
  --s3-bucket-source S3Bucket=your-bucket,S3KeyPrefix=import-data.json \
  --input-format DYNAMODB_JSON \
  --input-compression-type GZIP \
  --table-creation-parameters '{
    "TableName": "your-table",
    "AttributeDefinitions": [
      {"AttributeName": "id", "AttributeType": "S"}
    ],
    "KeySchema": [
      {"AttributeName": "id", "KeyType": "HASH"}
    ],
    "BillingMode": "PAY_PER_REQUEST"
  }'
# Standard import cannot update or merge into an existing table; to achieve that effect
# (a sketch of this flow follows the list of alternatives below):
# 1. Export existing data
# 2. Merge with import data locally
# 3. Import merged dataset to new table
# 4. Switch applications to new table
Alternative approaches:
1. Use AWS Data Pipeline for complex transformations
2. Use AWS Glue for ETL processing
3. Write custom Lambda function to handle conflicts
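A minimal sketch of the export-and-merge flow described above, assuming both datasets are available locally as plain JSON arrays keyed on id and that jq 1.6+ is installed (the real export output is gzipped DynamoDB JSON under an AWSDynamoDB/ prefix, so you would download and flatten it first); all names are placeholders:
# Export the existing table to S3 (requires point-in-time recovery to be enabled)
aws dynamodb export-table-to-point-in-time --table-arn arn:aws:dynamodb:region:account:table/your-table --s3-bucket your-bucket --s3-prefix exports/your-table
# Merge locally: index both arrays by id, letting the import file (second argument) win on collisions
jq -s '(INDEX(.[0][]; .id) + INDEX(.[1][]; .id)) | [.[]]' existing-data.json import-data.json > merged-data.json
# Stage the merged dataset for a fresh import into a new table
aws s3 cp merged-data.json s3://your-bucket/merged/merged-data.json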
After starting import, monitor progress and validate results:
# Monitor import status
aws dynamodb describe-import --import-arn arn:aws:dynamodb:region:account:table/table-name/import/import-id --query 'ImportTableDescription.{Status:ImportStatus,StartTime:StartTime,EndTime:EndTime,ProcessedSizeBytes:ProcessedSizeBytes,ImportedItemCount:ImportedItemCount}'
# Validate import completed successfully (note: Table.ItemCount refreshes roughly every six hours, so it may lag right after an import)
aws dynamodb describe-table --table-name your-table --query 'Table.ItemCount'
# Compare with expected count
expected_count=$(jq 'length' import-data.json)
echo "Expected: $expected_count items"
# Sample imported data
aws dynamodb scan --table-name your-table --limit 5 --query 'Items[0:5]'
Post-import checks (a polling sketch for long-running imports follows this list):
1. Verify item count matches expected
2. Check a sample of imported items
3. Validate secondary indexes are updated
4. Test application read/write operations
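For long-running imports, a small polling loop saves re-running describe-import by hand. A rough sketch (the ARN is a placeholder; the status values come from the DescribeImport API):
# Poll the import status every 60 seconds until it leaves IN_PROGRESS
IMPORT_ARN=arn:aws:dynamodb:region:account:table/table-name/import/import-id
while true; do
  STATUS=$(aws dynamodb describe-import --import-arn "$IMPORT_ARN" --query 'ImportTableDescription.ImportStatus' --output text)
  echo "$(date): $STATUS"
  [ "$STATUS" != "IN_PROGRESS" ] && break
  sleep 60
done
# COMPLETED means success; FAILED or CANCELLED warrants a look at FailureCode and FailureMessage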
### Understanding Import Conflict Scenarios
1. Primary Key Conflicts:
DynamoDB requires unique primary keys. During import, if an item with the same partition key (and sort key if applicable) already exists, the import fails. Unlike batch operations, import doesn't support overwrite behavior by default.
2. Concurrent Modification Conflicts:
If your table has continuous write traffic (even a few writes per second), imports can fail. DynamoDB import operations require a consistent view of the table, which concurrent modifications can disrupt.
3. Schema Evolution Conflicts:
When importing to an existing table, the data must match the table's attribute definitions. If you've added new attributes or changed types since the backup was created, conflicts can occur.
4. Global Secondary Index Conflicts:
If your table has GSIs, imported data must satisfy the GSI key schema. Items whose GSI key attributes are present but have the wrong type will cause import failures; items that simply omit a GSI key attribute are left out of that index rather than failing.
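A quick pre-flight check for the type-mismatch case: scan the import file for items whose GSI key attribute exists but has the wrong type. The sketch below assumes DynamoDB-JSON items and a hypothetical string GSI key named status; swap in your own attribute name and type:
# Flag items where the (hypothetical) GSI key "status" is present but is not a string attribute
jq -r 'to_entries[] | select(.value.status != null and .value.status.S == null) | "Item at index \(.key): status has the wrong type"' import-data.json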
### Best Practices for Successful Imports
Pre-import Checklist:
1. Stop write traffic: Use maintenance windows or traffic routing to pause writes
2. Validate data: Check for duplicate keys, schema compliance, and data quality
3. Test with sample: Import a small subset first to identify issues (a sketch follows this checklist)
4. Monitor capacity: Ensure table has sufficient read/write capacity
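One way to run the sample test from item 3: slice off the first items, stage them under a separate S3 prefix, and run a throwaway import into a scratch table. This assumes newline-delimited DYNAMODB_JSON input (one item per line); bucket, prefix, and table names are placeholders:
# Take a 100-item sample and compress it
head -n 100 import-data.ndjson | gzip > sample.ndjson.gz
aws s3 cp sample.ndjson.gz s3://your-bucket/samples/sample.ndjson.gz
# Throwaway import into a scratch table with the same key schema
aws dynamodb import-table --s3-bucket-source S3Bucket=your-bucket,S3KeyPrefix=samples/ --input-format DYNAMODB_JSON --input-compression-type GZIP --table-creation-parameters '{"TableName": "your-table-import-test", "AttributeDefinitions": [{"AttributeName": "id", "AttributeType": "S"}], "KeySchema": [{"AttributeName": "id", "KeyType": "HASH"}], "BillingMode": "PAY_PER_REQUEST"}'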
Alternative Import Strategies:
AWS Data Pipeline: More flexible for complex transformations and conflict resolution
{
"pipeline": {
"activities": [
{
"id": "DynamoDBImport",
"type": "DynamoDBDataNode",
"schedule": {"ref": "DefaultSchedule"},
"input": {"ref": "S3DataNode"},
"output": {"ref": "DynamoDBDataNode"},
"runsOn": {"ref": "ResourceId"}
}
]
}
}
Custom Lambda ETL: For maximum control over conflict resolution
exports.handler = async (event) => {
  // 1. Read the batch of items from S3 (GetObject, then parse the line-delimited DynamoDB JSON)
  // 2. Detect conflicts per item (e.g. a conditional PutItem with attribute_not_exists on the key)
  // 3. Apply business logic to each conflict: overwrite, skip, or merge attributes
  // 4. Write the results to DynamoDB (BatchWriteItem for the non-conflicting items)
};
Performance Considerations:
- Large imports (>10GB) may take hours; the import itself is billed by the uncompressed size of the source data and does not consume the table's provisioned capacity
- Consider splitting large imports into multiple batches
- Use parallel imports to different tables, then merge
Cost Optimization:
- Use S3 Select to filter data before import
- Compress data (GZIP) to reduce S3 transfer costs (see the snippet after this list)
- Schedule imports during off-peak hours for reserved capacity tables
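For the compression point above, GZIP-compressing the source and declaring it on the import job is a one-liner each; file and bucket names are placeholders:
# Compress the source file before uploading
gzip -k import-data.ndjson
aws s3 cp import-data.ndjson.gz s3://your-bucket/import/import-data.ndjson.gz
# Tell the import job the data is GZIP-compressed
aws dynamodb import-table --s3-bucket-source S3Bucket=your-bucket,S3KeyPrefix=import/ --input-format DYNAMODB_JSON --input-compression-type GZIP --table-creation-parameters file://table-params.json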