DynamoDB returns ImportNotFoundException when you try to access or describe an import job that does not exist, has already been deleted, or lives in a different region. The error surfaces during S3-to-DynamoDB import operations whenever you reference an import by an Amazon Resource Name (ARN) or import ID that DynamoDB cannot resolve.
The ImportNotFoundException error in DynamoDB indicates that the import job you're trying to access cannot be found in the current AWS region or account. This error typically occurs in several scenarios:

1. **Import deleted**: The import job completed and its metadata was automatically deleted after processing
2. **Wrong region**: You're trying to access an import in a different AWS region than where it was created
3. **Invalid ARN**: The Amazon Resource Name (ARN) provided doesn't match any existing import job
4. **Import ID mismatch**: The import ID doesn't correspond to any known import in your account
5. **Cross-account access**: You're trying to access an import from a different AWS account without proper permissions
6. **Import never existed**: The import job was never successfully created or failed during creation

DynamoDB imports are used for bulk data loading from S3 into tables. Each import has a unique ARN and exists only for the duration of the import operation, plus a short period after completion for status tracking.
First, ensure you're using the correct import ARN and AWS region.
Use the AWS CLI to list imports in the current region:

```shell
aws dynamodb list-imports
```

Check specific import details:

```shell
aws dynamodb describe-import \
  --import-arn "arn:aws:dynamodb:us-east-1:123456789012:table/MyTable/import/016123456789-abcdefg"
```

Verify the ARN format matches:
- arn:aws:dynamodb:{region}:{account}:table/{table-name}/import/{import-id}
- Ensure you're in the same region where the import was created
- Check account ID matches your current AWS account
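To catch region and account mismatches before calling the API, you can parse the ARN yourself. A minimal sketch following the format above (the `parseImportArn` helper is our own, not part of the AWS SDK):

```javascript
// Parse a DynamoDB import ARN into its parts so you can compare region and
// account against your current client configuration before calling the API.
function parseImportArn(arn) {
  const match = /^arn:aws:dynamodb:([a-z0-9-]+):(\d{12}):table\/([^/]+)\/import\/(.+)$/.exec(arn);
  if (!match) return null; // not a DynamoDB import ARN
  const [, region, accountId, tableName, importId] = match;
  return { region, accountId, tableName, importId };
}

// Example: flag a region mismatch before calling DescribeImport.
const parts = parseImportArn(
  'arn:aws:dynamodb:us-east-1:123456789012:table/MyTable/import/016123456789-abcdefg'
);
if (parts && parts.region !== process.env.AWS_REGION) {
  console.warn(`Import lives in ${parts.region}, client region is ${process.env.AWS_REGION}`);
}
```

Checking the parsed region and account up front turns a confusing ImportNotFoundException into an explicit configuration error.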
Imports have specific lifecycle states. Check if the import still exists and its status.
Get detailed import information:

```shell
aws dynamodb describe-import \
  --import-arn "arn:aws:dynamodb:us-east-1:123456789012:table/MyTable/import/016123456789-abcdefg" \
  --query 'ImportTableDescription.[ImportArn, ImportStatus, StartTime, EndTime]'
```

Import lifecycle states:
1. IN_PROGRESS: Import is actively processing data
2. COMPLETED: Import finished successfully
3. FAILED: Import processing failed
4. CANCELLED: Import was manually cancelled
5. DELETED: Import metadata was cleaned up; the ARN no longer resolves, and describe calls return ImportNotFoundException instead of a status
Import metadata is typically cleaned up shortly after completion (minutes to hours).
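Because the status set is small, a tiny helper can decide when to stop polling DescribeImport. A sketch assuming the terminal statuses listed above (`isImportFinished` is our own name, not an SDK function):

```javascript
// Statuses after which the import will never change again; a cleaned-up
// import surfaces as ImportNotFoundException rather than a status.
const TERMINAL_STATUSES = new Set(['COMPLETED', 'FAILED', 'CANCELLED']);

function isImportFinished(status) {
  // IN_PROGRESS (or any unknown transitional state) keeps the poller running.
  return TERMINAL_STATUSES.has(status);
}
```

A poller would call DescribeImport on an interval and stop as soon as `isImportFinished` returns true, or when the call throws ImportNotFoundException after cleanup.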
Implement proper error handling in your application code.
Basic error handling pattern (AWS SDK for JavaScript v2; in v3, check `error.name` instead of `error.code`):

```javascript
const AWS = require('aws-sdk');
const dynamodb = new AWS.DynamoDB();

async function getImportStatus(importArn) {
  try {
    // Attempt to describe the import
    const response = await dynamodb.describeImport({ ImportArn: importArn }).promise();
    return response.ImportTableDescription.ImportStatus;
  } catch (error) {
    if (error.code === 'ImportNotFoundException') {
      console.log('Import not found, implementing fallback logic...');
      // Your fallback logic here
      return 'NOT_FOUND';
    }
    throw error;
  }
}
```

Best practices:
- Always wrap import operations in try-catch blocks
- Implement retry logic with exponential backoff
- Cache import ARNs with appropriate TTL
- Use idempotent operations when creating new imports
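The retry-with-backoff practice can be sketched as a small wrapper. This assumes the SDK v2 convention of an `error.code` property used in the snippet above; `withBackoff` is a hypothetical helper, not an AWS API:

```javascript
// Retry an async operation with exponential backoff. ImportNotFoundException
// is treated as non-retryable: if the import is gone, retrying will not help.
async function withBackoff(operation, { retries = 4, baseDelayMs = 100 } = {}) {
  for (let attempt = 0; ; attempt++) {
    try {
      return await operation();
    } catch (error) {
      if (error.code === 'ImportNotFoundException' || attempt >= retries) throw error;
      const delay = baseDelayMs * 2 ** attempt; // 100 ms, 200 ms, 400 ms, ...
      await new Promise((resolve) => setTimeout(resolve, delay));
    }
  }
}
```

Transient throttling errors get retried with increasing delays, while the non-retryable not-found case fails fast so your fallback logic can run immediately.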
Implement monitoring to track import creation, progress, and completion.
CloudWatch alarm for import failures:

```shell
aws cloudwatch put-metric-alarm \
  --alarm-name "DynamoDB-Import-Failures" \
  --metric-name "UserErrors" \
  --namespace "AWS/DynamoDB" \
  --statistic "Sum" \
  --period 300 \
  --evaluation-periods 1 \
  --threshold 1 \
  --comparison-operator "GreaterThanOrEqualToThreshold" \
  --alarm-actions "arn:aws:sns:us-east-1:123456789012:MyAlertsTopic"
```

Monitoring strategy:
- Track import ARNs in a durable store (DynamoDB, RDS)
- Set up CloudWatch alarms for import failures
- Log import lifecycle events for audit trails
- Implement alerting for critical import failures
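For the audit-trail point, one lightweight approach is to emit each lifecycle event as a structured JSON line. A sketch with our own field names (not an AWS schema):

```javascript
// Build a single structured log line for an import lifecycle event.
// Field names are this example's own convention.
function importLifecycleEvent(importArn, status, detail = {}) {
  return JSON.stringify({
    timestamp: new Date().toISOString(),
    source: 'import-tracker',
    importArn,
    status,
    ...detail,
  });
}
```

Writing these lines to stdout or CloudWatch Logs gives you a searchable history of every import's creation, progress, and completion, even after DynamoDB has cleaned up the import metadata itself.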
Build robust import validation and automatic recovery mechanisms.
Recovery checklist:
1. Check if source data still exists in S3
2. Verify target table status and permissions
3. Validate import parameters and configuration
4. Implement idempotent import creation
5. Maintain import metadata in a separate tracking table
Basic recovery flow:

```javascript
async function recoverImport(importArn, tableName, s3Source) {
  // 1. Check S3 source exists
  // 2. Check table exists
  // 3. Create new import if conditions met
  // 4. Update tracking records
}
```

Implement proactive measures to prevent ImportNotFoundException.
Prevention strategies:
- Implement durable import tracking with TTL
- Cache import ARNs with verification before use
- Use Infrastructure as Code to manage import lifecycle
- Schedule regular import status validation
- Implement import versioning for data pipeline resilience
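Durable tracking with TTL can be as simple as writing one item per import. Below is a sketch of building such an item in DynamoDB's low-level attribute-value format, assuming a tracking table keyed on `ImportId` with an `ExpiresAt` TTL attribute (the `buildTrackingItem` helper is hypothetical):

```javascript
// Build a tracking item that records an import ARN with an ExpiresAt epoch
// so the table's TTL can clean up stale entries automatically.
function buildTrackingItem(importArn, importId, ttlDays = 30, now = Date.now()) {
  return {
    ImportId: { S: importId },
    ImportArn: { S: importArn },
    // DynamoDB TTL expects epoch time in SECONDS, not milliseconds.
    ExpiresAt: { N: String(Math.floor(now / 1000) + ttlDays * 24 * 60 * 60) },
  };
}
```

You would pass this item to `PutItem` right after creating an import, so a later ImportNotFoundException can be cross-checked against your own durable record.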
Infrastructure as Code example (CloudFormation snippet):

```yaml
Resources:
  ImportTrackingTable:
    Type: AWS::DynamoDB::Table
    Properties:
      TableName: ImportTracking
      BillingMode: PAY_PER_REQUEST
      AttributeDefinitions:
        - AttributeName: ImportId
          AttributeType: S
      KeySchema:
        - AttributeName: ImportId
          KeyType: HASH
      TimeToLiveSpecification:
        AttributeName: ExpiresAt
        Enabled: true
```

## DynamoDB Import Deep Dive
### Import Architecture and Limitations:
Import Types:
1. ImportTable: Bulk import from S3 into new or existing DynamoDB tables
2. S3 Sources: Supports DynamoDB JSON, CSV, and Amazon Ion formats
3. Incremental Imports: Not natively supported - must implement custom logic
4. Streaming Imports: Use Kinesis Data Streams for real-time data ingestion
Import Lifecycle and Cleanup:
- Import metadata is temporary and cleaned up after completion
- Failed imports may retain metadata for troubleshooting
- S3 source data persists independently - manage S3 lifecycle separately
- Import operations are asynchronous and can take hours for large datasets
### Cross-Region and Cross-Account Imports:
Cross-Region Challenges:
- Imports are region-specific (ARN includes region code)
- S3 source buckets must be accessible from the DynamoDB region
- Consider data transfer costs for cross-region imports
- Use S3 Cross-Region Replication if source data is in different region
Cross-Account Access:
- Import ARNs include account ID
- IAM policies must allow cross-account describe/import permissions
- S3 bucket policies must allow cross-account read access for source data
- KMS key policies must allow cross-account access if using encrypted data
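For illustration, a minimal S3 bucket policy granting another account read access to the source data might look like the following (the account ID and bucket name are placeholders):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "AWS": "arn:aws:iam::123456789012:root" },
      "Action": ["s3:GetObject", "s3:ListBucket"],
      "Resource": [
        "arn:aws:s3:::my-import-bucket",
        "arn:aws:s3:::my-import-bucket/*"
      ]
    }
  ]
}
```

The bucket ARN covers `s3:ListBucket` while the `/*` object ARN covers `s3:GetObject`; both are needed for an import to enumerate and read the source files.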
### Alternative Approaches:
When imports don't meet requirements:
1. AWS Data Pipeline: Scheduled data movement with transformation capabilities
2. AWS Glue: ETL jobs with schema discovery and data cleansing
3. AWS DMS (Database Migration Service): Continuous data replication
4. Custom Lambda functions: Event-driven data processing from S3
5. Third-party tools: Striim, Fivetran, etc. for complex data pipelines
### Security Considerations:
IAM Best Practices:
- Principle of least privilege for import operations
- Use IAM conditions to restrict imports to specific tables and S3 buckets
- Implement SCPs (Service Control Policies) for organization-wide import controls
- Rotate IAM credentials regularly, especially for service accounts
Data Protection:
- Enable S3 encryption for source data at rest
- Use KMS keys for additional encryption control of sensitive data
- Implement VPC endpoints for private S3 and DynamoDB access
- Audit import data access with S3 and CloudTrail logging
- Validate data integrity after import completion