The "validation_failed: Input validation failed" error occurs when data being inserted or updated in Supabase fails to meet database constraints, schema validation rules, or data type requirements. This error typically indicates issues with data format, missing required fields, or violations of database constraints like foreign keys, unique constraints, or check constraints.
The "validation_failed: Input validation failed" error is a Supabase/PostgREST error that occurs when the data you're trying to insert or update doesn't pass database-level validation. This validation happens at multiple levels:

1. **Database Schema Constraints** - PostgreSQL enforces data integrity through:
   - Data type validation (e.g., trying to insert text into an integer column)
   - NOT NULL constraints (missing required fields)
   - CHECK constraints (business logic rules)
   - UNIQUE constraints (duplicate values in unique columns)
   - FOREIGN KEY constraints (referential integrity)
2. **PostgREST Validation** - The REST API layer validates that:
   - The JSON structure matches the table schema
   - Required fields are present
   - Data types are compatible
   - Array/object structures are valid
3. **Supabase Client Validation** - The client libraries may perform additional validation before sending requests.

This error is protective: it prevents invalid data from entering your database, which helps maintain data quality and application reliability. However, it can be frustrating when you're not sure which specific validation rule is being violated.
First, you need to determine which constraint or validation rule is failing. Since the error message is generic, you'll need to investigate:
```sql
-- Check table schema and constraints
SELECT
  column_name,
  data_type,
  is_nullable,
  column_default,
  character_maximum_length
FROM information_schema.columns
WHERE table_schema = 'public'
  AND table_name = 'your_table_name'
ORDER BY ordinal_position;

-- Check constraints on the table
SELECT
  tc.constraint_name,
  tc.constraint_type,
  kcu.column_name,
  ccu.table_name AS foreign_table_name,
  ccu.column_name AS foreign_column_name
FROM information_schema.table_constraints tc
LEFT JOIN information_schema.key_column_usage kcu
  ON tc.constraint_name = kcu.constraint_name
  AND tc.table_schema = kcu.table_schema
LEFT JOIN information_schema.constraint_column_usage ccu
  ON ccu.constraint_name = tc.constraint_name
WHERE tc.table_schema = 'public'
  AND tc.table_name = 'your_table_name';
```

Common things to check:
1. NOT NULL columns - Are you providing values for all required columns?
2. Data types - Are you sending strings for integer columns or vice versa?
3. UNIQUE constraints - Are you trying to insert duplicate values?
4. FOREIGN KEY constraints - Do referenced records exist?
5. CHECK constraints - What business rules are defined?
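When a write fails, supabase-js surfaces the underlying PostgreSQL SQLSTATE in `error.code`, which maps directly onto the checklist above. As a minimal sketch (the helper name and category labels are our own; the codes are the standard PostgreSQL SQLSTATE values), you can classify the failure before digging into the schema:

```javascript
// Hypothetical helper: map PostgreSQL SQLSTATE codes (exposed by
// supabase-js as error.code) to the constraint category that failed.
const PG_CONSTRAINT_ERRORS = {
  '23502': 'not_null_violation',          // missing required column
  '23503': 'foreign_key_violation',       // referenced row does not exist
  '23505': 'unique_violation',            // duplicate value in unique column
  '23514': 'check_violation',             // CHECK constraint failed
  '22P02': 'invalid_text_representation', // e.g. 'abc' sent to an integer column
};

function classifyPgError(error) {
  // Returns 'unknown' when there is no error or an unrecognized code
  if (!error || !error.code) return 'unknown';
  return PG_CONSTRAINT_ERRORS[error.code] ?? 'unknown';
}
```

Typical usage: `const { error } = await supabase.from('your_table').insert(row); if (error) console.log(classifyPgError(error));`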
Test with minimal data:
```javascript
// Test with minimal valid data
const testData = {
  // Only include required fields first
  required_field: 'value'
};

const { data, error } = await supabase
  .from('your_table')
  .insert(testData);
```

Data type issues are common causes of validation failures:
```sql
-- See exact data types for each column
SELECT
  column_name,
  data_type,
  udt_name,
  is_nullable
FROM information_schema.columns
WHERE table_schema = 'public'
  AND table_name = 'your_table_name';
```

Common data type issues:
1. Integer columns receiving strings:

```javascript
// Wrong: sending a string to an integer column
const badData = { id: '123' }; // id is integer

// Right: convert to a number
const goodData = { id: parseInt('123', 10) };
```

2. Date/time format issues:

```javascript
// Wrong: invalid date
const badData = { created_at: '2023-13-45' };

// Right: ISO 8601 format
const goodData = { created_at: '2023-12-12T10:30:00Z' };

// Or use a Date object
const goodData2 = { created_at: new Date().toISOString() };
```

3. JSON/JSONB column issues:

```javascript
// Wrong: sending a string instead of an object
const badData = { metadata: '{"key": "value"}' };

// Right: send the actual object
const goodData = { metadata: { key: 'value' } };
```

4. Array column issues:

```javascript
// Wrong: sending a comma-separated string instead of an array
const badData = { tags: 'tag1,tag2' };

// Right: send an actual array
const goodData = { tags: ['tag1', 'tag2'] };
```

Columns marked as NOT NULL without default values must be provided in every insert:
```sql
-- Find NOT NULL columns without defaults
SELECT column_name, data_type
FROM information_schema.columns
WHERE table_schema = 'public'
  AND table_name = 'your_table_name'
  AND is_nullable = 'NO'
  AND column_default IS NULL;
```

Solutions:
1. Provide values for all NOT NULL columns:
```javascript
// Check that your data includes all required fields
const requiredFields = ['id', 'user_id', 'created_at'];
const hasAllRequired = requiredFields.every(field => field in data);

if (!hasAllRequired) {
  console.error('Missing required fields:', requiredFields.filter(f => !(f in data)));
}
```

2. Add default values in the database schema:
```sql
-- Add a default value for created_at
ALTER TABLE your_table
ALTER COLUMN created_at SET DEFAULT now();

-- Add a default value for status
ALTER TABLE your_table
ALTER COLUMN status SET DEFAULT 'pending';
```

3. Use database triggers for automatic values:
```sql
-- Create a trigger to maintain updated_at automatically
CREATE OR REPLACE FUNCTION set_updated_at()
RETURNS TRIGGER AS $$
BEGIN
  NEW.updated_at = now();
  RETURN NEW;
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER set_updated_at_trigger
BEFORE UPDATE ON your_table
FOR EACH ROW
EXECUTE FUNCTION set_updated_at();
```

4. Modify the schema to allow NULL if appropriate:
```sql
-- Make the column nullable
ALTER TABLE your_table
ALTER COLUMN optional_field DROP NOT NULL;
```

UNIQUE constraints prevent duplicate values in specific columns:
```sql
-- Find unique constraints
SELECT
  tc.constraint_name,
  kcu.column_name,
  tc.constraint_type
FROM information_schema.table_constraints tc
JOIN information_schema.key_column_usage kcu
  ON tc.constraint_name = kcu.constraint_name
WHERE tc.table_schema = 'public'
  AND tc.table_name = 'your_table_name'
  AND tc.constraint_type = 'UNIQUE';
```

Handling strategies:
1. Check for existing records before insert:
```javascript
// Before inserting, check whether the value already exists.
// maybeSingle() returns data: null instead of an error when no row matches.
const { data: existing } = await supabase
  .from('users')
  .select('email')
  .eq('email', newUser.email)
  .maybeSingle();

if (existing) {
  // Handle the duplicate - update instead of insert
  const { data } = await supabase
    .from('users')
    .update(newUser)
    .eq('email', newUser.email);
} else {
  // Safe to insert
  const { data } = await supabase
    .from('users')
    .insert(newUser);
}
```

Note that this check-then-insert pattern has a small race window under concurrent writes, which is one reason to prefer upsert.

2. Use UPSERT (insert or update):
```javascript
// Supabase handles duplicates automatically with upsert
const { data, error } = await supabase
  .from('users')
  .upsert(newUser, {
    onConflict: 'email' // Column with the unique constraint
  });
```

3. Generate unique values:
```javascript
// For columns like usernames, generate a unique variant
async function getUniqueUsername(baseUsername) {
  let username = baseUsername;
  let counter = 1;

  while (true) {
    const { data: existing } = await supabase
      .from('users')
      .select('username')
      .eq('username', username)
      .maybeSingle(); // null (not an error) when the name is free

    if (!existing) return username;
    username = baseUsername + counter;
    counter++;
  }
}
```

4. Remove or modify the unique constraint if business logic allows:
```sql
-- Remove the unique constraint
ALTER TABLE your_table
DROP CONSTRAINT constraint_name;

-- Or create a partial unique index instead
CREATE UNIQUE INDEX unique_active_email
ON users(email)
WHERE status = 'active';
```

Foreign key constraints ensure referential integrity between tables:
```sql
-- Find foreign key constraints
SELECT
  tc.constraint_name,
  kcu.column_name AS foreign_key_column,
  ccu.table_name AS referenced_table,
  ccu.column_name AS referenced_column
FROM information_schema.table_constraints tc
JOIN information_schema.key_column_usage kcu
  ON tc.constraint_name = kcu.constraint_name
JOIN information_schema.constraint_column_usage ccu
  ON ccu.constraint_name = tc.constraint_name
WHERE tc.table_schema = 'public'
  AND tc.table_name = 'your_table_name'
  AND tc.constraint_type = 'FOREIGN KEY';
```

Solutions:
1. Ensure referenced record exists:
```javascript
// Check whether the referenced record exists before inserting
const { data: referencedRecord } = await supabase
  .from('referenced_table')
  .select('id')
  .eq('id', foreignKeyValue)
  .maybeSingle();

if (!referencedRecord) {
  // Create the referenced record first
  const { data: newRef } = await supabase
    .from('referenced_table')
    .insert({ id: foreignKeyValue, ...otherData });
}
```

2. Use database transactions:
```javascript
// Use a Postgres function (RPC) so both inserts run in a single transaction
const { data, error } = await supabase.rpc('create_with_references', {
  main_data: mainRecord,
  ref_data: referencedRecord
});

// Or implement client-side compensation logic (not a true transaction)
async function insertWithReferences(mainData, refData) {
  // Insert the reference first; .select().single() returns the inserted row
  const { data: ref, error: refError } = await supabase
    .from('referenced_table')
    .insert(refData)
    .select()
    .single();
  if (refError) throw refError;

  // Then insert the main record with the foreign key
  const { data: main, error: mainError } = await supabase
    .from('main_table')
    .insert({
      ...mainData,
      ref_id: ref.id
    })
    .select()
    .single();

  if (mainError) {
    // Compensate: delete the reference we just created
    await supabase.from('referenced_table').delete().eq('id', ref.id);
    throw mainError;
  }

  return { main, ref };
}
```

3. Modify the foreign key constraint:
```sql
-- Make the foreign key column nullable
ALTER TABLE your_table
ALTER COLUMN foreign_key_column DROP NOT NULL;

-- Or recreate the constraint with ON DELETE CASCADE or SET NULL
ALTER TABLE your_table
DROP CONSTRAINT fk_constraint_name;

ALTER TABLE your_table
ADD CONSTRAINT fk_constraint_name
FOREIGN KEY (foreign_key_column)
REFERENCES referenced_table(id)
ON DELETE SET NULL;
```

Implement client-side validation to catch issues before they reach the database:
```javascript
// Simple email format check used by the validators below
function isValidEmail(email) {
  return /^[^\s@]+@[^\s@]+\.[^\s@]+$/.test(email);
}

// Comprehensive validation function
function validateTableData(tableName, data) {
  const validations = {
    users: (data) => {
      const errors = [];
      // Required fields
      if (!data.email) errors.push('Email is required');
      if (!data.name) errors.push('Name is required');
      // Data type validation
      if (data.email && typeof data.email !== 'string') {
        errors.push('Email must be a string');
      }
      if (data.age && (typeof data.age !== 'number' || data.age < 0)) {
        errors.push('Age must be a positive number');
      }
      // Format validation
      if (data.email && !isValidEmail(data.email)) {
        errors.push('Invalid email format');
      }
      // Length validation
      if (data.name && data.name.length > 100) {
        errors.push('Name must be 100 characters or less');
      }
      return errors;
    },
    products: (data) => {
      const errors = [];
      if (!data.name) errors.push('Product name is required');
      if (!data.price || data.price < 0) errors.push('Price must be positive');
      if (data.stock !== undefined && !Number.isInteger(data.stock)) {
        errors.push('Stock must be an integer');
      }
      return errors;
    }
  };

  const validator = validations[tableName];
  return validator ? validator(data) : [];
}

// Usage
const errors = validateTableData('users', userData);
if (errors.length > 0) {
  console.error('Validation errors:', errors);
  return;
}

// Sanitize data before sending
function sanitizeTableData(tableName, data) {
  const sanitizers = {
    users: (data) => ({
      ...data,
      email: data.email?.trim().toLowerCase(),
      name: data.name?.trim(),
      age: data.age ? parseInt(data.age, 10) : null
    }),
    products: (data) => ({
      ...data,
      name: data.name?.trim(),
      price: parseFloat(data.price),
      stock: data.stock ? parseInt(data.stock, 10) : null
    })
  };

  const sanitizer = sanitizers[tableName];
  return sanitizer ? sanitizer(data) : data;
}

// Use the validated and sanitized data
const cleanData = sanitizeTableData('users', userData);
const { data, error } = await supabase
  .from('users')
  .insert(cleanData);
```

Also consider using a validation library such as Zod:
```javascript
import { z } from 'zod';

const userSchema = z.object({
  email: z.string().email(),
  name: z.string().min(1).max(100),
  age: z.number().int().positive().optional()
});

try {
  const validatedData = userSchema.parse(userData);
  // Data is valid, proceed with the insert
} catch (error) {
  console.error('Validation failed:', error.errors);
}
```

## Understanding PostgreSQL Validation Layers
Supabase validation errors can originate from multiple layers:
### 1. PostgreSQL Native Validation
- Data Types: PostgreSQL is strict about data types
- Constraints: NOT NULL, CHECK, UNIQUE, FOREIGN KEY
- Triggers: BEFORE triggers can reject operations
- Domains: Custom data types with validation rules
### 2. PostgREST Validation
- JSON Schema Validation: Ensures request body matches table schema
- Content-Type: Must be application/json
- Required Fields: Validates against NOT NULL columns
- Type Coercion: Attempts limited type conversion
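The PostgREST requirements above can be made concrete with a sketch of a raw REST insert. The project URL and key here are hypothetical placeholders; the point is that the body must be JSON and the `Content-Type` header must be `application/json`, or the request is rejected before PostgreSQL ever sees it:

```javascript
// Build (but don't send) a raw PostgREST insert request.
// Assumes a hypothetical project URL and anon key.
function buildInsertRequest(projectUrl, anonKey, table, row) {
  return new Request(`${projectUrl}/rest/v1/${table}`, {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',     // required by PostgREST
      apikey: anonKey,                        // Supabase API key header
      Authorization: `Bearer ${anonKey}`,
      Prefer: 'return=representation',        // ask PostgREST to echo the row back
    },
    body: JSON.stringify(row),                // body must be valid JSON
  });
}
```

The supabase-js client builds requests like this for you; constructing one by hand is mainly useful when debugging what actually goes over the wire.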
### 3. Common Validation Patterns
#### CHECK Constraints Examples:
```sql
-- Age must be positive
ALTER TABLE users ADD CONSTRAINT age_positive CHECK (age > 0);

-- Email must contain @
ALTER TABLE users ADD CONSTRAINT valid_email CHECK (email LIKE '%@%');

-- Status must be one of the allowed values
ALTER TABLE orders ADD CONSTRAINT valid_status
CHECK (status IN ('pending', 'processing', 'shipped', 'delivered'));
```

#### Composite UNIQUE Constraints:
```sql
-- Unique combination of columns
ALTER TABLE user_preferences
ADD CONSTRAINT unique_user_preference
UNIQUE (user_id, preference_type);
```

#### Exclusion Constraints (advanced):
```sql
-- Prevent overlapping time ranges
-- (equality on a plain column inside a GiST index needs the btree_gist extension)
CREATE EXTENSION IF NOT EXISTS btree_gist;

ALTER TABLE bookings
ADD EXCLUDE USING gist (
  room_id WITH =,
  tsrange(start_time, end_time) WITH &&
);
```

### 4. Debugging Techniques
#### Enable Detailed Error Messages:
```sql
-- Get more detailed messages in the current session
SET client_min_messages TO DEBUG;
```

#### Log Failed Statements:
```sql
-- Log every statement that causes an error
ALTER DATABASE your_db SET log_min_error_statement = ERROR;
```

#### Use PostgreSQL's Error Codes:
```sql
-- Catch specific error codes in PL/pgSQL (a DO block makes the fragment runnable)
DO $$
BEGIN
  INSERT INTO your_table ...;  -- the statement under test
EXCEPTION
  WHEN unique_violation THEN
    RAISE NOTICE 'Duplicate value';
  WHEN check_violation THEN
    RAISE NOTICE 'Check constraint failed';
  WHEN not_null_violation THEN
    RAISE NOTICE 'NULL value in NOT NULL column';
END;
$$;
```

### 5. Performance Considerations
- Constraint Checking Overhead: Each constraint adds validation overhead
- Index Usage: UNIQUE constraints create indexes (can help performance)
- Foreign Key Cascades: Can cause unexpected deletions if not careful
- Trigger Performance: Complex BEFORE triggers can slow down inserts
### 6. Best Practices
1. Validate Early: Catch issues in application code before database
2. Use Database Defaults: Set sensible defaults for common columns
3. Document Constraints: Keep schema documentation up to date
4. Test Edge Cases: Test with boundary values and invalid data
5. Monitor Constraint Violations: Log and analyze common validation failures
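Best practice 5 can be sketched as a small in-process counter (the helper name is our own): tally failures by PostgreSQL error code so recurring constraint problems stand out in your logs.

```javascript
// Minimal sketch: count validation failures by SQLSTATE code
// (supabase-js exposes the code as error.code on failed writes).
function makeViolationCounter() {
  const counts = new Map();
  return {
    record(error) {
      // Ignore calls with no error or no code attached
      if (!error || !error.code) return;
      counts.set(error.code, (counts.get(error.code) ?? 0) + 1);
    },
    summary() {
      // Plain object, convenient for structured logging
      return Object.fromEntries(counts);
    },
  };
}
```

Typical usage: call `counter.record(error)` after each failed insert or update, then periodically log `counter.summary()` to see which constraints fail most often.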
### 7. Migration Strategies
When changing constraints on production data:
```sql
-- 1. Add the constraint with NOT VALID (skips validation of existing rows)
ALTER TABLE table_name
ADD CONSTRAINT constraint_name CHECK (condition) NOT VALID;

-- 2. Fix any existing rows that would violate the constraint
UPDATE table_name SET column = default_value WHERE condition IS FALSE;

-- 3. Validate the existing data once it is clean
ALTER TABLE table_name VALIDATE CONSTRAINT constraint_name;
```

### 8. Supabase-Specific Considerations
- RLS Policies: Can interact with validation in complex ways
- Edge Functions: Can perform pre-validation before database
- Realtime: Validation failures won't trigger realtime updates
- Storage: Separate validation rules for file uploads