PostgreSQL raises this error when you construct a JSON object with duplicate keys using a function or clause that enforces key uniqueness, such as the SQL/JSON constructors with WITH UNIQUE KEYS (PostgreSQL 16+) or the json_object_agg_unique() family of aggregates. Ensure every key in the JSON object you are building appears only once.
In PostgreSQL, JSON objects are collections of key-value pairs, and most tools expect each key to appear only once. How duplicates are handled depends on the type and the function you use. The json type (stored as plain text) keeps duplicate keys in the string representation, while jsonb (the binary format) silently discards all but the last value for a given key; neither raises an error on its own. The error appears when you use a construct that explicitly enforces uniqueness: the json_object_agg_unique()/jsonb_object_agg_unique() aggregates, or the SQL/JSON constructors and the IS JSON predicate with the WITH UNIQUE KEYS clause, introduced in PostgreSQL 16.
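As a quick illustration, here is a minimal reproduction with one of the enforcing aggregates; the inline VALUES list is a stand-in for whatever table or query feeds your aggregate:
-- Minimal reproduction (PostgreSQL 16+): the aggregate sees the key 'id' twice
SELECT json_object_agg_unique(k, v)
FROM (VALUES ('id', 1), ('id', 2)) AS t(k, v);
-- Raises the duplicate JSON object key value error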
Review the function call or JSON construction to find which keys appear multiple times. For json_object(), check your argument list:
-- This FAILS - "name" appears twice
SELECT json_object(
  'name': 'Alice',
  'age': 30,
  'name': 'Bob'    -- duplicate key
  WITH UNIQUE KEYS -- PostgreSQL 16+ clause that enforces uniqueness
);
For aggregations, examine the source data:
-- This FAILS - duplicate keys from the subquery
SELECT json_object_agg_unique(key, value)  -- plain json_object_agg() would keep the duplicate silently
FROM (
  SELECT 'id' AS key, '1' AS value
  UNION ALL
  SELECT 'id', '2'  -- same key again
) data;
When constructing JSON with json_object(), each key must appear only once:
-- CORRECT - all keys unique
SELECT json_object(
  'name': 'Alice',
  'age': 30,
  'email': '[email protected]'
);
-- If you need multiple values, use arrays or nested objects
SELECT json_object(
  'name': 'Alice',
  'age': 30,
  'aliases': json_array('Ali', 'Alicia')  -- use an array for multiple values
);
When using json_object_agg() or json_object_agg_unique(), ensure the source data has unique keys per group:
-- FAILS - multiple rows per group with the same key
SELECT user_id, json_object_agg_unique(key, value)
FROM user_metadata
GROUP BY user_id;
-- CORRECT - use DISTINCT ON or filtering to keep one row per key
SELECT user_id, json_object_agg(key, value)
FROM (
  SELECT DISTINCT ON (user_id, key) user_id, key, value
  FROM user_metadata
  ORDER BY user_id, key, created_at DESC
) dedup
GROUP BY user_id;
-- Or aggregate per key first, then build the object (aggregate calls cannot be nested)
SELECT user_id, json_object_agg(key, newest_value)
FROM (
  SELECT user_id, key,
         (array_agg(value ORDER BY created_at DESC))[1] AS newest_value  -- take the newest value
  FROM user_metadata
  GROUP BY user_id, key
) latest
GROUP BY user_id;
When building JSON from multiple sources, deduplicate key names across queries:
-- FAILS - both branches produce a "status" key
SELECT json_object_agg_unique(key, value)
FROM (
  SELECT 'status' AS key, 'active' AS value
  UNION ALL
  SELECT 'status', 'verified'  -- duplicate
) data;
-- CORRECT - prefix or otherwise distinguish the keys
SELECT json_object_agg_unique(key, value)
FROM (
  SELECT 'account_status' AS key, 'active' AS value
  UNION ALL
  SELECT 'verification_status', 'verified'
) data;
When migrating from json to jsonb, note that jsonb keeps only the last value for duplicate keys (no error is thrown):
-- json type: keeps both values in the text (but may trip up strict functions later)
SELECT '{"name": "Alice", "name": "Bob"}'::json;
-- Result: {"name": "Alice", "name": "Bob"}
-- jsonb type: keeps only the last value (automatic deduplication)
SELECT '{"name": "Alice", "name": "Bob"}'::jsonb;
-- Result: {"name": "Bob"}
-- jsonb_object() builds an object from a key array and a value array
SELECT jsonb_object(
  ARRAY['name', 'age'],
  ARRAY['Alice', '30']
);
PostgreSQL 16 introduced json_object_agg_unique() and jsonb_object_agg_unique(), which raise this error on duplicate keys instead of silently keeping a single value. If you're seeing this error with these functions on PostgreSQL 16+, ensure your source data truly has unique keys per group.
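To see the difference in behavior, feed the same duplicated input (an inline VALUES list here, standing in for real data) to the plain and _unique aggregates:
-- Plain aggregates tolerate the duplicate
SELECT json_object_agg(k, v)
FROM (VALUES ('id', '1'), ('id', '2')) AS t(k, v);
-- Returns {"id" : "1", "id" : "2"} - duplicate keys preserved in the json text
SELECT jsonb_object_agg(k, v)
FROM (VALUES ('id', '1'), ('id', '2')) AS t(k, v);
-- Returns {"id": "2"} - the last value silently wins
-- The _unique variant rejects the same input (PostgreSQL 16+)
SELECT jsonb_object_agg_unique(k, v)
FROM (VALUES ('id', '1'), ('id', '2')) AS t(k, v);
-- Raises the duplicate JSON object key value error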
The difference between json and jsonb matters here: json is a text representation that can technically hold duplicate keys (which the JSON specification discourages but does not forbid), while jsonb is a binary format that normalizes objects and keeps only the last value for each key. For new applications, prefer jsonb and, on PostgreSQL 16+, the SQL/JSON constructors (json_object, json_array), which let you request explicit uniqueness checks with WITH UNIQUE KEYS.
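As a sketch of that constructor style on PostgreSQL 16+ (the values are illustrative; RETURNING jsonb requests binary output):
SELECT json_object(
  'name': 'Alice',
  'age': 30
  RETURNING jsonb
);
-- Returns {"age": 30, "name": "Alice"}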
When working with nested JSON or complex aggregations, consider merging objects with jsonb_set() or the || concatenation operator instead of rebuilding them with constructors: both resolve key conflicts by overwriting the existing value rather than raising an error (jsonb_insert(), by contrast, will not overwrite an existing key).
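A short sketch of both merge styles on made-up documents:
-- || merges two jsonb objects; the right-hand value wins on key conflicts
SELECT '{"status": "active", "plan": "free"}'::jsonb
       || '{"status": "suspended"}'::jsonb;
-- Returns {"plan": "free", "status": "suspended"}
-- jsonb_set() overwrites a single key in place
SELECT jsonb_set('{"status": "active"}'::jsonb, '{status}', '"suspended"');
-- Returns {"status": "suspended"}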
In PostgreSQL 16+, you can add WITH UNIQUE KEYS to the SQL/JSON constructors and the IS JSON predicate to make the uniqueness requirement explicit; PostgreSQL 17 extends the same clause to the json() constructor function: SELECT json('{"a": 1, "b": 2}' WITH UNIQUE KEYS);
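If you only want to detect duplicate keys without aborting the query, the IS JSON predicate (PostgreSQL 16+) returns a boolean instead of raising an error; this sketch assumes text input:
SELECT '{"a": 1, "a": 2}' IS JSON WITH UNIQUE KEYS;  -- false: duplicate key
SELECT '{"a": 1, "b": 2}' IS JSON WITH UNIQUE KEYS;  -- true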