The "malformed array literal" error in PostgreSQL occurs when an array value is written with invalid syntax. PostgreSQL requires arrays to be enclosed in curly braces, with elements separated by commas and double-quoted when they contain spaces or special characters. Common causes include using JSON syntax instead of PostgreSQL syntax, missing quotes around elements with special characters, and data type mismatches. The fix is to format the array according to PostgreSQL's conventions.
PostgreSQL raises a "malformed array literal" error when it cannot parse an array value you have provided. Arrays in PostgreSQL store multiple values of the same data type in a single column, but they must follow strict syntax rules:

- Curly braces {} delimit the array (not square brackets [] as in JSON)
- Commas separate elements
- Double quotes surround string elements containing commas, braces, quotes, or other special characters
- Reserved characters within quoted strings must be escaped with backslashes

The error occurs during parsing, before the query is executed, indicating that the array format cannot be interpreted. This is distinct from type mismatch errors or other constraint violations. Common scenarios include: passing JSON array syntax `[1, 2, 3]` instead of PostgreSQL syntax `{1, 2, 3}`, unquoted strings with special characters, missing commas between elements, or improper element escaping.
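Because the most common trigger is sending JSON-style square-bracket syntax, a small conversion sketch helps show the difference. This is illustrative only (`json_to_pg_array` is a hypothetical helper for flat numeric arrays; real application code should pass native arrays and let the driver format them):

```python
import json

def json_to_pg_array(json_text):
    """Convert a JSON array of numbers into a PostgreSQL array literal.

    Hypothetical sketch: handles only flat arrays of ints/floats.
    Prefer your database driver's native array support in real code.
    """
    values = json.loads(json_text)
    if not isinstance(values, list):
        raise ValueError("expected a JSON array")
    # Swap [] for {} and rejoin the elements with commas
    return "{" + ",".join(str(v) for v in values) + "}"

print(json_to_pg_array("[1, 2, 3]"))  # prints: {1,2,3}
```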
PostgreSQL arrays use curly braces {}, not square brackets [] (which are JSON syntax).
-- INCORRECT: JSON syntax
INSERT INTO table_name (array_column) VALUES ([1, 2, 3]);
-- CORRECT: PostgreSQL array syntax
INSERT INTO table_name (array_column) VALUES ('{1, 2, 3}');
-- CORRECT: Alternative with ARRAY constructor
INSERT INTO table_name (array_column) VALUES (ARRAY[1, 2, 3]);

Both styles work in PostgreSQL; the ARRAY constructor is more explicit and often preferred in application code.
String array elements containing commas, curly braces, double quotes, or backslashes must be enclosed in double quotes. Quoting elements with spaces is also the safest habit, even though plain internal spaces happen to parse:
-- RISKY: Unquoted elements - a comma or brace inside a value breaks parsing
INSERT INTO tags (names) VALUES ('{hello world, foo bar}');
-- CORRECT: Double-quoted strings
INSERT INTO tags (names) VALUES ('{"hello world", "foo bar"}');
-- INCORRECT: Empty element left unquoted
INSERT INTO data ("values") VALUES ('{hello, , world}'); -- ERROR: malformed array literal
-- CORRECT: Empty strings quoted
INSERT INTO data ("values") VALUES ('{hello, "", world}');

Note the difference: single quotes delimit the array literal as a SQL string, while double quotes are PostgreSQL's way of quoting individual elements.
Inside double-quoted array elements, backslashes and double quotes must be escaped with a backslash. (The examples below assume the default standard_conforming_strings = on, so backslashes in the SQL string pass through unchanged to the array parser.)
-- INCORRECT: Unescaped double quotes inside an element
INSERT INTO errors (messages) VALUES ('{"said "hi"", "warning"}'); -- ERROR: malformed array literal
-- CORRECT: Escape embedded quotes with backslashes
INSERT INTO errors (messages) VALUES ('{"said \"hi\"", "warning"}');
-- Backslashes must also be escaped
-- WRONG RESULT: Single backslashes act as escapes, so this stores C:UsersAdmin
INSERT INTO paths (dirs) VALUES ('{"C:\Users\Admin"}');
-- CORRECT: Double each backslash inside the quoted element
INSERT INTO paths (dirs) VALUES ('{"C:\\Users\\Admin"}');
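The element-quoting and escaping rules above can be sketched in Python (`quote_pg_element` is a hypothetical helper; drivers do this for you):

```python
def quote_pg_element(value):
    """Quote one text element for use inside a PostgreSQL array literal:
    backslashes and double quotes are escaped with backslashes, then the
    element is wrapped in double quotes. Illustrative sketch only.
    """
    # Escape backslashes first, then double quotes
    escaped = value.replace("\\", "\\\\").replace('"', '\\"')
    return '"' + escaped + '"'

print(quote_pg_element('say "hi"'))   # prints: "say \"hi\""
print(quote_pg_element("C:\\Users"))  # prints: "C:\\Users"
```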
The ARRAY constructor is often clearer when building arrays programmatically and avoids manual quoting/escaping.
-- Using ARRAY constructor
INSERT INTO tags (names) VALUES (ARRAY['hello world', 'foo bar', 'baz']);
-- Equivalent to literal syntax
INSERT INTO tags (names) VALUES ('{"hello world", "foo bar", "baz"}');
-- With NULL values
INSERT INTO scores ("values") VALUES (ARRAY[100, NULL, 50]);
-- Casting elements to the correct type
INSERT INTO ids ("values") VALUES (ARRAY['123'::integer, '456'::integer]);

The ARRAY constructor automatically handles quoting and escaping, making it safer and more readable in application code.
All array elements must be of the same type as the array column definition.
-- Define an array-of-integers column ("values" must be quoted: VALUES is a reserved word)
CREATE TABLE scores ("values" integer[]);
-- INCORRECT: Text value in an integer array
INSERT INTO scores ("values") VALUES ('{1, 2, abc}'); -- ERROR: invalid input syntax for type integer
-- CORRECT: Cast to the proper type
INSERT INTO scores ("values") VALUES (ARRAY['1'::integer, '2'::integer, '3'::integer]);
-- INCORRECT: Mixed types in one array
INSERT INTO items (tags) VALUES (ARRAY[1, 'two', 3]);
-- CORRECT: All elements the same type
INSERT INTO items (tags) VALUES (ARRAY['one', 'two', 'three']);

Because EXPLAIN parses and plans a statement without executing it, you can use it to verify that types resolve correctly before running an INSERT or UPDATE against a large table.
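A pre-flight check in application code can catch mixed element types before they reach the database. This is a hypothetical helper (`check_homogeneous` is not a driver API), shown only to illustrate PostgreSQL's rule that all elements share the column's element type:

```python
def check_homogeneous(elements):
    """Return the common Python type of the elements, raising TypeError
    if they are mixed. None values are ignored, mirroring NULL elements.
    Illustrative sketch, not a real driver feature.
    """
    types = {type(e) for e in elements if e is not None}
    if len(types) > 1:
        names = sorted(t.__name__ for t in types)
        raise TypeError(f"mixed element types: {names}")
    return types.pop() if types else type(None)

check_homogeneous([1, 2, 3])        # fine: all int
# check_homogeneous([1, 'two', 3])  # raises TypeError
```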
Most database drivers have built-in support for properly formatting arrays. Ensure you're using it correctly.
// Node.js pg driver - CORRECT: Pass a native array
const tags = ['javascript', 'nodejs', 'typescript'];
await client.query(
  'INSERT INTO articles (tags) VALUES ($1)',
  [tags] // Driver handles array formatting
);

// INCORRECT: Manually formatting can cause errors
await client.query(
  'INSERT INTO articles (tags) VALUES ($1)',
  [JSON.stringify(tags)] // Wrong: JSON-formatted string, not an array
);

# Python psycopg2 - CORRECT: Pass a native list
import psycopg2.extras
tags = ['python', 'database', 'postgresql']
cur.execute(
    "INSERT INTO articles (tags) VALUES (%s)",
    (tags,)  # Driver converts to PostgreSQL array syntax
)

# Ruby pg gem - CORRECT: Pass a native array
tags = ['ruby', 'rails', 'activerecord']
conn.exec_params(
  "INSERT INTO articles (tags) VALUES ($1)",
  [tags] # Driver handles conversion
)

Always use parameterized queries with native array types. Never manually concatenate array strings.
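To see why a JSON-serialized string fails where a native array succeeds, compare the two shapes the database actually receives. A minimal sketch (the variable names are illustrative; in real code you would pass the list itself and let the driver adapt it):

```python
import json

tags = ["javascript", "nodejs", "typescript"]

# What JSON.stringify / json.dumps produces: square brackets,
# which PostgreSQL rejects with "malformed array literal"
as_json = json.dumps(tags)

# The curly-brace shape PostgreSQL expects for a text[] value
as_pg = "{" + ",".join(tags) + "}"

print(as_json)  # prints: ["javascript", "nodejs", "typescript"]
print(as_pg)    # prints: {javascript,nodejs,typescript}
```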
Use psql interactive shell to test array formatting before running application code.
psql -U username -d database_name

-- Test your array syntax directly
SELECT '{1, 2, 3}'::integer[];
SELECT ARRAY['hello', 'world'];
SELECT '{"hello world", "foo bar"}'::text[];
-- Check what your application sends (often visible in logs)
-- Look for patterns like: malformed array literal "[..." or similar
-- Test with parameterized value (simulates driver behavior)
PREPARE test_array AS SELECT $1::text[];
EXECUTE test_array(ARRAY['one', 'two']);
DEALLOCATE test_array;

If the test passes in psql but fails in your application, the issue is in how your driver or code is formatting the array. Enable query logging to see the exact SQL being sent.
NULL values within arrays must be handled carefully depending on your use case.
-- INCORRECT: A missing element is a syntax error, not NULL
INSERT INTO values_table (data) VALUES ('{"a", , "c"}'); -- ERROR: malformed array literal
-- CORRECT: Use the NULL keyword explicitly (unquoted; "NULL" in quotes is the string)
INSERT INTO values_table (data) VALUES ('{"a", NULL, "c"}'::text[]);
-- Or with the ARRAY constructor
INSERT INTO values_table (data) VALUES (ARRAY['a', NULL, 'c']);
-- To allow the array column itself to be NULL (not an element)
CREATE TABLE items (tags text[] NULL);
INSERT INTO items (tags) VALUES (NULL); -- The entire array is NULL
-- To disallow NULL for the array column
CREATE TABLE items (tags text[] NOT NULL);
INSERT INTO items (tags) VALUES (ARRAY[]::text[]); -- Empty array, not NULL

Distinguish between NULL array elements and a NULL array column; the fix depends on which you need.
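The NULL-handling rules combine with the quoting rules into a small builder. This is a hypothetical sketch (`to_pg_array` is not a driver API; prefer your driver's native array support), mapping Python None to the unquoted NULL token and the empty string to a quoted "":

```python
def to_pg_array(values):
    """Build a text[] literal: None becomes the unquoted NULL token,
    everything else is double-quoted with backslash escaping.
    Illustrative helper only.
    """
    parts = []
    for v in values:
        if v is None:
            parts.append("NULL")  # unquoted: array NULL element
        else:
            escaped = v.replace("\\", "\\\\").replace('"', '\\"')
            parts.append('"' + escaped + '"')  # "" stays a quoted empty string
    return "{" + ",".join(parts) + "}"

print(to_pg_array(["a", None, "", "c"]))  # prints: {"a",NULL,"","c"}
```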
For multidimensional arrays, ensure dimensions match consistently:
-- 2D array must be rectangular
INSERT INTO matrix (data) VALUES ('{{1, 2}, {3, 4}}'::integer[][]);
-- INCORRECT: Jagged array (different row lengths)
INSERT INTO matrix (data) VALUES ('{{1, 2}, {3}}'::integer[][]); -- ERROR

When working with arrays generated from JSON (e.g., parsing user input), be careful about type conversion. JSON arrays are not PostgreSQL arrays - always convert explicitly:
// Converting JSON array to PostgreSQL array format
const jsonData = [1, 2, 3];
// Send as native array, not JSON string
await db.query("INSERT INTO numbers VALUES ($1)", [jsonData]);

For large arrays or performance-critical code, consider storing the values in a separate junction table instead. This is especially true if you need to query individual array elements frequently:
-- Instead of storing {1, 2, 3, 4, 5} in user_ids column
-- Create a mapping table
CREATE TABLE post_authors (
  post_id integer REFERENCES posts(id),
  user_id integer REFERENCES users(id),
  PRIMARY KEY (post_id, user_id)
);

Use ANY to test array membership and unnest() to expand array elements:
SELECT * FROM articles WHERE 'javascript' = ANY(tags);
SELECT unnest(tags) AS tag, count(*) FROM articles GROUP BY tag;

When migrating from other databases (e.g., MySQL or SQLite, where lists are often stored as JSON), pay special attention to array syntax differences - what works in JSON or other array formats won't work in PostgreSQL without conversion.
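The jagged-array pitfall above can also be caught in application code before PostgreSQL rejects it. A hypothetical pre-flight check (`check_rectangular` is an illustrative name, not a library function):

```python
def check_rectangular(matrix):
    """Verify that a nested list is rectangular before sending it as a
    multidimensional PostgreSQL array; jagged arrays are rejected by the
    server with a matching-dimensions error. Illustrative sketch only.
    """
    widths = {len(row) for row in matrix}
    if len(widths) > 1:
        raise ValueError(f"jagged array: row lengths {sorted(widths)}")

check_rectangular([[1, 2], [3, 4]])   # rectangular: passes silently
# check_rectangular([[1, 2], [3]])    # raises ValueError
```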