Hi,
This is for v20.
Reference: https://web.archive.org/web/20240328094030/https://riggs.business/blog/f/postgresql-todo-2023
COPY enhancement:
Detect duplicate rows and redirect them to a separate table without
aborting the load.
While reviewing this TODO, I noticed that this idea aligns closely with
https://commitfest.postgresql.org/patch/4817.
Both ideas share common elements: allowing a user-specified table,
validating its metadata, and storing rows in it.
Based on that, I spent some time working on the implementation.
Proposed syntax:
COPY FROM (ON_CONFLICT TABLE, CONFLICT_TABLE conflict_tbl);
The CONFLICT_TABLE must have exactly four columns: the COPY target table,
the COPY filename, the line number of the duplicate, and the duplicate record
itself. Since this structure is fixed, a pre-defined composite type is
unnecessary. Validation is based solely on the column data types
(pg_attribute.atttypid), not on the column names (pg_attribute.attname).
The expected types are OID, TEXT, INT8, and TEXT, respectively.
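Because validation checks only the type OIDs, a conflict table with
arbitrary column names should also be accepted, for example (hypothetical
names, just to illustrate the rule):

CREATE TABLE my_conflicts(target oid, src text, row_no bigint, raw_line text);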
Under the hood, this reuses the INSERT ... ON CONFLICT infrastructure.
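For context, the net effect for each skipped row is roughly equivalent to
the following hand-written SQL (a sketch only, using the tables and values
from the demo below; the patch goes through the executor's ON CONFLICT
paths internally rather than issuing SQL):

-- line 2 of the input conflicts on the unique index over (c):
INSERT INTO t_copy_tbl (b, a, c) VALUES (6, 11, 'aaaaaa')
ON CONFLICT DO NOTHING;
-- when the insert is skipped, the offending input line is recorded:
INSERT INTO err_tbl1 (copy_tbl, filename, lineno, line)
VALUES ('t_copy_tbl'::regclass, 'STDIN', 2, '6,11,aaaaaa');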
Demo:
CREATE TABLE t_copy_tbl(a int, b int, c text);
CREATE TABLE err_tbl1(copy_tbl oid, filename text, lineno bigint, line text);
CREATE UNIQUE INDEX ON t_copy_tbl (c);
COPY t_copy_tbl(b,a, c) FROM STDIN (DELIMITER ',', ON_CONFLICT TABLE,
CONFLICT_TABLE err_tbl1, log_verbosity verbose);
4,17,aaaaaa
6,11,aaaaaa
11,1,xxxxxxxx
12,1,xxxxxxxx
13,1,xxxxxxxx
\.
table err_tbl1 ;
copy_tbl | filename | lineno | line
----------+----------+--------+---------------
18231 | STDIN | 2 | 6,11,aaaaaa
18231 | STDIN | 4 | 12,1,xxxxxxxx
18231 | STDIN | 5 | 13,1,xxxxxxxx
(3 rows)
(I still need to double-check the exclusion and unique constraint cases.)
Comments are welcome!
--
jian
https://www.enterprisedb.com/